WorldWideScience

Sample records for model requires utilization

  1. Utility requirements for fusion

    Energy Technology Data Exchange (ETDEWEB)

    Vondrasek, R.J.

    1982-02-01

    This report describes work done and results obtained during performance of Task 1 of a study of Utility Requirements and Criteria for Fusion Options. The work consisted of developing a list of utility requirements for fusion options, containing definitions of the requirements and showing their relative importance to the utility industry. The project team members developed a preliminary list, which was refined by discussions and literature searches. The refined list was recast as a questionnaire which was sent to a substantial portion of the utility industry in this country. Forty-three questionnaire recipients responded, including thirty-two utilities. A workshop was held to develop a revised requirements list using the survey responses as a major input. The list prepared by the workshop was further refined by a panel consisting of vice presidents of the three project team firms. The results of the study indicate that in addition to considering the cost of energy for a power plant, utilities consider twenty-three other requirements. Four of the requirements were judged to be vital to plant acceptability: Plant Capital Cost, Financial Liability, Plant Safety and Licensability.

  2. Computer software requirements specification for the world model light duty utility arm system

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.E.

    1996-02-01

    This Computer Software Requirements Specification defines the software requirements for the world model of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product. (The LDUA deploys end effectors into underground storage tanks by means of a robotic arm on the end of a telescoping mast.)

  3. Utilizing inheritance in requirements engineering

    Science.gov (United States)

    Kaindl, Hermann

    1994-01-01

    The scope of this paper is the utilization of inheritance for requirements specification, i.e., the tasks of analyzing and modeling the domain, as well as forming and defining requirements. Our approach and the tool supporting it are named RETH (Requirements Engineering Through Hypertext). Actually, RETH uses a combination of various technologies, including object-oriented approaches and artificial intelligence (in particular frames). We do not attempt to exclude or replace formal representations, but try to complement them and provide means for gradually developing them. Among others, RETH has been applied in the CERN (Conseil Europeen pour la Recherche Nucleaire) Cortex project. While it would be impossible to explain this project in detail here, it should be sufficient to know that it deals with a generic distributed control system. Since this project is not finished yet, it is difficult to state its size precisely; to give an idea, its final goal is to replace the many existing similar control systems at CERN with this generic approach. Currently, RETH is also being tested using real-world requirements for the Pastel Mission Planning System at ESOC in Darmstadt. First, we outline how hypertext is integrated into a frame system in our approach. Moreover, the usefulness of inheritance is demonstrated as performed by the tool RETH. We then summarize our experiences of utilizing inheritance in the Cortex project. Lastly, RETH is related to existing work.

  4. Utility supply portfolio diversity requirements

    Energy Technology Data Exchange (ETDEWEB)

    Hanser, Philip; Graves, Frank

    2007-06-15

    In general, diversification for its own sake by utilities is likely to come at significant cost, because it ignores, or must overcome, engineering reasons for preferring a less diversified portfolio of resources. Integrated utilities, distcos and merchant gencos are likely to pursue widely divergent strategies in this regard. (author)

  5. Predictive Modeling of Massive Transfusion Requirements During Liver Transplantation and Its Potential to Reduce Utilization of Blood Bank Resources.

    Science.gov (United States)

    Pustavoitau, Aliaksei; Lesley, Maggie; Ariyo, Promise; Latif, Asad; Villamayor, April J; Frank, Steven M; Rizkalla, Nicole; Merritt, William; Cameron, Andrew; Dagher, Nabil; Philosophe, Benjamin; Gurakar, Ahmet; Gottschalk, Allan

    2017-05-01

    Patients undergoing liver transplantation frequently but inconsistently require massive blood transfusion. The ability to predict massive transfusion (MT) could reduce the impact on blood bank resources through customization of the blood order schedule. Current predictive models of MT for blood product utilization during liver transplantation are not generally applicable to individual institutions owing to variability in patient population, intraoperative management, and definitions of MT. Moreover, existing models may be limited by not incorporating cirrhosis stage or thromboelastography (TEG) parameters. This retrospective cohort study included all patients who underwent deceased-donor liver transplantation at the Johns Hopkins Hospital between 2010 and 2014. We defined MT as intraoperative transfusion of > 10 units of packed red blood cells (pRBCs) and developed a multivariable predictive model of MT that incorporated cirrhosis stage and TEG parameters. The accuracy of the model was assessed with the goodness-of-fit test, receiver operating characteristic analysis, and bootstrap resampling. The distribution of correct patient classification was then determined as we varied the model threshold for classifying MT. Finally, the potential impact of these predictions on blood bank resources was examined. Two hundred three patients were included in the study. Sixty (29.6%) patients met the definition for MT and received a median (interquartile range) of 19.0 (14.0-27.0) pRBC units intraoperatively compared with 4.0 units (1.0-6.0) for those who did not satisfy the criterion for MT. The multivariable model for predicting MT included Model for End-stage Liver Disease score, whether simultaneous liver and kidney transplant was performed, cirrhosis stage, hemoglobin concentration, platelet concentration, and TEG R interval and angle. This model demonstrated good calibration (Hosmer-Lemeshow goodness-of-fit test P = .45) and good discrimination (c statistic: 0.835; 95
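
    The modeling workflow described above (a multivariable logistic prediction of MT assessed with the c statistic) can be sketched in miniature. Everything below is illustrative: the data are synthetic, the coefficients invented, and only MELD score and hemoglobin, two of the predictors named in the abstract, are used.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic cohort: MELD score and hemoglobin with a hypothetical true
# relationship (higher MELD, lower hemoglobin -> higher MT probability).
random.seed(0)
rows = []
for _ in range(400):
    meld = random.uniform(6, 40)      # Model for End-stage Liver Disease score
    hgb = random.uniform(7, 15)       # hemoglobin, g/dL
    p = sigmoid(0.15 * (meld - 20) - 0.5 * (hgb - 11))
    rows.append((meld, hgb, 1 if random.random() < p else 0))

# Fit a logistic regression by plain gradient descent.
w = [0.0, 0.0, 0.0]                   # intercept, MELD, hemoglobin
for _ in range(2000):
    g = [0.0, 0.0, 0.0]
    for meld, hgb, y in rows:
        err = sigmoid(w[0] + w[1] * meld + w[2] * hgb) - y
        g[0] += err; g[1] += err * meld; g[2] += err * hgb
    for j in range(3):
        w[j] -= 0.001 * g[j] / len(rows)

# c statistic (area under the ROC curve) by its rank/pairwise definition.
scores = [(sigmoid(w[0] + w[1] * m + w[2] * h), y) for m, h, y in rows]
pos = [s for s, y in scores if y == 1]
neg = [s for s, y in scores if y == 0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
print(f"c statistic: {auc:.3f}")
```

    A real model would, as in the study, also include cirrhosis stage, TEG parameters and the other covariates, and would be checked for calibration and validated by bootstrap resampling.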

  6. Testing agile requirements models

    Institute of Scientific and Technical Information of China (English)

    BOTASCHANJAN Jewgenij; PISTER Markus; RUMPE Bernhard

    2004-01-01

    This paper discusses a model-based approach to validating software requirements in agile development processes through simulation and, in particular, automated testing. The use of models as the central development artifact needs to be added to the portfolio of software engineering techniques, to further increase the efficiency and flexibility of development beginning as early as the requirements definition phase. Testing requirements is one of the most important techniques for giving feedback and increasing the quality of the result. Therefore, testing of artifacts should be introduced as early as possible, even in the requirements definition phase.
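
    The idea of making requirements testable from the earliest phase can be illustrated with a toy example; the requirement, the model class and the test below are all hypothetical, not taken from the paper.

```python
# Hypothetical requirement R1: "a withdrawal must never drive the account
# balance below zero." The requirement is captured in a minimal executable
# model and an automated test, so it can give feedback before any
# production code exists.

class AccountModel:
    """Minimal executable requirements model, not production code."""
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount <= 0 or amount > self.balance:
            raise ValueError("requirement R1 forbids this withdrawal")
        self.balance -= amount

def test_requirement_r1():
    acc = AccountModel(balance=100)
    acc.withdraw(60)
    assert acc.balance == 40
    try:
        acc.withdraw(50)              # would overdraw; the model must refuse
        assert False, "R1 not enforced"
    except ValueError:
        pass
    assert acc.balance == 40          # balance unchanged after the refusal

test_requirement_r1()
print("requirement R1 holds in the model")
```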

  7. Utility survey of requirements for a HTS fault current limiter

    DEFF Research Database (Denmark)

    Nielsen, Jan Nygaard; Jørgensen, P.; Østergaard, Jacob;

    2000-01-01

    The application of superconducting fault current limiters (SFCL) in the electric utility sector will clearly depend on the extent to which the needs and requirements of electric utilities can be met by the ongoing development of SFCL technology. This paper considers a questionnaire survey of the needs and expectations the Danish electric utilities have for this new technology. A bus-tie application of an SFCL in a distribution substation with three parallel-coupled transformers is discussed.

  8. Utility interface requirements for a solar power system

    Energy Technology Data Exchange (ETDEWEB)

    Donalek, P.J.; Whysong, J.L.

    1978-09-01

    This study specifies that the southern tier of the US (south of the 36th parallel) should be examined to see what problems might develop with the installation of a Satellite Power System (SPS) in the year 2000. One or more 5-GW SPS units could be installed in the utility systems of the southern states in the year 2000. The 345- and 500-kV transmission systems that will probably exist at that time could be readily extended to accommodate the SPS units. The operation of the units will present the utilities with new and difficult problems in system stability and frequency control. The problems will arise because a somewhat variable 5-GW output will be produced by a generator having no mechanical inertia. The unavoidable time lag in controlling the position of the energy beam at the receiving station may have a very critical effect on the stability of the utility systems. The maintenance problems associated with the energy-receiving device, a continuous structure covering more than 40 mi², must be given careful consideration. Repair of lightning damage while maintaining SPS operation may be the most critical requirement. Acquisition and preparation of the 90 mi² of land required for the receiving antenna (rectenna) will create many new and difficult environmental problems.

  9. Deriving minimal models for resource utilization

    NARCIS (Netherlands)

    te Brinke, Steven; Bockisch, Christoph; Bergmans, Lodewijk; Malakuti Khah Olun Abadi, Somayeh; Aksit, Mehmet; Katz, Shmuel

    2013-01-01

    We show how compact Resource Utilization Models (RUMs) can be extracted from concrete overly-detailed models of systems or sub-systems in order to model energy-aware software. Using the Counterexample-Guided Abstraction Refinement (CEGAR) approach, along with model-checking tools, abstract models

  10. Modeling Requirements for Cohort and Register IT.

    Science.gov (United States)

    Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred

    2016-01-01

    The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks like cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. A further aim was to make transparent the complex relationships between requirements, which are described as use cases in a given text catalog; by analyzing and modeling the requirements, a better understanding and optimization of the catalog are intended. There are two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current state models and to find simplifications within the generic catalog. Processing the generic catalog was performed by means of text extraction, conceptualization and concept mapping. Then methods of enterprise architecture planning (EAP) were used to model the extracted information. To work on objective a), questionnaires were developed by utilizing the model. They were used for semi-structured interviews, whose results were evaluated via qualitative content analysis. Afterwards the current state was modeled. Objective b) was done by model analysis. A given generic text catalog of requirements was transferred into a model. As a result of objective a), current state models of one existing cohort study and two registers were created and analyzed. An optimized model called the KoReg-reference-model is the result of objective b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current state models. Information managers could reduce the effort of planning the IT infrastructure by utilizing the KoReg-reference-model. Modeling the current state and the generation of reports from the model, which could be used as

  11. National Maglev initiative: California line electric utility power system requirements

    Science.gov (United States)

    Save, Phil

    1994-05-01

    The electrical utility power system requirements were determined for a Maglev line from San Diego to San Francisco and Sacramento with a maximum capacity of 12,000 passengers an hour in each direction at a speed of 300 miles per hour, or one train every 30 seconds in each direction. Basically the Maglev line requires one 50-MVA substation every 12.5 miles. The need for new power lines to serve these substations and their voltage levels are based not only on equipment loading criteria but also on limitations due to voltage flicker and harmonics created by the Maglev system. The resulting power system requirements and their costs depend mostly on the geographical area, urban or suburban with 'strong' power systems, or mountains and rural areas with 'weak' power systems. A reliability evaluation indicated that emergency power sources, such as a 10-MW battery at each substation, were not justified if sufficient redundancy is provided in the design of the substations and the power lines serving them. With a cost of $5.6 M per mile, the power system requirements, including the 12-kV DC cables and the inverters along the Maglev line, were found to be the second largest cost component of the Maglev system, after the cost of the guideway system ($9.1 M per mile), out of a total cost of $23 M per mile.
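
    A quick arithmetic check of the figures quoted in this abstract; the ~500-mile route length is an assumption for illustration, the other numbers are from the abstract.

```python
# Substation count: one 50-MVA substation every 12.5 miles.
route_miles = 500                      # assumed San Diego-San Francisco length
substation_spacing_miles = 12.5
substations = route_miles / substation_spacing_miles
print(f"substations needed: {substations:.0f}")

# Cost breakdown per mile, $M (from the abstract): power system and
# guideway are the two largest components of the $23 M/mile total.
cost_per_mile = {"guideway": 9.1, "power system": 5.6}
total_per_mile = 23.0
other = total_per_mile - sum(cost_per_mile.values())
print(f"other components: ${other:.1f} M/mile of ${total_per_mile} M/mile")
```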

  12. Continuous utility factor in segregation models

    Science.gov (United States)

    Roy, Parna; Sen, Parongama

    2016-02-01

    We consider the constrained Schelling model of social segregation in which the utility factor of agents strictly increases and nonlocal jumps of the agents are allowed. In the present study, the utility factor u is defined in a way such that it can take continuous values and depends on the tolerance threshold as well as the fraction of unlike neighbors. Two models are proposed: in model A the jump probability is determined by the sign of u only, which makes it equivalent to the discrete model. In model B the actual values of u are considered. Model A and model B are shown to differ drastically as far as segregation behavior and phase transitions are concerned. In model A, although segregation can be achieved, the cluster sizes are rather small. Also, a frozen state is obtained in which steady states comprise many unsatisfied agents. In model B, segregated states with much larger cluster sizes are obtained. The correlation function is calculated to show quantitatively that larger clusters occur in model B. Moreover for model B, no frozen states exist even for very low dilution and small tolerance parameter. This is in contrast to the unconstrained discrete model considered earlier where agents can move even when utility remains the same. In addition, we also consider a few other dynamical aspects which have not been studied in segregation models earlier.
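
    A minimal sketch of a Schelling-type simulation with a continuous utility and nonlocal jumps, assuming u = T - f (tolerance threshold minus fraction of unlike neighbors); the paper's exact definition of u and its model A/B jump rules are not reproduced here, only the constrained "move only if utility strictly increases" idea.

```python
import random

random.seed(1)
N, T = 20, 0.5                           # torus size, tolerance threshold
grid = [[random.choice([1, -1, 0]) for _ in range(N)] for _ in range(N)]
n_agents = sum(1 for row in grid for c in row if c != 0)

def utility(g, i, j):
    """Continuous utility u = T - fraction of unlike occupied neighbors."""
    me = g[i][j]
    occ = [g[(i + di) % N][(j + dj) % N]
           for di in (-1, 0, 1) for dj in (-1, 0, 1)
           if (di, dj) != (0, 0) and g[(i + di) % N][(j + dj) % N] != 0]
    if not occ:
        return T
    return T - sum(1 for n in occ if n != me) / len(occ)

def step(g):
    i, j = random.randrange(N), random.randrange(N)
    k, l = random.randrange(N), random.randrange(N)
    if g[i][j] == 0 or g[k][l] != 0:
        return
    u_old = utility(g, i, j)
    g[k][l], g[i][j] = g[i][j], 0        # tentative nonlocal jump
    if utility(g, k, l) <= u_old:        # constrained: u must strictly rise
        g[i][j], g[k][l] = g[k][l], 0    # revert the jump

for _ in range(30000):
    step(grid)

def like_fraction(g):
    """Crude segregation measure: share of like pairs among occupied pairs."""
    like = occ_pairs = 0
    for i in range(N):
        for j in range(N):
            me = g[i][j]
            if me == 0:
                continue
            for di, dj in ((0, 1), (1, 0)):   # count each adjacency once
                nb = g[(i + di) % N][(j + dj) % N]
                if nb != 0:
                    occ_pairs += 1
                    like += (nb == me)
    return like / occ_pairs

print(f"like-neighbor fraction after relaxation: {like_fraction(grid):.2f}")
```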

  13. The Service Utility Model in Service Management

    Institute of Scientific and Technical Information of China (English)

    LI Yan; ZHOU Wen-an; SONG Jun-de

    2005-01-01

    Aiming to provide a measurable service Quality of Service (QoS) evaluating method for service inventory management, this paper proposes a new mobile Service Utility Model (SUM), considers the service and business layer elements into the service utility influence profile, and proposes an self-adaptive service inventory management algorithm as a QoS control scheme based on SUM. It can be concluded from the simulation result that the service inventory utility can be fully reflected by SUM and the whole system efficiency is greatly increased by using SUM as the adaptive rule.

  14. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on the current researches of viewpoints oriented requirements engineering and intelligent agent, we present the concept of viewpoint agent and its abstract model based on a meta-language for multiviews requirements engineering. It provided a basis for consistency checking and integration of different viewpoint requirements, at the same time, these checking and integration works can automatically realized in virtue of intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce the practical application of the model by the case study of data flow diagram.

  15. The Utility of Ada for Army Modeling

    Science.gov (United States)

    1990-04-10

    The Utility of Ada for Army Modeling, by Colonel Michael L. Yocom; Individual Study Project; distribution statement A, approved for public release, distribution unlimited. The report's front matter notes that the language was named "Ada" for Ada Lovelace (1815-1852), the mathematician who worked with Charles Babbage on his difference and analytic engines.

  16. Modelling of biomass utilization for energy purpose

    Energy Technology Data Exchange (ETDEWEB)

    Grzybek, Anna (ed.)

    2010-07-01

    the overall farm structure, the distribution of each farm's land over several separate subfields, the overpopulation of villages, and very high employment in agriculture (about 27% of all employees in the national economy work in agriculture). Farmers have a low education level: in towns 34% of the population has secondary education, while in rural areas only 15-16% do, and less than 2% of rural inhabitants have higher education. The structure of land use is as follows: arable land 11.5%, meadows and pastures 25.4%, forests 30.1%. Poland requires the implementation of technical and technological progress to intensify agricultural production. Competition for agricultural land arises from maintaining the current consumption level while allocating part of agricultural production to energy purposes; agricultural land is going to be the key factor for biofuel production. This publication presents research results for Project PL0073, 'Modelling of energetical biomass utilization for energy purposes', financed by the Norwegian Financial Mechanism and the European Economic Area Financial Mechanism. The publication aims to bring the reader closer to the problems connected with the cultivation of energy plants and to dispel myths concerning them. Replacing fossil fuels with biomass for heat and electricity production could contribute significantly to reducing carbon dioxide emissions. Moreover, biomass cropping and the utilization of biomass for energy play an important role in diversifying agricultural production as rural areas are transformed, and widening agricultural production enables the creation of new jobs. Sustainable development is going to be the fundamental rule for the evolution of Polish agriculture in the long-term perspective, and the energy use of biomass integrates well into that evolution, especially at the local level. There are two facts. The first is the increase of interest in energy crops in Poland

  18. Estimation of peginesatide utilization requires patient-level data

    Directory of Open Access Journals (Sweden)

    Alex Yang

    2012-06-01

    Due to the nonlinear dose relationship between peginesatide and epoetin, facilities with similar epoetin use (<2% relative difference) had up to a 35% difference in estimated peginesatide use. For accurate estimation of peginesatide utilization, it is important to base conversions on the epoetin dose distribution rather than the mean epoetin dose.
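
    Why patient-level data matter under a nonlinear conversion can be shown with a toy calculation; the conversion function below is invented for illustration and is not the actual peginesatide dosing table.

```python
# Two facilities with identical mean epoetin dose but different dose
# distributions: under a nonlinear conversion f, total need is
# sum(f(dose_i)), which a facility-mean estimate len(n) * f(mean) misses
# (Jensen's inequality).

def peg_dose(epo_units):
    # hypothetical nonlinear (concave) conversion, for illustration only
    return epo_units ** 0.5

fac_a = [2000, 2000, 2000, 2000]          # uniform dosing
fac_b = [100, 100, 100, 7700]             # skewed dosing, same mean
mean_a = sum(fac_a) / len(fac_a)
mean_b = sum(fac_b) / len(fac_b)
assert mean_a == mean_b == 2000           # identical facility-level means

est_from_mean = len(fac_a) * peg_dose(2000)     # facility-mean method
true_a = sum(peg_dose(d) for d in fac_a)        # patient-level method
true_b = sum(peg_dose(d) for d in fac_b)
print(f"mean-based: {est_from_mean:.0f}; patient-level: "
      f"{true_a:.0f} (A) vs {true_b:.0f} (B)")
```

    Facility A matches the mean-based estimate, while facility B needs roughly a third less, despite identical mean epoetin use.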

  19. Utilizing computer models for optimizing classroom acoustics

    Science.gov (United States)

    Hinckley, Jennifer M.; Rosenberg, Carl J.

    2002-05-01

    The acoustical conditions in a classroom play an integral role in establishing an ideal learning environment. Speech intelligibility is dependent on many factors, including speech loudness, room finishes, and background noise levels. The goal of this investigation was to use computer modeling techniques to study the effect of acoustical conditions on speech intelligibility in a classroom. This study focused on a simulated classroom which was generated using the CATT-acoustic computer modeling program. The computer was utilized as an analytical tool in an effort to optimize speech intelligibility in a typical classroom environment. The factors that were focused on were reverberation time, location of absorptive materials, and background noise levels. Speech intelligibility was measured with the Rapid Speech Transmission Index (RASTI) method.
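
    Reverberation time, one of the factors the study focuses on, admits a simple first-order check via Sabine's formula RT60 = 0.161 V / A (V in m³, A in m² sabins); the room dimensions and absorption coefficients below are illustrative assumptions, not values from the study.

```python
# Sabine reverberation-time estimate for an assumed 9 m x 7 m x 3 m room.
V = 9.0 * 7.0 * 3.0                        # room volume, m^3
surfaces = [                               # (area m^2, absorption coefficient)
    (9.0 * 7.0, 0.70),                     # acoustical ceiling tile
    (9.0 * 7.0, 0.10),                     # carpeted floor
    (2 * (9.0 + 7.0) * 3.0, 0.05),         # painted walls
]
A = sum(area * alpha for area, alpha in surfaces)   # total absorption, sabins
rt60 = 0.161 * V / A
print(f"RT60 = {rt60:.2f} s")
```

    The result is in the neighborhood of the ~0.6 s commonly recommended for classrooms; moving the absorptive material (e.g., from ceiling to walls) changes A and hence the estimate, which is the kind of trade-off the CATT-acoustic model explores in detail.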

  20. Modeling utilization distributions in space and time.

    Science.gov (United States)

    Keating, Kim A; Cherry, Steve

    2009-07-01

    W. Van Winkle defined the utilization distribution (UD) as a probability density that gives an animal's relative frequency of occurrence in a two-dimensional (x, y) plane. We extend Van Winkle's work by redefining the UD as the relative frequency distribution of an animal's occurrence in all four dimensions of space and time. We then describe a product kernel model estimation method, devising a novel kernel from the wrapped Cauchy distribution to handle circularly distributed temporal covariates, such as day of year. Using Monte Carlo simulations of animal movements in space and time, we assess estimator performance. Although not unbiased, the product kernel method yields models highly correlated (Pearson's r = 0.975) with true probabilities of occurrence and successfully captures temporal variations in density of occurrence. In an empirical example, we estimate the expected UD in three dimensions (x, y, and t) for animals belonging to each of two distinct bighorn sheep (Ovis canadensis) social groups in Glacier National Park, Montana, USA. Results show the method can yield ecologically informative models that successfully depict temporal variations in density of occurrence for a seasonally migratory species. Some implications of this new approach to UD modeling are discussed.
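
    The product-kernel idea, a Gaussian kernel for a spatial coordinate multiplied by a wrapped Cauchy kernel for a circular temporal covariate, can be sketched as follows; the bandwidths and observations are invented for illustration, and only one spatial dimension is shown.

```python
import math

def gauss(u, h):
    """Gaussian kernel with bandwidth h."""
    return math.exp(-0.5 * (u / h) ** 2) / (h * math.sqrt(2 * math.pi))

def wrapped_cauchy(theta, mu, rho):
    """Wrapped Cauchy density on the circle; 0 <= rho < 1 sets concentration."""
    return (1 - rho ** 2) / (
        2 * math.pi * (1 + rho ** 2 - 2 * rho * math.cos(theta - mu)))

# Hypothetical observations: (x position, day-of-year mapped to radians).
obs = [(0.0, 0.1), (1.0, 0.2), (0.5, 6.0)]

def ud(x, theta, h=0.5, rho=0.8):
    """Product-kernel UD estimate at (x, theta)."""
    return sum(gauss(x - xi, h) * wrapped_cauchy(theta, ti, rho)
               for xi, ti in obs) / len(obs)

print(f"UD(0.6, 0.15) = {ud(0.6, 0.15):.4f}")
```

    The wrapped Cauchy kernel keeps late-December and early-January observations close to each other, which an ordinary linear kernel on day-of-year would not.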

  1. Animal Models Utilized in HTLV-1 Research.

    Science.gov (United States)

    Panfil, Amanda R; Al-Saleem, Jacob J; Green, Patrick L

    2013-01-01

    Since the isolation and discovery of human T-cell leukemia virus type 1 (HTLV-1) over 30 years ago, researchers have utilized animal models to study HTLV-1 transmission, viral persistence, virus-elicited immune responses, and HTLV-1-associated disease development (ATL, HAM/TSP). Non-human primates, rabbits, rats, and mice have all been used to help understand HTLV-1 biology and disease progression. Non-human primates offer a model system that is phylogenetically similar to humans for examining viral persistence. Viral transmission, persistence, and immune responses have been widely studied using New Zealand White rabbits. The advent of molecular clones of HTLV-1 has offered the opportunity to assess the importance of various viral genes in rabbits, non-human primates, and mice. Additionally, over-expression of viral genes using transgenic mice has helped uncover the importance of Tax and Hbz in the induction of lymphoma and other lymphocyte-mediated diseases. HTLV-1 inoculation of certain strains of rats results in histopathological features and clinical symptoms similar to those of humans with HAM/TSP. Transplantation of certain types of ATL cell lines in immunocompromised mice results in lymphoma. Recently, "humanized" mice have been used to model ATL development for the first time. Not all HTLV-1 animal models develop disease, and those that do vary in consistency depending on the type of monkey, strain of rat, or even type of ATL cell line used. However, the progress made using animal models cannot be overstated, as it has led to insights into the mechanisms regulating viral replication, viral persistence, disease development, and, most importantly, model systems to test disease treatments.

  2. Modeling regulated water utility investment incentives

    Science.gov (United States)

    Padula, S.; Harou, J. J.

    2014-12-01

    This work attempts to model the infrastructure investment choices of privatized water utilities subject to rate-of-return and price-cap regulation. The goal is to understand how regulation influences water companies' investment decisions, such as their desire to engage in transfers with neighbouring companies. We formulate a profit-maximization capacity expansion model that finds the schedule of new supply, demand management and transfer schemes that maintains the annual supply-demand balance and maximizes a company's profit under the 2010-15 price control process in England. Regulatory incentives for cost savings are also represented in the model. These include the CIS scheme for capital expenditure (capex) and incentive allowance schemes for operating expenditure (opex). The profit-maximizing investment program (what to build, when and at what size) is compared with the least-cost program (the social optimum). We apply this formulation to several water companies in South East England to model performance and sensitivity to water network particulars. Results show that if companies are able to outperform the regulatory assumption on the cost of capital, a capital bias can be generated, because capital expenditure, unlike opex, can be remunerated through the company's regulatory capital value (RCV). The occurrence of the 'capital bias', and its magnitude, depend on the extent to which a company can finance its investments at a rate below the allowed cost of capital. The bias can be reduced by regulatory penalties for underperformance on capital expenditure (the CIS scheme); sensitivity analysis can be applied by varying the CIS penalty to see how, and to what extent, this impacts the capital bias effect. We show how regulatory changes could potentially be devised to partially remove the 'capital bias' effect. Solutions potentially include allowing for incentives on total expenditure rather than separately for capex and opex and allowing
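
    The least-cost versus profit-maximizing comparison described above can be illustrated with a toy plan-selection model. The schemes, costs and remuneration rule below are invented; the point is only that remunerating capex through the RCV at an allowed rate above the company's actual financing rate can make a capex-heavy plan privately optimal even when a cheaper plan exists.

```python
from itertools import combinations

# name: (capacity ML/d, capex $M, annual opex $M) -- all invented
schemes = {
    "reservoir": (60, 200.0, 2.0),
    "transfer":  (60, 20.0, 9.0),
    "leakage":   (30, 15.0, 3.0),
    "metering":  (30, 10.0, 4.0),
}
DEFICIT, YEARS = 60, 25                  # supply-demand gap ML/d, horizon
WACC_ALLOWED, WACC_ACTUAL = 0.06, 0.04   # allowed vs actual cost of capital

def feasible(p):
    return sum(schemes[n][0] for n in p) >= DEFICIT

def minimal_plans():
    """Feasible plans with no redundant scheme."""
    names = list(schemes)
    for r in range(1, len(names) + 1):
        for p in combinations(names, r):
            if feasible(p) and all(
                    not feasible(tuple(m for m in p if m != n)) for n in p):
                yield p

def whole_life_cost(p):                  # undiscounted, for simplicity
    return sum(schemes[n][1] + YEARS * schemes[n][2] for n in p)

def company_gain(p):
    # Capex earns the allowed return via the RCV but is financed at a
    # cheaper actual rate, so the company pockets the spread on capex.
    capex = sum(schemes[n][1] for n in p)
    return YEARS * capex * (WACC_ALLOWED - WACC_ACTUAL)

least_cost = min(minimal_plans(), key=whole_life_cost)
profit_max = max(minimal_plans(), key=company_gain)
print("least-cost plan:", least_cost)    # the social optimum
print("profit-max plan:", profit_max)    # capex-heavy: the capital bias
```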

  3. Process Model for Defining Space Sensing and Situational Awareness Requirements

    Science.gov (United States)

    2006-04-01

    A process model for defining systems for space sensing and space situational awareness is presented. The paper concentrates on eight steps for determining the requirements, including: decision-maker needs, system requirements, exploitation methods and vulnerabilities, critical capabilities, and identification of attack scenarios. Utilization of the USAF anti-tamper (AT) implementation process as a process-model departure point for the space sensing and situational awareness (SSSA) ... is presented. The AT implementation process model, as an

  4. European utilities. What are the advantages, limits and perspectives of the multi-utilities model?; Les utilities europeennes. Quels sont les atouts, les limites et les perspectives du modele multi-utilities?

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    For several years, the European utilities (public service companies) have been broadening their offer. They now encompass the overall domain of collective services (mainly electricity, natural gas, wastes and water) in order to propose global offers. The European utilities are also deepening their range of offers with the management of fluxes (fluid supplies) and of services (energy), and with the externalization of infrastructures (cogeneration units, waste water treatment plants, etc.). The multi-utilities model (the association of at least 2 collective services inside a same portfolio of activities) has become a standard. This strategic evolution, under way for several years in Europe, has been made possible by regulatory changes: the opening of gas and electricity markets to competition, the increase of environmental standard requirements, and the delegation of services. While the multi-utilities concept appears attractive, several questions remain unanswered: what are the real synergies between collective service activities? Is the multi-utilities model viable? How to finance such a model? How to construct and commercialize global offers? What are the commercial and financial results of the multi-utilities model? What are the alternatives to this model? What strategic future for the European utilities? This study provides a detailed status of the European markets of collective services. It analyzes the strategy of the European groups which have adopted the multi-utilities model and shows the advantages and limits of this strategic model. The study includes an analysis of the financial performances of the 20 main European groups, which allows two scenarios of mid-term evolution for this sector to be foreseen. (J.S.)

  5. Radiation Belt and Plasma Model Requirements

    Science.gov (United States)

    Barth, Janet L.

    2005-01-01

    Contents include the following: Radiation belt and plasma model environment. Environment hazards for systems and humans. Need for new models. How models are used. Model requirements. How can space weather community help?

  6. Integration of photovoltaic units into electric utility grids: experiment information requirements and selected issues

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    A number of investigations, including those conducted by The Aerospace Corporation and other contractors, have led to the recognition of technical, economic, and institutional issues relating to the interface between solar electric technologies and electric utility systems. These issues derive from three attributes of solar electric power concepts: (1) the variability and unpredictability of the solar resources, (2) the dispersed nature of those resources, which suggests the feasible deployment of small dispersed power units, and (3) a high initial capital cost coupled with relatively low operating costs. It is imperative that these integration issues be pursued in parallel with the development of each technology if the nation's electric utility systems are to effectively utilize these technologies in the near to intermediate term. Analyses of three of these issues are presented: utility information requirements, generation mix and production cost impacts, and rate structures in the context of photovoltaic units integrated into the utility system. (WHK)

  7. Reflections on Improvement of Utility Model System in China

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Ever since 1985, when the Patent Law was launched in China, the utility model patent has played a very important role. Over the past two decades, the utility model system has played an active part in encouraging invention-creations and promoting the progress and development of science and technology.

  8. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates them to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest groups of members are 19 commercial U.S. nuclear utilities and eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.

  9. Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas [A Utility-Based Model for Determining Functional Requirement Target Values]

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
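
    The balancing idea in this abstract can be sketched in a few lines: pick the FR target that maximizes a weighted sum of quadratic utilities, one per stakeholder. The utilities, weights, and pencil-design values below are invented for illustration; they are not the paper's actual QFD data.

```python
# Sketch: choose a functional-requirement (FR) target that maximizes a
# weighted sum of quadratic utility functions (customer vs. designer).
# All numbers here are hypothetical.

def quadratic_utility(x, ideal, scale):
    """Utility peaks at `ideal` and falls off quadratically around it."""
    return 1.0 - ((x - ideal) / scale) ** 2

def best_target(candidates, stakeholders):
    """Return the candidate with the highest weighted total utility.

    stakeholders: list of (weight, ideal, scale) tuples.
    """
    def total(x):
        return sum(w * quadratic_utility(x, ideal, s)
                   for w, ideal, s in stakeholders)
    return max(candidates, key=total)

# Hypothetical FR for a pencil: lead hardness index on a 1.0-4.0 scale.
# Customer prefers 2.0 (weight 0.6), designer prefers 3.0 (weight 0.4).
candidates = [x / 10 for x in range(10, 41)]
choice = best_target(candidates, [(0.6, 2.0, 2.0), (0.4, 3.0, 2.0)])
```

    With equal curvature for both utilities, the chosen target is the importance-weighted mean of the two ideals (here 0.6*2.0 + 0.4*3.0 = 2.4).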

  10. Latent Utility Shocks in a Structural Empirical Asset Pricing Model

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Raahauge, Peter

    We consider a random utility extension of the fundamental Lucas (1978) equilibrium asset pricing model. The resulting structural model leads naturally to a likelihood function. We estimate the model using U.S. asset market data from 1871 to 2000, using both dividends and earnings as state variables. We find that current dividends do not forecast future utility shocks, whereas current utility shocks do forecast future dividends. The estimated structural model produces a sequence of predicted utility shocks which provide better forecasts of future long-horizon stock market returns than the classical dividend-price ratio. KEYWORDS: Random utility, asset pricing, maximum likelihood, structural model, return predictability

  11. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Requirement engineering plays an important role in producing quality software products. In recent years, several requirement-framework approaches have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirement analysis tool. In this paper, we present a requirement modelling framework together with an analysis of modern requirements modelling techniques. We also discuss various domains of requirement engineering with the help of modelling elements such as semantic maps of business concepts, lifecycles of business objects, business processes, business rules, system context diagrams, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with a case study of an inventory management system.

  12. 78 FR 65427 - Pipeline Safety: Reminder of Requirements for Liquefied Petroleum Gas and Utility Liquefied...

    Science.gov (United States)

    2013-10-31

    ... operator qualification and testing requirements under Part 192. Subpart P--Distribution Pipeline Integrity... removing the exemption for small utility LP gas systems from Subpart N (Qualification of Pipeline Personnel... surveillance (Sec. 192.613). Public awareness (Sec. 192.616). Operator qualification (Subpart N)...

  13. On Modeling CPU Utilization of MapReduce Applications

    CERN Document Server

    Rizvandi, Nikzad Babaii; Zomaya, Albert Y

    2012-01-01

    In this paper, we present an approach to predict the total CPU utilization, in terms of CPU clock ticks, of applications running on the MapReduce framework. Our approach has two key phases: profiling and modeling. In the profiling phase, an application is run several times with different sets of MapReduce configuration parameters to profile the total CPU clock ticks of the application on a given platform. In the modeling phase, multiple linear regression is used to map the sets of MapReduce configuration parameters (number of Mappers, number of Reducers, size of the file system (HDFS), and the size of the input file) to the total CPU clock ticks of the application. This derived model can be used for predicting the total CPU requirements of the same application when using the MapReduce framework on the same platform. Our approach aims to eliminate error-prone manual processes and presents a fully automated solution. Three standard applications (WordCount, Exim Mainlog parsing and Terasort) are used to evaluate our modeling technique on pseu...
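
    The modeling phase described above amounts to ordinary least squares over profiled runs. The sketch below fits CPU clock ticks as a linear function of three configuration parameters using the normal equations in pure Python; the profiling rows and coefficients are synthetic, not measurements from the paper.

```python
# Sketch of the modeling phase: ordinary least squares (normal equations,
# pure Python) mapping MapReduce configuration parameters to total CPU
# clock ticks. The profiling data below are synthetic.

def fit_linear(X, y):
    """Solve (X^T X) beta = X^T y by Gaussian elimination."""
    n = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    beta = [0.0] * n
    for i in reversed(range(n)):               # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, n))) / A[i][i]
    return beta

# Rows: [1 (intercept), mappers, reducers, input size in GB] -> CPU clock ticks.
profile_X = [[1, 4, 2, 1], [1, 8, 2, 2], [1, 8, 4, 2], [1, 16, 4, 3], [1, 16, 8, 4]]
profile_y = [80.0, 140.0, 150.0, 250.0, 290.0]
beta = fit_linear(profile_X, profile_y)
predicted = sum(bi * xi for bi, xi in zip(beta, [1, 12, 4, 2]))
```

    Given a new configuration (here 12 mappers, 4 reducers, 2 GB input), the fitted coefficients yield the predicted total clock ticks without rerunning the job.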

  14. Generic Model to Send Secure Alerts for Utility Companies

    Directory of Open Access Journals (Sweden)

    Perez–Díaz J.A.

    2010-04-01

    In some industries, such as logistics and banking, automated systems that deliver critical business information anytime and anywhere play an important role in the decision making process. This paper introduces a "Generic model to send secure alerts and notifications", which operates as a middleware between enterprise data sources and their mobile users. This model uses the Short Message Service (SMS) as its main mobile messaging technology, but is open to new types of messaging technologies. Our model is interoperable with existing information systems; it can store any kind of information about alerts or notifications at different levels of granularity, and it offers different types of notifications (as an alert when critical business problems occur, as a notification on a periodical basis, or as a two-way query). Notification rules can be customized by final users according to their preferences. The model provides a security framework in cases where information requires confidentiality, and it is extensible to existing and new messaging technologies (like e-mail, MMS, etc.). It is platform, mobile operator and hardware independent. Currently, our solution is being used at the Comisión Federal de Electricidad (Mexico's utility company) to deliver secure alerts related to critical events registered in the main power generation plants of our country.

  15. Evaluation of Usability Utilizing Markov Models

    Science.gov (United States)

    Penedo, Janaina Rodrigues; Diniz, Morganna; Ferreira, Simone Bacellar Leal; Silveira, Denis S.; Capra, Eliane

    2012-01-01

    Purpose: The purpose of this paper is to analyze the usability of a remote learning system in its initial development phase, using a quantitative usability evaluation method through Markov models. Design/methodology/approach: The paper opted for an exploratory study. The data of interest of the research correspond to the possible accesses of users…
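
    As a minimal illustration of the Markov approach, the sketch below computes the long-run fraction of time a user spends on each screen of a hypothetical three-screen learning system by power-iterating the transition matrix. The states and probabilities are invented, not taken from the study.

```python
# Sketch: user navigation as a Markov chain over three screens of a
# hypothetical remote learning system; the stationary distribution gives
# the long-run share of time on each screen.

def steady_state(P, iters=200):
    """Approximate the stationary distribution by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# States: 0 = course list, 1 = lesson page, 2 = quiz (illustrative values).
P = [[0.1, 0.8, 0.1],
     [0.3, 0.4, 0.3],
     [0.5, 0.4, 0.1]]
pi = steady_state(P)
```

    A screen that absorbs a disproportionate share of the stationary mass relative to its role in the task flow is a candidate usability problem.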

  16. An Extended Analysis of Requirements Traceability Model

    Institute of Scientific and Technical Information of China (English)

    Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu

    2004-01-01

    A new extended meta-model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study from a key project of the 863 Program is given.

  17. Integration of photovoltaic units into electric utility grids: experiment information requirements and selected issues

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    A number of investigations have led to the recognition of technical, economic, and institutional issues relating to the interface between solar electric technologies and electric utility systems. These issues derive from three attributes of solar electric power concepts: (1) the variability and unpredictability of the solar resources, (2) the dispersed nature of those resources, which suggests the deployment of small dispersed power units, and (3) a high initial capital cost coupled with relatively low operating costs. An important part of the DOE programs to develop new source technologies, in particular photovoltaic systems, is the experimental testing of complete or nearly complete power units. These experiments provide an opportunity to examine operational and integration issues which must be understood before widespread commercial deployment of these technologies can be achieved. Experiments may also be required to explicitly examine integration, operational, and control aspects of single and multiple new-source-technology power units within a utility system. An identification of utility information requirements, a review of planned experiments, and a preliminary determination of additional experimental needs and opportunities are presented. Other issues discussed include: (1) the impacts of on-site photovoltaic units on load duration curves and optimal generation mixes; (2) the impacts of on-site photovoltaic units on utility production costs, with and without dedicated storage and with and without sellback; and (3) current utility rate structure experiments, rationales, policies, practices, and plans.

  18. Long-term dynamics simulation: Modeling requirements

    Energy Technology Data Exchange (ETDEWEB)

    Morched, A.S.; Kar, P.K.; Rogers, G.J.; Morison, G.K. (Ontario Hydro, Toronto, ON (Canada))

    1989-12-01

    This report details the required performance and modelling capabilities of a computer program intended for the study of the long-term dynamics of power systems. Following a general introduction which outlines the need for long-term dynamic studies, the modelling requirements for the conduct of such studies are discussed in detail. Particular emphasis is placed on models for system elements not normally modelled in power system stability programs, which will have a significant impact in the long-term time frame of minutes to hours following the initiating disturbance. The report concludes with a discussion of the special computational and programming requirements for a long-term stability program. 43 refs., 36 figs.

  19. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tool list were one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Network Bandwidth Utilization Forecast Model on High Bandwidth Network

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl; Sim, Alex

    2014-07-07

    With the increasing number of geographically distributed scientific collaborations and the growth in data size, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2 percent. It also shows resilience against abrupt changes in network usage. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
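
    As a rough stand-in for the paper's STL + ARIMA pipeline, the sketch below fits an AR(1) model to a short synthetic path-utilization series by simple least squares and produces a multi-step forecast.

```python
# Sketch: bandwidth forecasting with an AR(1) model x[t] = c + a * x[t-1],
# fitted by least squares. A pure-Python stand-in for the paper's
# STL + ARIMA pipeline; the utilization series is synthetic.

def fit_ar1(series):
    """Least-squares fit of the lag-1 recurrence; returns (c, a)."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - a * mx, a

def forecast(series, steps=3):
    """Iterate the fitted recurrence to forecast several steps ahead."""
    c, a = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + a * last
        out.append(last)
    return out

# Path utilization as a fraction of link capacity (synthetic SNMP samples).
util = [0.42, 0.45, 0.44, 0.48, 0.47, 0.50, 0.49, 0.52]
predictions = forecast(util)
```

    An ARIMA model adds differencing and moving-average terms to this basic autoregressive idea, and STL first removes the seasonal component so the AR fit sees only trend and remainder.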

  1. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports the conduct of requirements analysis for ERP projects.

  2. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  3. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  4. Multiobjective financial planning model for electric-utility rate regulation

    Energy Technology Data Exchange (ETDEWEB)

    Linke, C.M.; Whitford, D.T.

    1983-08-01

    The interests of the three parties to the regulatory process (investors in an electric utility, consumers, and regulators) are often in conflict. Investors are concerned with shareholder wealth maximization, while consumers desire dependable service at low rates. If the desired end product of regulation is to establish rates that balance the interests of consumers and investors, then a financial planning model is needed that accurately reflects the multi-objective nature of the regulatory decision process. This article develops such a multi-objective programming model for examining the efficient trade-offs available to utility regulators in setting rates of return. 8 references, 2 figures, 7 tables.

  5. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Nowadays, quality function deployment (QFD) is one of the total quality management tools, in which customers' views and requirements are captured and, using various techniques, the production requirements and operations are improved. The QFD department, after identification and analysis of the competitors, collects customers' feedback to meet the customers' demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
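
    The linguistic-variable idea can be sketched with triangular fuzzy numbers: map each linguistic rating to a triple, average across customers, and defuzzify by centroid. The term scale below is a common textbook choice, not the one used in the paper.

```python
# Sketch: linguistic importance ratings mapped to triangular fuzzy numbers
# (a, b, c), averaged across customers, then defuzzified by centroid.
# The scale below is a common textbook choice, not the paper's.

SCALE = {
    "low": (0.0, 0.0, 0.25),
    "medium": (0.25, 0.5, 0.75),
    "high": (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def fuzzy_mean(terms):
    """Component-wise average of the triangular numbers for the ratings."""
    tris = [SCALE[t] for t in terms]
    return tuple(sum(t[i] for t in tris) / len(tris) for i in range(3))

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    return sum(tri) / 3

# Three customers rate the importance of one requirement.
importance = defuzzify(fuzzy_mean(["high", "very high", "medium"]))
```

    Repeating this per requirement produces a crisp importance ranking that can feed directly into the QFD house of quality.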

  6. The changing utility workforce and the emergence of building information modeling in utilities

    Energy Technology Data Exchange (ETDEWEB)

    Saunders, A. [Autodesk Inc., San Rafael, CA (United States)

    2010-07-01

    Utilities are faced with the extensive replacement of a workforce that is now reaching retirement age. New personnel will have varying skill levels and different expectations in relation to design tools. This paper discussed methods of facilitating knowledge transfer from the retiring workforce to new staff using rules-based design software. It was argued that while nothing can replace the experiential knowledge of long-term engineers, software with built-in validations can accelerate training and building information modelling (BIM) processes. Younger personnel will expect a user interface paradigm that is based on their past gaming and work experiences. Visualization, simulation, and modelling approaches were reviewed. 3 refs.

  7. Continuous utility factor in segregation models: a few surprises

    CERN Document Server

    Roy, Parna

    2015-01-01

    We consider the constrained Schelling model of social segregation, which allows non-local jumps of the agents. In the present study, the utility factor u is defined in such a way that it can take continuous values and depends on the tolerance threshold as well as on the fraction of unlike neighbours. Two models are proposed: in model A the jump probability is determined by the sign of u only, which makes it equivalent to the discrete model. In model B the actual values of u are considered. Model A and model B are shown to differ drastically as far as segregation behaviour and phase transitions are concerned. The constrained model B turns out to be as efficient as the unconstrained discrete model, if not more so. In addition, we also consider a few other dynamical aspects which have not been studied in segregation models earlier.
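
    The distinction between the two models can be made concrete with a small sketch. The linear form of u below is illustrative only; the paper defines its own utility factor from the tolerance threshold and the fraction of unlike neighbours.

```python
# Sketch: a continuous utility factor u that is positive when the fraction
# of unlike neighbours is below the agent's tolerance and negative above it.
# This linear form is illustrative; the paper defines its own u.

def utility(f_unlike, tolerance):
    return tolerance - f_unlike

def jump_probability(f_unlike, tolerance, model="A"):
    u = utility(f_unlike, tolerance)
    if model == "A":                    # model A: only the sign of u matters
        return 1.0 if u < 0 else 0.0
    return min(1.0, max(0.0, -u))       # model B: magnitude of u matters

p_a = jump_probability(0.7, 0.5, model="A")   # unhappy agent always jumps
p_b = jump_probability(0.7, 0.5, model="B")   # jump chance grows with the gap
```

    In model A every unhappy agent behaves identically, while in model B a marginally unhappy agent is far less mobile than a very unhappy one, which is the source of the differing segregation dynamics.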

  8. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  10. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    Science.gov (United States)

    Klein, S. A.

    1994-01-01

    The Distributed processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network or expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic tradeoffs as well as interactions within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories, a number of intermediate cost and technical calculation results, as well as graphical presentation of Costs vs. Percent Distribution for various parameters. The model has been implemented on an IBM PC using the LOTUS 1-2-3 spreadsheet environment and was developed in 1986. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.

  11. Unified Model for Generation Complex Networks with Utility Preferential Attachment

    Institute of Scientific and Technical Information of China (English)

    WU Jian-Jun; GAO Zi-You; SUN Hui-Jun

    2006-01-01

    In this paper, based on utility preferential attachment, we propose a new unified model to generate different network topologies such as scale-free, small-world and random networks. Moreover, a new network structure named the super scale network is found, which exhibits a monopoly characteristic in our simulation experiments. Finally, the characteristics of this new network are given.

  12. Optimization Model to Enhance Sustainable Utilization of Resources

    Institute of Scientific and Technical Information of China (English)

    ZHAO Guo-hao; SHEN Tu-jing

    2002-01-01

    Resources are the material foundation for sustainable development, and enhancing resource utilization is the key to realizing it. Based on the idea of sustainable development, the paper establishes a model to distribute resources effectively and proposes a method for studies on sustainable development.

  13. User Requirements and Domain Model Engineering

    NARCIS (Netherlands)

    Specht, Marcus; Glahn, Christian

    2006-01-01

    Specht, M., & Glahn, C. (2006). User requirements and domain model engineering. Presentation at International Workshop in Learning Networks for Lifelong Competence Development. March, 30-31, 2006. Sofia, Bulgaria: TENCompetence Conference. Retrieved June 30th, 2006, from http://dspace.learningnetwor

  15. Modeling requirements for in situ vitrification

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  16. Improving surgeon utilization in an orthopedic department using simulation modeling

    Directory of Open Access Journals (Sweden)

    Simwita YW

    2016-10-01

Yusta W Simwita, Berit I Helgheim; Department of Logistics, Molde University College, Molde, Norway. Purpose: Worldwide, more than two billion people lack appropriate access to surgical services due to the mismatch between existing human resources and patient demand. Improving the utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department, focusing on improving surgeon utilization while minimizing patient wait time. Methods: The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors behind poor surgeon utilization and long patient waiting times. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results: The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be substantially improved, with higher surgeon utilization and reduced patient waiting time. The simulation results further demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion: This study shows how simulation modeling can be used to improve health care processes. The study was limited to a single care process; however, the findings can be applied to other orthopedic care processes with similar operational characteristics. Keywords: waiting time, patient, health care process
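The discrete-event approach described in this abstract can be illustrated with a toy simulation. The single-surgeon setup, parameter values, and function name below are invented for illustration; they are not taken from the study:

```python
import random

def simulate_clinic(n_patients=1000, mean_interarrival=20.0,
                    mean_exam=15.0, seed=42):
    """Minimal discrete event simulation of a single-surgeon clinic queue.

    Patients arrive with exponential interarrival times (in minutes) and
    are examined one at a time, first come first served. Returns
    (surgeon utilization, mean patient wait). All parameters are
    illustrative, not drawn from the cited study.
    """
    rng = random.Random(seed)
    t_arrival = 0.0        # arrival time of the current patient
    surgeon_free_at = 0.0  # time the surgeon finishes the previous exam
    busy_time = 0.0
    total_wait = 0.0
    for _ in range(n_patients):
        t_arrival += rng.expovariate(1.0 / mean_interarrival)
        start = max(t_arrival, surgeon_free_at)  # wait if surgeon is busy
        service = rng.expovariate(1.0 / mean_exam)
        total_wait += start - t_arrival
        busy_time += service
        surgeon_free_at = start + service
    return busy_time / surgeon_free_at, total_wait / n_patients
```

Re-running the simulation with a shorter mean exam time is one crude way to model the proposed scenario of moving ancillary services out of the surgeon's examination slot and comparing utilization and wait times.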

  17. PPE Surface Proteins Are Required for Heme Utilization by Mycobacterium tuberculosis

    Science.gov (United States)

    Mitra, Avishek; Speer, Alexander; Lin, Kan; Ehrt, Sabine

    2017-01-01

Iron is essential for replication of Mycobacterium tuberculosis, but iron is efficiently sequestered in the human host during infection. Heme constitutes the largest iron reservoir in the human body and is utilized by many bacterial pathogens as an iron source. While heme acquisition is well studied in other bacterial pathogens, little is known about it in M. tuberculosis. To identify proteins involved in heme utilization by M. tuberculosis, a transposon mutant library was screened for resistance to the toxic heme analog gallium(III)-porphyrin (Ga-PIX). Inactivation of the ppe36, ppe62, and rv0265c genes resulted in resistance to Ga-PIX. Growth experiments using isogenic M. tuberculosis deletion mutants showed that PPE36 is essential for heme utilization by M. tuberculosis, while the functions of PPE62 and Rv0265c are partially redundant. None of the genes restored growth of the heterologous M. tuberculosis mutants, indicating that the proteins encoded by these genes have separate functions. PPE36, PPE62, and Rv0265c bind heme, as shown by surface plasmon resonance spectroscopy, and are associated with membranes. Both the PPE36 and PPE62 proteins are cell surface accessible, while the Rv0265c protein is probably located in the periplasm. PPE36 and PPE62 are, to our knowledge, the first proline-proline-glutamate (PPE) proteins of M. tuberculosis that bind small molecules and are involved in nutrient acquisition. The absence of a virulence defect in the ppe36 deletion mutant indicates that the different iron acquisition pathways of M. tuberculosis may substitute for each other during growth and persistence in mice. The emerging model of heme utilization by M. tuberculosis derived from this study is substantially different from those of other bacteria. PMID:28119467

  18. Understanding requirements via natural language information modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.; Becker, S.D.

    1993-07-01

    Information system requirements that are expressed as simple English sentences provide a clear understanding of what is needed between system specifiers, administrators, users, and developers of information systems. The approach used to develop the requirements is the Natural-language Information Analysis Methodology (NIAM). NIAM allows the processes, events, and business rules to be modeled using natural language. The natural language presentation enables the people who deal with the business issues that are to be supported by the information system to describe exactly the system requirements that designers and developers will implement. Computer prattle is completely eliminated from the requirements discussion. An example is presented that is based upon a section of a DOE Order involving nuclear materials management. Where possible, the section is analyzed to specify the process(es) to be done, the event(s) that start the process, and the business rules that are to be followed during the process. Examples, including constraints, are developed. The presentation steps through the modeling process and shows where the section of the DOE Order needs clarification, extensions or interpretations that could provide a more complete and accurate specification.

  19. Vibrio cholerae phosphatases required for the utilization of nucleotides and extracellular DNA as phosphate sources.

    Science.gov (United States)

    McDonough, EmilyKate; Kamp, Heather; Camilli, Andrew

    2016-02-01

Phosphate is essential for life, being used in many core processes such as signal transduction and synthesis of nucleic acids. The waterborne agent of cholera, Vibrio cholerae, encounters phosphate limitation in both the aquatic environment and the human intestinal tract. This bacterium can utilize extracellular DNA (eDNA) as a phosphate source, a phenotype dependent on secreted endo- and exonucleases. However, no transporter of nucleotides has been identified in V. cholerae, suggesting that in order for the organism to utilize DNA as a phosphate source, it must first separate the phosphate and nucleoside groups before transporting phosphate into the cell. In this study, we investigated the factors required for assimilation of phosphate from eDNA. We identified PhoX and the previously unknown proteins UshA and CpdB as the major phosphatases that allow phosphate acquisition from eDNA and nucleotides. We demonstrated separable but partially overlapping roles for the three phosphatases and showed that the activity of PhoX and CpdB is induced by phosphate limitation. Thus, this study provides mechanistic insight into how V. cholerae can acquire phosphate from extracellular DNA, which is likely to be an important phosphate source in the environment and during infection.

  20. A Unified Derivation of Classical Subjective Expected Utility Models through Cardinal Utility

    NARCIS (Netherlands)

    Zank, H.; Wakker, P.P.

    1999-01-01

    Classical foundations of expected utility were provided by Ramsey, de Finetti, von Neumann & Morgenstern, Anscombe & Aumann, and others. These foundations describe preference conditions to capture the empirical content of expected utility. The assumed preference conditions, however, vary among the m

  1. User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs

    Science.gov (United States)

    Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.

    2008-01-01

This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
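As a rough sketch of what an array-to-raster conversion step involves, the helper below writes a 2-D model array in the ESRI ASCII grid format that ArcGIS can import. The function name and defaults are hypothetical; this is not the documented behavior of the *2RASTER utilities themselves:

```python
def array_to_ascii_raster(arr, xll, yll, cellsize, nodata=-9999.0):
    """Render a 2-D model array as an ESRI ASCII raster string.

    Mirrors, in spirit, the raster-conversion step described above:
    model cells must be square, so one cellsize describes both
    dimensions. `arr` is a list of rows ordered north to south, the
    usual orientation of a MODFLOW layer array.
    """
    nrows, ncols = len(arr), len(arr[0])
    header = (f"ncols {ncols}\nnrows {nrows}\n"
              f"xllcorner {xll}\nyllcorner {yll}\n"
              f"cellsize {cellsize}\nNODATA_value {nodata}\n")
    body = "\n".join(" ".join(str(v) for v in row) for row in arr)
    return header + body
```

The resulting text can be saved with an `.asc` extension and loaded into a GIS as a raster layer.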

  2. Propeller aircraft interior noise model utilization study and validation

    Science.gov (United States)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  3. Utilization of a silkworm model for understanding host-pathogen interactions

    Directory of Open Access Journals (Sweden)

    C Kaito

    2012-10-01

Studies of the interactions between humans and pathogenic microorganisms require adequate representative animal infection models, and the availability of invertebrate models overcomes the ethical and financial issues of studying vertebrate animals. Insects have an innate immune system that is conserved in mammals. The recent utilization of silkworms as an animal infection model led to the identification of novel virulence genes of human pathogenic microorganisms and of novel innate immune factors in the silkworm. The silkworm infection model is effective for identifying and evaluating novel factors involved in host-pathogen interactions.

  4. Rodent models of diabetic nephropathy: their utility and limitations.

    Science.gov (United States)

    Kitada, Munehiro; Ogura, Yoshio; Koya, Daisuke

    2016-01-01

    Diabetic nephropathy is the most common cause of end-stage renal disease. Therefore, novel therapies for the suppression of diabetic nephropathy must be developed. Rodent models are useful for elucidating the pathogenesis of diseases and testing novel therapies, and many type 1 and type 2 diabetic rodent models have been established for the study of diabetes and diabetic complications. Streptozotocin (STZ)-induced diabetic animals are widely used as a model of type 1 diabetes. Akita diabetic mice that have an Ins2+/C96Y mutation and OVE26 mice that overexpress calmodulin in pancreatic β-cells serve as a genetic model of type 1 diabetes. In addition, db/db mice, KK-Ay mice, Zucker diabetic fatty rats, Wistar fatty rats, Otsuka Long-Evans Tokushima Fatty rats and Goto-Kakizaki rats serve as rodent models of type 2 diabetes. An animal model of diabetic nephropathy should exhibit progressive albuminuria and a decrease in renal function, as well as the characteristic histological changes in the glomeruli and the tubulointerstitial lesions that are observed in cases of human diabetic nephropathy. A rodent model that strongly exhibits all these features of human diabetic nephropathy has not yet been developed. However, the currently available rodent models of diabetes can be useful in the study of diabetic nephropathy by increasing our understanding of the features of each diabetic rodent model. Furthermore, the genetic background and strain of each mouse model result in differences in susceptibility to diabetic nephropathy with albuminuria and the development of glomerular and tubulointerstitial lesions. Therefore, the validation of an animal model reproducing human diabetic nephropathy will significantly facilitate our understanding of the underlying genetic mechanisms that contribute to the development of diabetic nephropathy. In this review, we focus on rodent models of diabetes and discuss the utility and limitations of these models for the study of diabetic

  5. High Hardware Utilization and Low Memory Block Requirement Decoding of QC-LDPC Codes

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ling; LIU Rongke; HOU Yi; ZHANG Xiaolin

    2012-01-01

This paper presents a simple yet effective decoding scheme for general quasi-cyclic low-density parity-check (QC-LDPC) codes, which not only achieves high hardware utilization efficiency (HUE), but also greatly reduces the memory block requirement without any performance degradation. The main idea is to split the check matrix into several row blocks, then to perform the improved message passing computations sequentially, block by block. As the decoding algorithm improves, the sequential tie between the two-phase computations is broken, so that the two phases can be overlapped, which brings in high HUE. Two overlapping schemes are presented, each of which suits a different situation. In addition, an efficient memory arrangement scheme is proposed to reduce the large memory block requirement of the LDPC decoder. As an example, for the rate-0.4 LDPC code selected from Chinese Digital TV Terrestrial Broadcasting (DTTB), our decoding saves over 80% of the memory blocks compared with the conventional decoding, and the decoder achieves 0.97 HUE. Finally, the rate-0.4 LDPC decoder is implemented on an FPGA device EP2S30 (speed grade -5). Using 8 row processing units, the decoder achieves a maximum net throughput of 28.5 Mbps at 20 iterations.
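The row-by-row scheduling idea can be illustrated, very loosely, with a toy hard-decision bit-flipping decoder that visits the check matrix one row at a time. The paper's actual decoder uses soft message passing with overlapped two-phase computations and quasi-cyclic memory layout, none of which this sketch attempts:

```python
def bit_flip_decode(H, y, max_iter=20):
    """Toy hard-decision bit-flipping decoder.

    H is a dense 0/1 parity-check matrix (list of rows), y a received
    hard-decision bit vector. Checks are evaluated row by row, the
    degenerate case of the block-sequential processing described above
    (each "row block" here has size 1).
    """
    x = list(y)
    n = len(x)
    for _ in range(max_iter):
        fails = [0] * n  # per bit: number of unsatisfied checks it joins
        for row in H:
            syndrome = sum(x[j] for j in range(n) if row[j]) % 2
            if syndrome:
                for j in range(n):
                    if row[j]:
                        fails[j] += 1
        if max(fails) == 0:
            break  # all parity checks satisfied
        worst = max(fails)
        # flip the bits involved in the most unsatisfied checks
        x = [b ^ 1 if fails[j] == worst else b for j, b in enumerate(x)]
    return x
```

With the (7,4) Hamming parity-check matrix, a single-bit error in the all-zero codeword is corrected in one iteration.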

  6. Malliavin's calculus in insider models: Additional utility and free lunches

    OpenAIRE

    2002-01-01

    We consider simple models of financial markets with regular traders and insiders possessing some extra information hidden in a random variable which is accessible to the regular trader only at the end of the trading interval. The problems we focus on are the calculation of the additional utility of the insider and a study of his free lunch possibilities. The information drift, i.e. the drift to eliminate in order to preserve the martingale property in the insider's filtration, turns out to be...

  7. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
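For reference, the logit model used here as a comparison baseline has a closed-form choice rule: each option's probability is a softmax of the option utilities. The sketch below shows that rule; the temperature parameter and function name are illustrative (the probit baseline, by contrast, assumes Gaussian noise and has no closed form):

```python
import math

def logit_choice_probs(utilities, temperature=1.0):
    """Multinomial logit choice probabilities.

    Under a random utility model with i.i.d. extreme-value noise, the
    probability of choosing option i is exp(u_i/t) normalized over all
    options. Lower temperature t makes choices more deterministic.
    """
    exps = [math.exp(u / temperature) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]
```

Because the logit probabilities depend only on the options' own utilities, the model cannot produce the context effects (e.g., attraction or compromise effects) that sequential sampling models such as MDFT can.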

  8. Improving imaging utilization through practice quality improvement (maintenance of certification part IV): a review of requirements and approach to implementation.

    Science.gov (United States)

    Griffith, Brent; Brown, Manuel L; Jain, Rajan

    2014-04-01

    The purposes of this article are to review the American Board of Radiology requirements for practice quality improvement and to describe our approach to improving imaging utilization while offering a guide to implementing similar projects at other institutions, emphasizing the plan-do-study-act approach. There is increased emphasis on improving quality in health care. Our institution has undertaken a multiphase practice quality improvement project addressing the appropriate utilization of screening cervical spinal CT in an emergency department.

  9. Rodent models of diabetic nephropathy: their utility and limitations

    Directory of Open Access Journals (Sweden)

    Kitada M

    2016-11-01

    significantly facilitate our understanding of the underlying genetic mechanisms that contribute to the development of diabetic nephropathy. In this review, we focus on rodent models of diabetes and discuss the utility and limitations of these models for the study of diabetic nephropathy. Keywords: diabetic nephropathy, rodent model, albuminuria, mesangial matrix expansion, tubulointerstitial fibrosis

  10. An integrated utility-based model of conflict evaluation and resolution in the Stroop task.

    Science.gov (United States)

    Chuderski, Adam; Smolen, Tomasz

    2016-04-01

Cognitive control allows humans to direct and coordinate their thoughts and actions in a flexible way, in order to reach internal goals regardless of interference and distraction. The hallmark test used to examine cognitive control is the Stroop task, which elicits both the weakly learned but goal-relevant and the strongly learned but goal-irrelevant response tendencies, and requires people to follow the former while ignoring the latter. After reviewing the existing computational models of cognitive control in the Stroop task, a novel, integrated utility-based model is proposed. The model uses 3 crucial control mechanisms: response utility reinforcement learning, utility-based conflict evaluation using the Festinger formula for assessing the conflict level, and top-down adaptation of response utility in the service of conflict resolution. Their complex, dynamic interaction led to the replication of 18 experimental effects, the largest data set explained to date by 1 Stroop model. The simulations cover the basic congruency effects (including the response latency distributions), performance dynamics and adaptation (including EEG indices of conflict), as well as the effects resulting from manipulations applied to stimulation and responding, which are yielded by the extant Stroop literature.

  11. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes and standards, such as the Open Modelling Interface (the OpenMI), allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  12. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  13. Economic analysis of open space box model utilization in spacecraft

    Science.gov (United States)

    Mohammad, Atif F.; Straub, Jeremy

    2015-05-01

The amount of stored data about space grows larger every day, and the utilization of Big Data and related tools to perform ETL (Extract, Transform and Load) applications will soon be pervasive in the space sciences. We have entered a crucial time in which using Big Data can be the difference (for terrestrial applications) between organizations underperforming and outperforming their peers. The same is true for NASA and other space agencies, as well as for individual missions and the highly competitive process of mission data analysis and publication. In most industries, established players and new entrants alike will use data-driven approaches to innovate and capture the value of Big Data archives. The Open Space Box Model is poised to take the proverbial "giant leap", as it provides autonomic data processing and communications for spacecraft. Economic value generated from such data processing can be found in terrestrial organizations in every sector, such as healthcare and retail. Retailers, for example, perform research on Big Data by utilizing sensor-driven embedded data in products within their stores and warehouses to determine how these products are actually used in the real world.

  14. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  17. 17 CFR 210.3A-05 - Special requirements as to public utility holding companies.

    Science.gov (United States)

    2010-04-01

    ... companies. There shall be shown in the consolidated balance sheet of a public utility holding company the... COMPANY ACT OF 1940, INVESTMENT ADVISERS ACT OF 1940, AND ENERGY POLICY AND CONSERVATION ACT OF...

  18. Understanding Emerging Impacts and Requirements Related to Utility-Scale Solar Development

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Heidi M. [Argonne National Lab. (ANL), Argonne, IL (United States); Grippo, Mark A. [Argonne National Lab. (ANL), Argonne, IL (United States); Heath, Garvin A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Smith, Karen P. [Argonne National Lab. (ANL), Argonne, IL (United States); Sullivan, Robert G. [Argonne National Lab. (ANL), Argonne, IL (United States); Walston, Leroy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wescott, Konstance L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-01

    Utility-scale solar energy plays an important role in the nation’s strategy to address climate change threats through increased deployment of renewable energy technologies, and both the federal government and individual states have established specific goals for increased solar energy development. In order to achieve these goals, much attention is paid to making utility-scale solar energy cost-competitive with other conventional energy sources, while concurrently conducting solar development in an environmentally sound manner.

  19. User requirements for hydrological models with remote sensing input

    Energy Technology Data Exchange (ETDEWEB)

    Kolberg, Sjur

    1997-10-01

    Monitoring the seasonal snow cover is important for several purposes. This report describes user requirements for hydrological models utilizing remotely sensed snow data. The information is mainly provided by operational users through a questionnaire. The report is primarily intended as a basis for other work packages within the Snow Tools project which aim at developing new remote sensing products for use in hydrological models. The HBV model is the only model mentioned by users in the questionnaire. It is widely used in Northern Scandinavia and Finland, in the fields of hydroelectric power production, flood forecasting and general monitoring of water resources. The current implementation of HBV is not based on remotely sensed data. Even the presently used HBV implementation may benefit from remotely sensed data. However, several improvements can be made to hydrological models to include remotely sensed snow data. Among these the most important are a distributed version, a more physical approach to the snow depletion curve, and a way to combine data from several sources. 1 ref.
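The "snow depletion curve" mentioned above relates a model's snow water equivalent (SWE) to the snow-covered area a satellite observes, which is how remotely sensed snow data can constrain a model like HBV. A minimal sketch, assuming a linear depletion curve and a textbook degree-day melt rate; both are illustrative stand-ins, not HBV's actual parameterization:

```python
def snow_covered_fraction(swe, swe_max):
    """Linear snow depletion curve: fraction of a grid cell that is
    snow-covered as a function of SWE (mm). swe_max is the SWE at which
    the cell is fully covered; the linear form is an illustrative
    simplification."""
    if swe_max <= 0:
        return 0.0
    return min(1.0, max(0.0, swe / swe_max))

def degree_day_melt(swe, temp, ddf=3.0, t_melt=0.0):
    """One daily time step of degree-day melt:
    melt = ddf * max(T - T_melt, 0), capped by available SWE.
    ddf is in mm per degree Celsius per day. Returns (new_swe, melt)."""
    melt = min(swe, ddf * max(temp - t_melt, 0.0))
    return swe - melt, melt
```

Comparing `snow_covered_fraction` of the simulated SWE against a satellite-derived snow-covered area is one simple way such a model could assimilate remotely sensed snow data.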

  20. Modeling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...

  2. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  3. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  5. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, Wilco; Jonkers, Henk; Sinderen, van Marten

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for enterprise ...

  6. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  7. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    Science.gov (United States)

    The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing, comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  8. Modeling non-monotone risk aversion using SAHARA utility functions

    NARCIS (Netherlands)

    A. Chen; A. Pelsser; M. Vellekoop

    2011-01-01

    We develop a new class of utility functions, SAHARA utility, with the distinguishing feature that it allows absolute risk aversion to be non-monotone and implements the assumption that agents may become less risk averse for very low values of wealth. The class contains the well-known exponential and

  9. Development and verification of a model for estimating the screening utility in the detection of PCBs in transformer oil.

    Science.gov (United States)

    Terakado, Shingo; Glass, Thomas R; Sasaki, Kazuhiro; Ohmura, Naoya

    2014-01-01

    A simple new model for estimating the screening performance (false positive and false negative rates) of a given test for a specific sample population is presented. The model is shown to give good results on a test population, and is used to estimate the performance on a sampled population. Using the model developed in conjunction with regulatory requirements and the relative costs of the confirmatory and screening tests allows evaluation of the screening test's utility in terms of cost savings. Testers can use the methods developed to estimate the utility of a screening program using available screening tests with their own sample populations.
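The cost-savings idea can be illustrated with a back-of-envelope calculation, where a cheap screening test filters samples before an expensive confirmatory test. The sensitivity, specificity, prevalence and cost figures below are assumptions for illustration, not values from the paper.

```python
# Hypothetical numbers only: expected cost of screen-then-confirm testing
# versus sending every sample straight to the confirmatory test.

def screening_savings(n, prevalence, sensitivity, specificity,
                      cost_screen, cost_confirm):
    """Return (expected savings, expected false negatives) for n samples."""
    positives = n * prevalence
    negatives = n - positives
    # Samples flagged by the screen: true positives plus false positives.
    flagged = positives * sensitivity + negatives * (1 - specificity)
    false_negatives = positives * (1 - sensitivity)
    cost_all_confirm = n * cost_confirm
    cost_screened = n * cost_screen + flagged * cost_confirm
    return cost_all_confirm - cost_screened, false_negatives

savings, missed = screening_savings(
    n=1000, prevalence=0.05, sensitivity=0.95, specificity=0.90,
    cost_screen=5.0, cost_confirm=100.0)
print(round(savings, 1), round(missed, 1))  # → 80750.0 2.5
```

The trade-off the paper evaluates is visible directly: savings grow with specificity and the cost gap, while false negatives (regulatory risk) grow as sensitivity drops.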

  10. Process and utility water requirements for cellulosic ethanol production processes via fermentation pathway

    Science.gov (United States)

    The increasing need of additional water resources for energy production is a growing concern for future economic development. In technology development for ethanol production from cellulosic feedstocks, a detailed assessment of the quantity and quality of water required, and the ...

  11. Process and utility water requirements for cellulosic ethanol production processes via fermentation pathway

    Science.gov (United States)

    The increasing need of additional water resources for energy production is a growing concern for future economic development. In technology development for ethanol production from cellulosic feedstocks, a detailed assessment of the quantity and quality of water required, and the ...

  12. Effect of energy and protein levels on nutrient utilization and their requirements in growing Murrah buffaloes.

    Science.gov (United States)

    Prusty, Sonali; Kundu, Shivlal Singh; Mondal, Goutam; Sontakke, Umesh; Sharma, Vijay Kumar

    2016-04-01

    To evaluate different levels of energy and protein for optimum growth of Murrah male buffalo calves, a growth trial (150 days) was conducted on 30 calves (body weight 202.5 ± 6.8 kg). Six diets were formulated to provide 90, 100 and 110% protein level and 90 and 110% energy level requirements for buffalo calves, derived from ICAR 2013 recommendations for buffaloes. The crude protein (CP) intake was increased with higher dietary CP, whereas no effect of energy levels or interaction between protein and energy was observed on CP intake. There were significant effects (P ...) of the interaction between protein and energy (P ...). Nutrient intake (protein or energy) per kg body weight (BW^0.75) at various fortnight intervals was regressed linearly on the average daily gain (ADG) per kg BW^0.75. By setting the average daily gain at zero in the developed regression equation, a maintenance requirement was obtained, i.e. 133.1 kcal ME, 6.45 g CP and 3.95 g metabolizable protein (MP) per kg BW^0.75. Requirement for growth was 6.12 kcal ME, 0.46 g CP and 0.32 g MP per kg BW^0.75 per day. Metabolizable amino acid requirement was estimated from partitioning of MP intake and ADG. The ME requirements were lower, whereas the MP requirement of Murrah buffaloes was higher than ICAR (2013) recommendations.

  13. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

    Full Text Available This paper presents the results of an experimental evaluation between predicted and actual concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades as a result of bringing the target mean value of test cubes closer to the required characteristic strength value by reducing the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.

  14. A MODEL FOR ALIGNING SOFTWARE PROJECTS REQUIREMENTS WITH PROJECT TEAM MEMBERS REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Robert Hans

    2013-02-01

    Full Text Available The fast-paced, dynamic environment within which information and communication technology (ICT) projects are run, as well as ICT professionals’ constantly changing requirements, presents a challenge for project managers in terms of aligning projects’ requirements with project team members’ requirements. This research paper argues that if projects’ requirements are properly aligned with team members’ requirements, this will result in a balanced decision approach. Moreover, such an alignment will result in the realization of employees’ needs as well as meeting the project’s needs. This paper presents a Project’s requirements and project Team members’ requirements (PrTr) alignment model and argues that a balanced decision which meets both the software project’s requirements and the team members’ requirements can be achieved through the application of the PrTr alignment model.

  15. Supporting requirements model evolution throughout the system life-cycle

    OpenAIRE

    Ernst, Neil; Mylopoulos, John; Yu, Yijun; Ngyuen, Tien T.

    2008-01-01

    Requirements models are essential not just during system implementation, but also to manage system changes post-implementation. Such models should be supported by a requirements model management framework that allows users to create, manage and evolve models of domains, requirements, code and other design-time artifacts along with traceability links between their elements. We propose a comprehensive framework which delineates the operations and elements necessary, and then describe a tool imp...

  16. The Joint Venture Model of Knowledge Utilization: a guide for change in nursing.

    Science.gov (United States)

    Edgar, Linda; Herbert, Rosemary; Lambert, Sylvie; MacDonald, Jo-Ann; Dubois, Sylvie; Latimer, Margot

    2006-05-01

    Knowledge utilization (KU) is an essential component of today's nursing practice and healthcare system. Despite advances in knowledge generation, the gap in knowledge transfer from research to practice continues. KU models have moved beyond factors affecting the individual nurse to a broader perspective that includes the practice environment and the socio-political context. This paper proposes one such theoretical model, the Joint Venture Model of Knowledge Utilization (JVMKU). Key components of the JVMKU that emerged from an extensive multidisciplinary review of the literature include leadership, emotional intelligence, person, message, empowered workplace and the socio-political environment. The model has a broad and practical application and is not specific to one type of KU or one population. This paper provides a description of the JVMKU, its development and suggested uses at both local and organizational levels. Nurses in both leadership and point-of-care positions will recognize the concepts identified and will be able to apply this model for KU in their own workplace for assessment of areas requiring strengthening and support.

  17. Utilization of respiratory energy in higher plants : requirements for 'maintenance' and transport processes

    NARCIS (Netherlands)

    Bouma, T.J.

    1995-01-01

    Quantitative knowledge of both photosynthesis and respiration is required to understand plant growth and resulting crop yield. However, especially the nature of the energy demanding processes that are dependent on dark respiration in full-grown tissues is largely unknown. The main objective

  18. Utilization of respiratory energy in higher plants. Requirements for 'maintenance' and transport processes.

    NARCIS (Netherlands)

    Bouma, T.J.

    1995-01-01

    Quantitative knowledge of both photosynthesis and respiration is required to understand plant growth and resulting crop yield. However, especially the nature of the energy demanding processes that are dependent on dark respiration in full-grown tissues is largely unknown. The main objective of the p

  19. Requirements model for an e-Health awareness portal

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor quality requirements inevitably lead to poor quality software solutions. Likewise, poor requirements modeling is tantamount to designing a poor quality product. Quality-assured requirements development therefore goes hand in hand with usable products, giving the software product the quality it demands. In the light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help in the fulfillment of the United Nations’ Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  20. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; Sinderen, van Marten

    2011-01-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques ...

  1. User Utility Oriented Queuing Model for Resource Allocation in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2015-01-01

    Full Text Available Resource allocation is one of the most important research topics in servers. In the cloud environment, there are massive hardware resources of different kinds, and many kinds of services are usually run on virtual machines of the cloud server. In addition, the cloud environment is commercialized, and economic factors should also be considered. In order to deal with the commercialization and virtualization of the cloud environment, we propose a user utility oriented queuing model for task scheduling. Firstly, we model task scheduling in the cloud environment as an M/M/1 queuing system. Secondly, we classify utility into time utility and cost utility and build a linear programming model to maximize the total utility of both. Finally, we propose a utility oriented algorithm to maximize the total utility. Extensive experiments validate the effectiveness of the proposed model.
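A minimal numeric sketch of the idea: in an M/M/1 queue with arrival rate lam and service rate mu, the mean sojourn time is W = 1/(mu - lam); faster service improves time utility but worsens cost utility, so a total utility can be maximized over mu. The utility functions and weights below are assumptions for illustration, not the paper's.

```python
# Toy total-utility maximization over the service rate of an M/M/1 queue.

def total_utility(lam, mu, w_time=0.6, w_cost=0.4):
    if mu <= lam:                          # unstable queue: unbounded backlog
        return float("-inf")
    W = 1.0 / (mu - lam)                   # M/M/1 mean sojourn time
    time_utility = 1.0 / (1.0 + W)         # assumed: decreasing in delay
    cost_utility = 1.0 / (1.0 + 0.1 * mu)  # assumed: decreasing in capacity cost
    return w_time * time_utility + w_cost * cost_utility

lam = 4.0  # task arrival rate
best_mu = max((mu / 10 for mu in range(41, 200)),
              key=lambda m: total_utility(lam, m))
print(best_mu)
```

The scan finds an interior optimum: the queue must be faster than the arrival rate, but beyond a point extra capacity buys little delay reduction while cost utility keeps falling.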

  2. Exploration Requirements Development Utilizing the Strategy-to-Task-to-Technology Development Approach

    Science.gov (United States)

    Drake, Bret G.; Josten, B. Kent; Monell, Donald W.

    2004-01-01

    The Vision for Space Exploration provides direction for the National Aeronautics and Space Administration to embark on a robust space exploration program that will advance the Nation’s scientific, security, and economic interests. This plan calls for a progressive expansion of human capabilities beyond low earth orbit seeking to answer profound scientific and philosophical questions while responding to discoveries along the way. In addition, the Vision articulates the strategy for developing the revolutionary new technologies and capabilities required for the future exploration of the solar system. The National Aeronautics and Space Administration faces new challenges in successfully implementing the Vision. In order to implement a sustained and affordable exploration endeavor it is vital for NASA to do business differently. This paper provides an overview of the strategy-to-task-to-technology process being used by NASA’s Exploration Systems Mission Directorate to develop the requirements and system acquisition details necessary for implementing a sustainable exploration vision.

  3. Goal-Programming Model Based on the Utility Function of the Decision-maker

    Institute of Scientific and Technical Information of China (English)

    WANG Zhi-jiang

    2001-01-01

    Based on the analysis of the problems in traditional GP model, this paper provides the model with the utility function of the decision-maker and compares this model with the one presented in reference article [1].

  4. A Study of Airbase Facility/Utility Energy R and D Requirements

    Science.gov (United States)

    1992-04-01

    The boiler on top of the tower captures the heat focused on it by the heliostat field and transfers it to a working fluid. The working fluid is then... The operation of these airbases, each similar to a small city, in a posture of readiness requires large amounts of energy for electric power, heating... [Figure: Simplified Schematic of a Stirling Cycle Heat Engine]

  5. In-silico ADME models: a general assessment of their utility in drug discovery applications.

    Science.gov (United States)

    Gleeson, M Paul; Hersey, Anne; Hannongbua, Supa

    2011-01-01

    ADME prediction is an extremely challenging area as many of the properties we try to predict are a result of multiple physiological processes. In this review we consider how in-silico predictions of ADME processes can be used to help bias medicinal chemistry into more ideal areas of property space, minimizing the number of compounds needed to be synthesized to obtain the required biochemical/physico-chemical profile. While such models are not sufficiently accurate to act as a replacement for in-vivo or in-vitro methods, in-silico methods nevertheless can help us to understand the underlying physico-chemical dependencies of the different ADME properties, and thus can give us inspiration on how to optimize them. Many global in-silico ADME models (i.e. generated on large, diverse datasets) have been reported in the literature. In this paper we selectively review representatives from each distinct class and discuss their relative utility in drug discovery. For each ADME parameter, we limit our discussion to the most recent, most predictive or most insightful examples in the literature to highlight the current state of the art. In each case we briefly summarize the different types of models available for each parameter (i.e. simple rules, physico-chemical and 3D based QSAR predictions), their overall accuracy and the underlying SAR. We also discuss the utility of the models as related to lead generation and optimization phases of discovery research.
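The "simple rules" class of model mentioned above can be illustrated with Lipinski's rule of five, a well-known oral-absorption filter; the descriptor values in the example are approximate aspirin-like figures supplied for illustration.

```python
# Lipinski's rule of five: count violations of the four thresholds;
# more than one violation suggests poor oral absorption/permeability.

def lipinski_violations(mol_weight, logp, h_donors, h_acceptors):
    """Return the number of rule-of-five violations."""
    return sum([mol_weight > 500,
                logp > 5,
                h_donors > 5,
                h_acceptors > 10])

# Approximate descriptor values for an aspirin-like molecule:
print(lipinski_violations(mol_weight=180.2, logp=1.2,
                          h_donors=1, h_acceptors=4))  # → 0
```

Rule-based filters like this sit at the crude end of the spectrum the review covers; physico-chemical and 3D QSAR models trade this transparency for accuracy.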

  6. Expected Utility and Catastrophic Risk in a Stochastic Economy-Climate Model

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2010-01-01

    In the context of extreme climate change, we ask how to conduct expected utility analysis in the presence of catastrophic risks. Economists typically model decision making under risk and uncertainty by expected utility with constant relative risk aversion (power utility); statisticians typically ...

  7. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  8. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we ... a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements ...

  9. The Sustainable Energy Utility (SEU) Model for Energy Service Delivery

    Science.gov (United States)

    Houck, Jason; Rickerson, Wilson

    2009-01-01

    Climate change, energy price spikes, and concerns about energy security have reignited interest in state and local efforts to promote end-use energy efficiency, customer-sited renewable energy, and energy conservation. Government agencies and utilities have historically designed and administered such demand-side measures, but innovative…

  10. Unbounded Utility for Savage's "Foundations of Statistics," and Other Models

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1993-01-01

    A general procedure for extending finite-dimensional "additive-like" representations for binary relations to infinite-dimensional "integral-like" representations is developed by means of a condition called truncation-continuity. The restriction of boundedness of utility, met throughout the literature, can now be dispensed with ...

  11. Unbounded utility for Savage's "Foundations of statistics," and other models

    NARCIS (Netherlands)

    Wakker, P.

    1993-01-01

    A general procedure for extending finite-dimensional additive-like representations to infinite-dimensional integral-like representations is developed by means of a condition called truncation-continuity. The restriction of boundedness of utility, met throughout the literature, can now be dispensed with.

  12. Mathematical model of a utility firm. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

    The project was aimed at developing an understanding of the economic and behavioral processes that take place within a utility firm, and without it. This executive summary, one of five documents, gives the project goals and objectives, outlines the subject areas of investigation, discusses the findings and results, and finally considers applications within the electric power industry and future research directions. (DLC)

  13. Electric utility capacity expansion and energy production models for energy policy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Aronson, E.; Edenburn, M.

    1997-08-01

    This report describes electric utility capacity expansion and energy production models developed for energy policy analysis. The models use the same principles (life cycle cost minimization, least operating cost dispatching, and incorporation of outages and reserve margin) as comprehensive utility capacity planning tools, but are faster and simpler. The models were not designed for detailed utility capacity planning, but they can be used to accurately project trends on a regional level. Because they use the same principles as comprehensive utility capacity expansion planning tools, the models are more realistic than utility modules used in present policy analysis tools. They can be used to help forecast the effects energy policy options will have on future utility power generation capacity expansion trends and to help formulate a sound national energy strategy. The models make renewable energy source competition realistic by giving proper value to intermittent renewable and energy storage technologies, and by competing renewables against each other as well as against conventional technologies.
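The "least operating cost dispatching" principle shared by these models reduces to merit-order dispatch: meet demand by committing generators in ascending order of marginal cost. The generator data below are invented for illustration.

```python
# Merit-order ("least operating cost") dispatch sketch.

def dispatch(demand_mw, units):
    """Dispatch units in ascending order of marginal cost until demand is met.

    units: iterable of (name, capacity_mw, marginal_cost_per_mwh).
    Returns (schedule, unserved); unserved > 0 means demand exceeded capacity.
    """
    schedule, remaining = {}, demand_mw
    for name, capacity, cost in sorted(units, key=lambda u: u[2]):
        mw = min(capacity, remaining)
        schedule[name] = mw
        remaining -= mw
        if remaining <= 0:
            break
    return schedule, remaining

units = [("coal", 500, 25.0), ("gas_ct", 200, 80.0),
         ("wind", 150, 0.0), ("gas_cc", 400, 45.0)]
schedule, unserved = dispatch(900, units)
print(schedule, unserved)  # → {'wind': 150, 'coal': 500, 'gas_cc': 250} 0
```

Valuing intermittent renewables properly, as the report emphasizes, amounts to making `capacity` for wind a time-varying input rather than a fixed rating.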

  14. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    Energy Technology Data Exchange (ETDEWEB)

    Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cappers, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fadrhonc, Emily Martin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    In this report, we will present a descriptive and organizational framework for incremental and fundamental changes to regulatory and utility business models in the context of clean energy public policy goals. We will also discuss the regulated utility's role in providing value-added services that relate to distributed energy resources, identify the "openness" of customer information and utility networks necessary to facilitate change, and discuss the relative risks, and the shifting of risks, for utilities and customers.

  15. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    Science.gov (United States)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding complex 3D geological models to be considered alongside available monitoring and experimental data in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for an integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions can be integrated in the coupled simulations, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.

  16. The utility of Earth system Models of Intermediate Complexity

    NARCIS (Netherlands)

    Weber, S.L.

    2010-01-01

    Intermediate-complexity models are models which describe the dynamics of the atmosphere and/or ocean in less detail than conventional General Circulation Models (GCMs). At the same time, they go beyond the approach taken by atmospheric Energy Balance Models (EBMs) or ocean box models by

  17. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  18. Grid-connection of large offshore windfarms utilizing VSC-HVDC: Modeling and grid impact

    DEFF Research Database (Denmark)

    Xue, Yijing; Akhmatov, Vladislav

    2009-01-01

    Utilization of Voltage Source Converter (VSC) – High Voltage Direct Current (HVDC) systems for grid-connection of large offshore windfarms becomes relevant as installed power capacities as well as distances to the connection points of the on-land transmission systems increase. At the same time, the grid code requirements of the Transmission System Operators (TSO), including ancillary system services and Low-Voltage Fault-Ride-Through (LVFRT) capability of large offshore windfarms, become more demanding. This paper presents a general-level model of and a LVFRT solution for a VSC-HVDC system for grid-connection of large offshore windfarms. The VSC-HVDC model is implemented using a general approach of independent control of active and reactive power in normal operation situations. The on-land VSC inverter, which is also called a grid-side inverter, provides voltage support to the transmission ...

  19. Modeling energy flexibility of low energy buildings utilizing thermal mass

    DEFF Research Database (Denmark)

    Foteinaki, Kyriaki; Heller, Alfred; Rode, Carsten

    2016-01-01

    ... to match the production patterns, shifting demand from on-peak hours to off-peak hours. Buildings could act as flexibility suppliers to the energy system, through load shifting potential, provided that the large thermal mass of the building stock could be utilized for energy storage. In the present study the load shifting potential of an apartment of a low energy building in Copenhagen is assessed, utilizing the heat storage capacity of the thermal mass when the heating system is switched off for relieving the energy system. It is shown that when using a 4-hour preheating period before switching off ..., the ... of the external envelope and the thermal capacity of the internal walls are the main parameters that affect the load shifting potential of the apartment.
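The load-shifting mechanism, preheat off-peak and coast through the peak on stored heat, can be sketched with a toy first-order RC building model. All parameters below are assumptions for illustration, not values from the study.

```python
# Toy lumped RC thermal model: preheat for 4 hours, then switch heating off
# and let the thermal mass carry the indoor temperature through peak hours.

C = 40e6          # lumped thermal capacitance, J/K (assumption)
R = 0.005         # envelope thermal resistance, K/W (assumption)
t_out = 0.0       # outdoor temperature, deg C
dt = 3600         # one-hour time step, s

def step(t_in, heat_w):
    """Explicit Euler update of the indoor temperature."""
    return t_in + dt * (heat_w - (t_in - t_out) / R) / C

t_in = 21.0
for _ in range(4):                # 4-hour preheating at 6 kW
    t_in = step(t_in, 6000.0)
peak_start = t_in
for _ in range(4):                # heating off during on-peak hours
    t_in = step(t_in, 0.0)
print(round(peak_start, 2), round(t_in, 2))
```

With these parameters the apartment gains well under a degree during preheating and loses only about a degree and a half over four unheated hours, which is the comfort-band behaviour that makes the shift feasible; envelope resistance R and capacitance C play the roles the study identifies as decisive.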

  20. A knowledge based model of electric utility operations. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-08-11

    This report consists of an appendix to provide a documentation and help capability for an analyst using the developed expert system of electric utility operations running in CLIPS. This capability is provided through a separate package running under the WINDOWS Operating System and keyed to provide displays of text, graphics and mixed text and graphics that explain and elaborate on the specific decisions being made within the knowledge based expert system.

  1. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  2. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    ... with simple unified modelling language (UML) requirements models, it is not easy for the development team to gain confidence in the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow ...

  3. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the use of tools without a clear meta-model and semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. Finally, the profile is used to create an example model of an embedded system requirements specification.

  4. FadD is required for utilization of endogenous fatty acids released from membrane lipids.

    Science.gov (United States)

    Pech-Canul, Ángel; Nogales, Joaquina; Miranda-Molina, Alfonso; Álvarez, Laura; Geiger, Otto; Soto, María José; López-Lara, Isabel M

    2011-11-01

    FadD is an acyl coenzyme A (CoA) synthetase responsible for the activation of exogenous long-chain fatty acids (LCFA) into acyl-CoAs. Mutation of fadD in the symbiotic nitrogen-fixing bacterium Sinorhizobium meliloti promotes swarming motility and leads to defects in nodulation of alfalfa plants. In this study, we found that S. meliloti fadD mutants accumulated a mixture of free fatty acids during the stationary phase of growth. The composition of the free fatty acid pool and the results obtained after specific labeling of esterified fatty acids with a Δ5-desaturase (Δ5-Des) were in agreement with membrane phospholipids being the origin of the released fatty acids. Escherichia coli fadD mutants also accumulated free fatty acids released from membrane lipids in the stationary phase. This phenomenon did not occur in a mutant of E. coli with a deficient FadL fatty acid transporter, suggesting that the accumulation of fatty acids in fadD mutants occurs inside the cell. Our results indicate that, besides the activation of exogenous LCFA, in bacteria FadD plays a major role in the activation of endogenous fatty acids released from membrane lipids. Furthermore, expression analysis performed with S. meliloti revealed that a functional FadD is required for the upregulation of genes involved in fatty acid degradation and suggested that in the wild-type strain, the fatty acids released from membrane lipids are degraded by β-oxidation in the stationary phase of growth.

  5. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

    Science.gov (United States)

    Fortier, Stephen C.; Volk, Jennifer H.

    The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “systems of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper discusses the need for defining requirements for components of the ad hoc coalitions that are formed to respond to disasters. A goal of our effort was to develop a proof of concept showing that applying information modeling to the business processes used to protect an enterprise and mitigate its potential losses is feasible. These activities would be modeled both pre- and post-incident.

  6. TonB Energy Transduction Systems of Riemerella anatipestifer Are Required for Iron and Hemin Utilization.

    Science.gov (United States)

    Liao, HeBin; Cheng, XingJun; Zhu, DeKang; Wang, MingShu; Jia, RenYong; Chen, Shun; Chen, XiaoYue; Biville, Francis; Liu, MaFeng; Cheng, AnChun

    2015-01-01

    Riemerella anatipestifer (R. anatipestifer) is one of the most important pathogens in ducks. The bacterium causes acute or chronic septicemia characterized by fibrinous pericarditis and meningitis. The R. anatipestifer genome encodes multiple iron/hemin-uptake systems that facilitate adaptation to iron-limited host environments. These systems include several TonB-dependent transporters and three TonB proteins responsible for energy transduction. These three tonB genes are present in all the R. anatipestifer genomes sequenced so far. Two of these genes are contained within the exbB-exbD-tonB1 and exbB-exbD-exbD-tonB2 operons. The third, tonB3, forms a monocistronic transcription unit. The inability to recover derivatives deleted for this gene suggests its product is essential for R. anatipestifer growth. Here, we show that deletion of tonB1 had no effect on hemin uptake of R. anatipestifer, though disruption of tonB2 strongly decreases hemin uptake, and disruption of both tonB1 and tonB2 abolishes the transport of exogenously added hemin. The ability of R. anatipestifer to grow on iron-depleted medium is decreased by tonB2 but not tonB1 disruption. When expressed in an E. coli model strain, the TonB1 complex, TonB2 complex, and TonB3 protein from R. anatipestifer cannot energize heterologous hemin transporters. Further, only the TonB1 complex can energize a R. anatipestifer hemin transporter when co-expressed in an E. coli model strain.

  7. TonB Energy Transduction Systems of Riemerella anatipestifer Are Required for Iron and Hemin Utilization.

    Directory of Open Access Journals (Sweden)

    HeBin Liao

    Full Text Available Riemerella anatipestifer (R. anatipestifer) is one of the most important pathogens in ducks. The bacterium causes acute or chronic septicemia characterized by fibrinous pericarditis and meningitis. The R. anatipestifer genome encodes multiple iron/hemin-uptake systems that facilitate adaptation to iron-limited host environments. These systems include several TonB-dependent transporters and three TonB proteins responsible for energy transduction. These three tonB genes are present in all the R. anatipestifer genomes sequenced so far. Two of these genes are contained within the exbB-exbD-tonB1 and exbB-exbD-exbD-tonB2 operons. The third, tonB3, forms a monocistronic transcription unit. The inability to recover derivatives deleted for this gene suggests its product is essential for R. anatipestifer growth. Here, we show that deletion of tonB1 had no effect on hemin uptake of R. anatipestifer, though disruption of tonB2 strongly decreases hemin uptake, and disruption of both tonB1 and tonB2 abolishes the transport of exogenously added hemin. The ability of R. anatipestifer to grow on iron-depleted medium is decreased by tonB2 but not tonB1 disruption. When expressed in an E. coli model strain, the TonB1 complex, TonB2 complex, and TonB3 protein from R. anatipestifer cannot energize heterologous hemin transporters. Further, only the TonB1 complex can energize a R. anatipestifer hemin transporter when co-expressed in an E. coli model strain.

  8. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  9. The Ustilago maydis Nit2 homolog regulates nitrogen utilization and is required for efficient induction of filamentous growth.

    Science.gov (United States)

    Horst, Robin J; Zeh, Christine; Saur, Alexandra; Sonnewald, Sophia; Sonnewald, Uwe; Voll, Lars M

    2012-03-01

    Nitrogen catabolite repression (NCR) is a regulatory strategy found in microorganisms that restricts the utilization of complex and unfavored nitrogen sources in the presence of favored nitrogen sources. In fungi, this concept has been best studied in yeasts and filamentous ascomycetes, where the GATA transcription factors Gln3p and Gat1p (in yeasts) and Nit2/AreA (in ascomycetes) constitute the main positive regulators of NCR. The reason why functional Nit2 homologs of some phytopathogenic fungi are required for full virulence in their hosts has remained elusive. We have identified the Nit2 homolog in the basidiomycetous phytopathogen Ustilago maydis and show that it is a major, but not the exclusive, positive regulator of nitrogen utilization. By transcriptome analysis of sporidia grown on artificial media devoid of favored nitrogen sources, we show that only a subset of nitrogen-responsive genes are regulated by Nit2, including the Gal4-like transcription factor Ton1 (a target of Nit2). Ustilagic acid biosynthesis is not under the control of Nit2, while nitrogen starvation-induced filamentous growth is largely dependent on functional Nit2. nit2 deletion mutants show the delayed initiation of filamentous growth on maize leaves and exhibit strongly compromised virulence, demonstrating that Nit2 is required to efficiently initiate the pathogenicity program of U. maydis.

  10. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    ...-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform such a model automatically into programs that essentially will replace customisation and localisation by configuration, by changing parameters in the model. In particular, we: (1) identify a number of requirements for such modeling, including requirements for the underlying logic; (2) model salient parts ...

  11. National coal utilization assessment: modeling long-term coal production with the Argonne coal market model

    Energy Technology Data Exchange (ETDEWEB)

    Dux, C.D.; Kroh, G.C.; VanKuiken, J.C.

    1977-08-01

    The Argonne Coal Market Model was developed as part of the National Coal Utilization Assessment, a comprehensive study of coal-related environmental, health, and safety impacts. The model was used to generate long-term coal market scenarios that became the basis for comparing the impacts of coal-development options. The model has a relatively high degree of regional detail concerning both supply and demand. Coal demands are forecast by a combination of trend and econometric analysis and then input exogenously into the model. Coal supply in each region is characterized by a linearly increasing function relating increments of new mine capacity to the marginal cost of extraction. Rail-transportation costs are econometrically estimated for each supply-demand link. A quadratic programming algorithm is used to calculate flow patterns that minimize consumer costs for the system.
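
    The quadratic programming step described above can be sketched as follows. With a linearly increasing marginal extraction cost in each supply region, total extraction cost is quadratic in regional output, so minimizing consumer cost over the supply-demand links is a small convex QP. The regions, rail costs, and supply-curve coefficients below are invented for illustration and are not from the Argonne model.

```python
# Toy sketch of a coal-flow QP: minimize extraction + rail transport cost
# subject to meeting regional demand. All numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

rail = np.array([[4.0, 7.0],       # $/ton from supply region i to demand region j
                 [6.0, 3.0]])
a = np.array([10.0, 12.0])         # base marginal extraction cost ($/ton)
b = np.array([0.02, 0.015])        # slope of the marginal cost curve
demand = np.array([300.0, 400.0])  # tons required in each demand region

def total_cost(x):
    f = x.reshape(2, 2)                           # f[i, j] = flow i -> j
    q = f.sum(axis=1)                             # total output per supply region
    extraction = np.sum(a * q + 0.5 * b * q**2)   # integral of linear marginal cost
    transport = np.sum(rail * f)
    return extraction + transport

# One equality constraint per demand region: inflows must equal demand.
cons = [{"type": "eq", "fun": lambda x, j=j: x.reshape(2, 2)[:, j].sum() - demand[j]}
        for j in range(2)]
res = minimize(total_cost, x0=np.full(4, 100.0), bounds=[(0, None)] * 4,
               constraints=cons, method="SLSQP")
flows = res.x.reshape(2, 2)
```

    In the full model this structure is solved per scenario year, with the demand vector supplied exogenously from the trend and econometric forecasts.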

  12. Estimating Independent Locally Shifted Random Utility Models for Ranking Data

    Science.gov (United States)

    Lam, Kar Yin; Koning, Alex J.; Franses, Philip Hans

    2011-01-01

    We consider the estimation of probabilistic ranking models in the context of conjoint experiments. By using approximate rather than exact ranking probabilities, we avoided the computation of high-dimensional integrals. We extended the approximation technique proposed by Henery (1981) in the context of the Thurstone-Mosteller-Daniels model to any…

  13. Recent advances in modeling nutrient utilization in ruminants

    NARCIS (Netherlands)

    Kebreab, E.; Dijkstra, J.; Bannink, A.; France, J.

    2009-01-01

    Mathematical modeling techniques have been applied to study various aspects of the ruminant, such as rumen function, post-absorptive metabolism and product composition. This review focuses on advances made in modeling rumen fermentation and its associated rumen disorders, and energy and nutrient utilization.

  14. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  15. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  16. Surplus thermal energy model of greenhouses and coefficient analysis for effective utilization

    Directory of Open Access Journals (Sweden)

    Seung-Hwan Yang

    2016-03-01

    Full Text Available If a greenhouse in the temperate and subtropical regions is maintained in a closed condition, the indoor temperature commonly exceeds that required for optimal plant growth, even in the cold season. This study considered this excess energy as surplus thermal energy (STE, which can be recovered, stored and used when heating is necessary. To use the STE economically and effectively, the amount of STE must be estimated before designing a utilization system. Therefore, this study proposed an STE model using energy balance equations for the three steps of the STE generation process. The coefficients in the model were determined by the results of previous research and experiments using the test greenhouse. The proposed STE model produced monthly errors of 17.9%, 10.4% and 7.4% for December, January and February, respectively. Furthermore, the effects of the coefficients on the model accuracy were revealed by the estimation error assessment and linear regression analysis through fixing dynamic coefficients. A sensitivity analysis of the model coefficients indicated that the coefficients have to be determined carefully. This study also provides effective ways to increase the amount of STE.

  17. Surplus thermal energy model of greenhouses and coefficient analysis for effective utilization

    Energy Technology Data Exchange (ETDEWEB)

    Yang, S.H.; Son, J.E.; Lee, S.D.; Cho, S.I.; Ashtiani-Araghi, A.; Rhee, J.Y.

    2016-11-01

    If a greenhouse in the temperate and subtropical regions is maintained in a closed condition, the indoor temperature commonly exceeds that required for optimal plant growth, even in the cold season. This study considered this excess energy as surplus thermal energy (STE), which can be recovered, stored and used when heating is necessary. To use the STE economically and effectively, the amount of STE must be estimated before designing a utilization system. Therefore, this study proposed an STE model using energy balance equations for the three steps of the STE generation process. The coefficients in the model were determined by the results of previous research and experiments using the test greenhouse. The proposed STE model produced monthly errors of 17.9%, 10.4% and 7.4% for December, January and February, respectively. Furthermore, the effects of the coefficients on the model accuracy were revealed by the estimation error assessment and linear regression analysis through fixing dynamic coefficients. A sensitivity analysis of the model coefficients indicated that the coefficients have to be determined carefully. This study also provides effective ways to increase the amount of STE. (Author)
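
    The abstract does not reproduce the three-step energy balance equations, but the underlying idea can be illustrated with a generic single-step balance (not the paper's model): surplus accrues whenever combined solar and internal gains exceed envelope losses at the growth setpoint. All symbols and values below are illustrative assumptions.

```python
# Generic hourly surplus thermal energy (STE) balance for a closed greenhouse.
# Not the paper's three-step model; parameters are invented for illustration.
def hourly_ste(q_solar, q_internal, u_value, area, t_in_set, t_out):
    """Return storable surplus heat (W) for one hour of operation."""
    q_loss = u_value * area * (t_in_set - t_out)  # envelope conduction loss, W
    surplus = q_solar + q_internal - q_loss       # net heat balance, W
    return max(surplus, 0.0)                      # only excess heat is recoverable

# One sunny winter hour: strong solar gain, mild envelope loss.
ste = hourly_ste(q_solar=18000.0, q_internal=2000.0,
                 u_value=4.0, area=500.0, t_in_set=20.0, t_out=12.0)
```

    Summing such hourly terms over a month gives the quantity against which the paper's monthly error percentages (17.9%, 10.4%, 7.4%) would be assessed.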

  18. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussion elicited a set of requirements, which were summarized in a form on which the attendees voted for their highest-priority requirements. These votes were used to determine the prioritized requirements reported in this paper, which can be used to direct future developments.

  19. Rodent models of cardiopulmonary bypass: utility in improving perioperative outcomes

    NARCIS (Netherlands)

    de Lange, F.

    2008-01-01

    Despite advances in surgical and anesthesia techniques, subtle neurologic injury still remains an important complication after cardiac surgery. Because the causes are multifactorial and complex, research in an appropriate small animal model for cardiopulmonary bypass (CPB) is warranted. This thesis

  20. Kinetic models of cell growth, substrate utilization and bio ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-05-02

    May 2, 2008 ... A simple model was proposed using the Logistic Equation for growth, the Leudeking-Piret equation ... (melanoidin), which may create many problems and also ... where the constant µ is defined as the specific growth rate (Equation 1) ...
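
    The Logistic Equation named in the abstract is dX/dt = µX(1 − X/Xmax), where X is biomass and Xmax the carrying capacity. A minimal sketch of integrating it, with illustrative parameter values not taken from the paper:

```python
# Euler integration of logistic cell growth: dX/dt = mu * X * (1 - X/Xmax).
# Parameter values are invented for demonstration only.
def logistic_growth(x0, mu, x_max, dt=0.01, t_end=50.0):
    x, t = x0, 0.0
    while t < t_end:
        x += dt * mu * x * (1.0 - x / x_max)  # logistic growth increment
        t += dt
    return x

# Biomass saturates at the carrying capacity x_max for large t.
final = logistic_growth(x0=0.1, mu=0.5, x_max=10.0)
```

    The Leudeking-Piret equation then relates product formation to this growth term, dP/dt = α·dX/dt + β·X, with growth-associated (α) and non-growth-associated (β) coefficients.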

  1. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-02-27

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository.

  2. Animal models of obsessive–compulsive disorder: utility and limitations

    Science.gov (United States)

    Alonso, Pino; López-Solà, Clara; Real, Eva; Segalàs, Cinto; Menchón, José Manuel

    2015-01-01

    Obsessive–compulsive disorder (OCD) is a disabling and common neuropsychiatric condition of poorly known etiology. Many attempts have been made in the last few years to develop animal models of OCD with the aim of clarifying the genetic, neurochemical, and neuroanatomical basis of the disorder, as well as of developing novel pharmacological and neurosurgical treatments that may help to improve the prognosis of the illness. The latter goal is particularly important given that around 40% of patients with OCD do not respond to currently available therapies. This article summarizes strengths and limitations of the leading animal models of OCD including genetic, pharmacologically induced, behavioral manipulation-based, and neurodevelopmental models according to their face, construct, and predictive validity. On the basis of this evaluation, we discuss that currently labeled “animal models of OCD” should be regarded not as models of OCD but, rather, as animal models of different psychopathological processes, such as compulsivity, stereotypy, or perseverance, that are present not only in OCD but also in other psychiatric or neurological disorders. Animal models might constitute a challenging approach to study the neural and genetic mechanism of these phenomena from a trans-diagnostic perspective. Animal models are also of particular interest as tools for developing new therapeutic options for OCD, with the greatest convergence focusing on the glutamatergic system, the role of ovarian and related hormones, and the exploration of new potential targets for deep brain stimulation. Finally, future research on neurocognitive deficits associated with OCD through the use of analogous animal tasks could also provide a genuine opportunity to disentangle the complex etiology of the disorder. PMID:26346234

  3. Utilization of remote sensing observations in hydrologic models

    Science.gov (United States)

    Ragan, R. M.

    1977-01-01

    Most of the remote sensing related work in hydrologic modeling has centered on modifying existing models to take advantage of the capabilities of new sensor techniques. There has been enough success with this approach to ensure that remote sensing is a powerful tool in modeling the watershed processes. Unfortunately, many of the models in use were designed without recognizing the growth of remote sensing technology. Thus, their parameters were selected to be definable from maps or by field crews. It is believed that the real benefits will come through the evolution of new models having new parameters that are developed specifically to take advantage of our capabilities in remote sensing. The ability to define hydrologically active areas could have a significant impact. The ability to define soil moisture and the evolution of new techniques to estimate evapotranspiration could significantly modify our approach to hydrologic modeling. Still, without a major educational effort to develop an understanding of the techniques used to extract parameter estimates from remote sensing data, the potential offered by this new technology will not be achieved.

  4. Tritium Specific Adsorption Simulation Utilizing the OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica Rutledge; Lawrence Tavlarides; Ronghong Lin; Austin Ladshaw

    2013-09-01

    During the processing of used nuclear fuel, volatile radionuclides will be discharged to the atmosphere if no recovery processes are in place to limit their release. The volatile radionuclides of concern are 3H, 14C, 85Kr, and 129I. Methods are being developed, via adsorption and absorption unit operations, to capture these radionuclides. It is necessary to model these unit operations to aid in the evaluation of technologies and in the future development of an advanced used nuclear fuel processing plant. A collaboration between Fuel Cycle Research and Development Offgas Sigma Team member INL and a NEUP grant including ORNL, Syracuse University, and Georgia Institute of Technology has been formed to develop off-gas models and support off-gas research. This report discusses the development of a tritium-specific adsorption model. By integrating the OSPREY model with a fundamental-level isotherm model developed under the NEUP grant, and using experimental data provided by the grant, the tritium-specific adsorption model was developed.
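
    The report does not specify which isotherm form was used; as an illustration of what a "fundamental-level isotherm model" supplies to a column model like OSPREY, here is a Langmuir-type equilibrium loading function. The functional form and parameter values are assumptions for demonstration, not the report's model.

```python
# Illustrative Langmuir isotherm: equilibrium loading as a function of
# partial pressure. Parameters q_max and k are invented for this sketch.
def langmuir_loading(p, q_max, k):
    """Equilibrium loading q (mol/kg) at partial pressure p (Pa)."""
    return q_max * k * p / (1.0 + k * p)

# At k*p = 1 the bed is at half its saturation capacity.
q = langmuir_loading(p=1000.0, q_max=2.0, k=0.001)
```

    An adsorption column model then couples such an equilibrium relation with mass-transfer and transport equations to predict breakthrough behavior.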

  5. Dispersion modeling of accidental releases of toxic gases - Comparison of the models and their utility for the fire brigades.

    Science.gov (United States)

    Stenzel, S.; Baumann-Stanzer, K.

    2009-04-01

    In the case of an accidental release of hazardous gases into the atmosphere, emergency responders need a reliable and fast tool to assess the possible consequences and apply the optimal countermeasures. For hazard prediction and simulation of the hazard zones, a number of air dispersion models are available. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for displaying the results; they are easy to use and can operate fast and effectively during stress situations. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, as well as for real-time risk assessment and management. There are also possibilities for coupling models directly to automatic meteorological stations, in order to avoid uncertainties in the model output due to insufficient or incorrect meteorological data. Another key problem in coping with accidental toxic releases is the relatively wide spectrum of regulations and threshold values, like IDLH, ERPG, AEGL, MAK etc., and the different criteria for their application. Since particular emergency responders and organizations require different regulations and values for their purposes, it is quite difficult to predict the individual hazard areas. A number of research studies and investigations address the problem; in any case, the final decision is up to the authorities.
The research project RETOMOD (reference scenarios calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Vienna fire brigade, OMV Refining & Marketing GmbH and

  6. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  7. Inferring Requirement Goals from Model Implementing in UML

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    UML is widely used in many software development processes. However, it does not make requirement goals explicit. This paper presents a method intended to establish the semantic relationship between requirement goals and UML models. Before the method is introduced, some relevant concepts are described.

  8. Predictive Modeling of Defibrillation utilizing Hexahedral and Tetrahedral Finite Element Models: Recent Advances

    Science.gov (United States)

    Triedman, John K.; Jolley, Matthew; Stinstra, Jeroen; Brooks, Dana H.; MacLeod, Rob

    2008-01-01

    ICD implants may be complicated by body size and anatomy. One approach to this problem has been the adoption of creative, extracardiac implant strategies using standard ICD components. Because data on the safety or efficacy of such ad hoc implant strategies are lacking, we have developed image-based finite element models (FEMs) to compare electric fields and expected defibrillation thresholds (DFTs) using standard and novel electrode locations. In this paper, we review recently published studies by our group using such models, and progress in meshing strategies to improve efficiency and visualization. Our preliminary observations predict that there may be large changes in DFTs with clinically relevant variations of electrode placement. Extracardiac ICDs of various lead configurations are predicted to be effective in both children and adults. This approach may aid both ICD development and patient-specific optimization of electrode placement, but the simplified nature of current models dictates further development and validation prior to clinical or industrial utilization. PMID:18817926

  9. Utilization-Based Modeling and Optimization for Cognitive Radio Networks

    Science.gov (United States)

    Liu, Yanbing; Huang, Jun; Liu, Zhangxiong

    The cognitive radio technique promises to manage and allocate the scarce radio spectrum in highly varying and disparate modern environments. This paper considers a cognitive radio scenario composed of two queues, one for the primary (licensed) users and one for the cognitive (unlicensed) users. The system state equations are derived from the underlying Markov process, and an optimization model for the system is proposed. Next, the system performance is evaluated by calculations that demonstrate the soundness of the model. Furthermore, the effects of different system parameters are discussed on the basis of the experimental results.
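
The paper's exact state equations are not reproduced in this record, but the general technique of deriving long-run occupancy from a Markov model can be sketched. The chain below, with its state labels and rates, is invented for illustration and is not taken from the paper.

```python
import numpy as np

# Hypothetical 3-state continuous-time Markov chain for a single channel:
# 0 = idle, 1 = held by a primary user, 2 = held by a cognitive user.
# Off-diagonal entries are transition rates; rows of Q sum to zero.
Q = np.array([
    [-1.2,  0.8,  0.4],   # idle -> primary (0.8), idle -> cognitive (0.4)
    [ 0.5, -0.5,  0.0],   # primary user departs at rate 0.5
    [ 0.9,  0.0, -0.9],   # cognitive user vacates at rate 0.9
])

def steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 by replacing one balance
    equation with the normalization constraint."""
    n = Q.shape[0]
    A = Q.T.copy()
    A[-1, :] = 1.0            # normalization row
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

pi = steady_state(Q)
print(pi)  # long-run fraction of time in each channel state
```

Performance measures such as blocking probability or spectrum utilization then follow directly from the steady-state vector.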

  10. User-owned utility models for rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Waddle, D.

    1997-12-01

    The author discusses the history of rural electric cooperatives (RECs) in the United States, and the broader question of whether such organizations can serve as a model for rural electrification in other countries. The author points out the features of such cooperatives that have given them stability and strength, and emphasizes that for such programs to succeed, many of these same features must be present. He feels the cooperative model is not outdated, but that it needs strong local support and a governmental structure that is supportive, or at least not negative.

  11. Recursive inter-generational utility in global climate risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Minh, Ha-Duong [Centre International de Recherche sur l' Environnement et le Developpement (CIRED-CNRS), 75 - Paris (France); Treich, N. [Institut National de Recherches Agronomiques (INRA-LEERNA), 31 - Toulouse (France)

    2003-07-01

    This paper distinguishes relative risk aversion and resistance to inter-temporal substitution in climate risk modeling. Stochastic recursive preferences are introduced in a stylized numeric climate-economy model using preliminary IPCC 1998 scenarios. It shows that higher risk aversion increases the optimal carbon tax. Higher resistance to inter-temporal substitution alone has the same effect as increasing the discount rate, provided that the risk is not too large. We discuss implications of these findings for the debate upon discounting and sustainability under uncertainty. (author)
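
The separation of risk aversion from resistance to intertemporal substitution described above is usually formalized with Kreps-Porteus/Epstein-Zin recursive preferences; a standard form (the paper's exact specification may differ) is:

```latex
V_t = \Big[ (1-\beta)\, c_t^{\,1-\rho}
      + \beta \big( \mathbb{E}_t\!\left[ V_{t+1}^{\,1-\gamma} \right] \big)^{\frac{1-\rho}{1-\gamma}} \Big]^{\frac{1}{1-\rho}}
```

Here β is the discount factor, γ the coefficient of relative risk aversion, and ρ the resistance to intertemporal substitution (the inverse of the elasticity of intertemporal substitution). When γ = ρ the recursion collapses to standard discounted expected utility, which is why the two attitudes cannot be separated in that special case.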

  12. Transgenic models of Alzheimer's disease: better utilization of existing models through viral transgenesis.

    Science.gov (United States)

    Platt, Thomas L; Reeves, Valerie L; Murphy, M Paul

    2013-09-01

    Animal models have been used for decades in the Alzheimer's disease (AD) research field and have been crucial for the advancement of our understanding of the disease. Most models are based on familial AD mutations of genes involved in the amyloidogenic process, such as the amyloid precursor protein (APP) and presenilin 1 (PS1). Some models also incorporate mutations in tau (MAPT) known to cause frontotemporal dementia, a neurodegenerative disease that shares some elements of neuropathology with AD. While these models are complex, they fail to display pathology that perfectly recapitulates that of the human disease. Unfortunately, this level of pre-existing complexity creates a barrier to the further modification and improvement of these models. However, as the efficacy and safety of viral vectors improves, their use as an alternative to germline genetic modification is becoming a widely used research tool. In this review we discuss how this approach can be used to better utilize common mouse models in AD research. This article is part of a Special Issue entitled: Animal Models of Disease.

  13. On the Utility of Island Models in Dynamic Optimization

    DEFF Research Database (Denmark)

    Lissovoi, Andrei; Witt, Carsten

    2015-01-01

    to λ=O(n^(1-ε)), the (1+λ) EA is still not able to track the optimum of Maze. If the migration interval is increased, the algorithm is able to track the optimum even for logarithmic λ. Finally, the relationship of τ, λ, and the ability of the island model to track the optimum is investigated more closely....

  14. Utility covariances and context effects in conjoint MNP models

    NARCIS (Netherlands)

    Haaijer, M.E.; Wedel, M.; Vriens, M.; Wansbeek, T.J.

    1998-01-01

    Experimental conjoint choice analysis is among the most frequently used methods for measuring and analyzing consumer preferences. The data from such experiments have been typically analyzed with the Multinomial Logit (MNL) model. However, there are several problems associated with the standard MNL

  16. Animal models of β-hemoglobinopathies: utility and limitations

    Directory of Open Access Journals (Sweden)

    McColl B

    2016-11-01

    Full Text Available Bradley McColl, Jim Vadolas Cell and Gene Therapy Laboratory, Murdoch Childrens Research Institute, Royal Children’s Hospital, Parkville, VIC, Australia Abstract: The structural and functional conservation of hemoglobin throughout mammals has made the laboratory mouse an exceptionally useful organism in which to study both the protein and the individual globin genes. Early researchers looked to the globin genes as an excellent model in which to examine gene regulation – bountifully expressed and displaying a remarkably consistent pattern of developmental activation and silencing. In parallel with the growth of research into expression of the globin genes, mutations within the β-globin gene were identified as the cause of the β-hemoglobinopathies such as sickle cell disease and β-thalassemia. These lines of enquiry stimulated the development of transgenic mouse models, first carrying individual human globin genes and then substantial human genomic fragments incorporating the multigenic human β-globin locus and regulatory elements. Finally, mice were devised carrying mutant human β-globin loci on genetic backgrounds deficient in the native mouse globins, resulting in phenotypes of sickle cell disease or β-thalassemia. These years of work have generated a group of model animals that display many features of the β-hemoglobinopathies and provided enormous insight into the mechanisms of gene regulation. Substantive differences in the expression of human and mouse globins during development have also come to light, revealing the limitations of the mouse model, but also providing opportunities to further explore the mechanisms of globin gene regulation. In addition, animal models of β-hemoglobinopathies have demonstrated the feasibility of gene therapy for these conditions, now showing success in human clinical trials. Such models remain in use to dissect the molecular events of globin gene regulation and to identify novel treatments based

  17. An Algebraic Graphical Model for Decision with Uncertainties, Feasibilities, and Utilities

    CERN Document Server

    Pralet, C; Verfaillie, G; 10.1613/jair.2151

    2011-01-01

    Numerous formalisms and dedicated algorithms have been designed in the last decades to model and solve decision making problems. Some formalisms, such as constraint networks, can express "simple" decision problems, while others are designed to take into account uncertainties, unfeasible decisions, and utilities. Even in a single formalism, several variants are often proposed to model different types of uncertainty (probability, possibility...) or utility (additive or not). In this article, we introduce an algebraic graphical model that encompasses a large number of such formalisms: (1) we first adapt previous structures from Friedman, Chu and Halpern for representing uncertainty, utility, and expected utility in order to deal with generic forms of sequential decision making; (2) on these structures, we then introduce composite graphical models that express information via variables linked by "local" functions, thanks to conditional independence; (3) on these graphical models, we finally define a simple class ...

  18. Integrating utilization-focused evaluation with business process modeling for clinical research improvement.

    Science.gov (United States)

    Kagan, Jonathan M; Rosas, Scott; Trochim, William M K

    2010-10-01

    New discoveries in basic science are creating extraordinary opportunities to design novel biomedical preventions and therapeutics for human disease. But the clinical evaluation of these new interventions is, in many instances, being hindered by a variety of legal, regulatory, policy and operational factors, few of which enhance research quality, the safety of study participants or research ethics. With the goal of helping increase the efficiency and effectiveness of clinical research, we have examined how the integration of utilization-focused evaluation with elements of business process modeling can reveal opportunities for systematic improvements in clinical research. Using data from the NIH global HIV/AIDS clinical trials networks, we analyzed the absolute and relative times required to traverse defined phases associated with specific activities within the clinical protocol lifecycle. Using simple median duration and Kaplan-Meier survival analysis, we show how such time-based analyses can provide a rationale for the prioritization of research process analysis and re-engineering, as well as a means for statistically assessing the impact of policy modifications, resource utilization, re-engineered processes and best practices. Successfully applied, this approach can help researchers be more efficient in capitalizing on new science to speed the development of improved interventions for human disease.
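
The survival-analysis step mentioned above can be illustrated with a minimal Kaplan-Meier estimator over protocol-phase durations. The durations and censoring flags below are invented; the paper applies the method to NIH HIV/AIDS network protocol lifecycle data.

```python
# Minimal Kaplan-Meier estimator for protocol-phase durations (days).
def kaplan_meier(durations, observed):
    """Return [(time, survival)] at each observed completion time.
    observed[i] = 0 marks a censored (still pending) protocol."""
    events = sorted(set(t for t, o in zip(durations, observed) if o))
    s, curve = 1.0, []
    for t in events:
        d = sum(1 for ti, o in zip(durations, observed) if o and ti == t)
        n = sum(1 for ti in durations if ti >= t)   # still at risk at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve

durations = [30, 45, 45, 60, 90, 120]   # days to complete a phase
observed  = [1,  1,  1,  0,  1,  1]     # 0 = censored
for t, s in kaplan_meier(durations, observed):
    print(t, round(s, 3))
```

The median phase duration is the first time at which the survival curve drops to 0.5 or below (45 days in this toy data).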

  19. ASPEN+ and economic modeling of equine waste utilization for localized hot water heating via fast pyrolysis

    Science.gov (United States)

    ASPEN Plus based simulation models have been developed to design a pyrolysis process for the on-site production and utilization of pyrolysis oil from equine waste at the Equine Rehabilitation Center at Morrisville State College (MSC). The results indicate that utilization of all available Equine Reh...

  20. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required....... There are however many advantages that could be harvested from such knowledge like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  1. Towards a Formalized Ontology-Based Requirements Model

    Institute of Scientific and Technical Information of China (English)

    JIANG Dan-dong; ZHANG Shen-sheng; WANG Ying-lin

    2005-01-01

    The goal of this paper is to take a further step towards an ontological approach for representing requirements information. The motivation for ontologies is discussed, and definitions of ontology and requirements ontology are given. A collection of informal terms, covering four subject areas, is then presented, and the formalization process of the ontology is described: the underlying meta-ontology is determined and the formalized requirements ontology is analyzed. This formal ontology is built to serve as a basis for a requirements model. Finally, the implementation of the software system is given.

  2. Rodent models of diabetic nephropathy: their utility and limitations

    OpenAIRE

    Kitada M; Ogura Y; Koya D

    2016-01-01

    Munehiro Kitada,1,2 Yoshio Ogura,2 Daisuke Koya1,2 1Division of Anticipatory Molecular Food Science and Technology, Medical Research Institute, 2Department of Diabetology and Endocrinology, Kanazawa Medical University, Uchinada, Ishikawa, Japan Abstract: Diabetic nephropathy is the most common cause of end-stage renal disease. Therefore, novel therapies for the suppression of diabetic nephropathy must be developed. Rodent models are useful for elucidating the pathogenesis of diseases and test...

  3. Assessment of energy utilization and leakages in buildings with building information model energy

    Directory of Open Access Journals (Sweden)

    Egwunatum I. Samuel

    2017-03-01

    Full Text Available Given the ability of building information models (BIM) to serve as a multidisciplinary data repository, this study attempts to explore and exploit the sustainability value of BIM in delivering buildings that require less energy for operations, emit less carbon dioxide, and provide conducive living environments for occupants. This objective was attained by a critical and extensive literature review that covers the following: (1) building energy consumption, (2) building energy performance and analysis, and (3) BIM and energy assessment. Literature cited in this paper shows that linking an energy analysis tool with a BIM model has helped project design teams to predict and optimize energy consumption by conducting building energy performance analysis utilizing key performance indicators such as average thermal transmittance, resulting heat demand, lighting power, solar heat gains, and ventilation heat losses. An in-depth analysis was conducted on a completed BIM-integrated construction project, the Arboleda Project in the Dominican Republic, to validate the aforementioned findings. Results show that the BIM-based energy analysis helped the design team attain the world's first positive-energy building. This study concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results, and deliver energy-efficient buildings. This study further recommends that the adoption of level 2 BIM and BIM integration in energy optimization analysis be demanded by building regulatory agencies for all projects, regardless of procurement method (i.e., government funded or otherwise) or size.

  4. A transformation approach for collaboration based requirement models

    CERN Document Server

    Harbouche, Ahmed; Mokhtari, Aicha

    2012-01-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed finite state machines, from the global system requirements, expressed in an augmented UML Activity Diagram notation. The suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target design Meta-Model, and the definition of the rules that govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML activity diagrams (extended with collaborations) into system role behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  5. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, the drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
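
The quoted spray and drip figures imply very different monthly water volumes; a quick back-of-the-envelope check (hours in July taken as 31 × 24) is:

```python
# Values taken from the abstract above; monthly totals are derived,
# not reported figures.
hours_in_july = 31 * 24

spray_mm, spray_interval_h = 1.3, 24.6
drip_mm,  drip_interval_h  = 0.6, 45.6

spray_total = spray_mm * hours_in_july / spray_interval_h
drip_total  = drip_mm  * hours_in_july / drip_interval_h

print(round(drip_mm / spray_mm, 2))                 # per-occurrence ratio: 0.46
print(round(spray_total, 1), round(drip_total, 1))  # monthly totals (mm)
```

The 46% figure in the abstract refers to the per-occurrence amount; on a monthly basis the less frequent drip schedule applies roughly a quarter of the spray volume.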

  6. Innovative Product Design Based on Customer Requirement Weight Calculation Model

    Institute of Scientific and Technical Information of China (English)

    Chen-Guang Guo; Yong-Xian Liu; Shou-Ming Hou; Wei Wang

    2010-01-01

    In the process of product innovation and design, it is important for designers to find and capture the customer's focus through customer requirement weight calculation and ranking. Based on fuzzy set theory and Euclidean space distance, this paper puts forward a method for customer requirement weight calculation called the Euclidean space distance weighting ranking method. This method is used in the fuzzy analytic hierarchy process that satisfies the additive consistent fuzzy matrix. A model for the weight calculation steps is constructed; meanwhile, a product innovation design module based on the customer requirement weight calculation model is developed. Finally, combined with the example of titanium sponge production, the customer requirement weight calculation model is validated. Using the innovation design module, the structure of the titanium sponge reactor has been improved and made innovative.
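
As a rough illustration of distance-based weighting (a simplified stand-in, not the paper's fuzzy-AHP formulation), one can score each requirement against an ideal point and normalize the inverse distances; the requirement names and scores below are invented:

```python
import math

# Each requirement is scored by several evaluators on [0, 1]; a
# requirement's weight grows as its score vector approaches the
# all-maximum ideal vector.
scores = {
    "corrosion resistance": [0.9, 0.8, 0.9],
    "throughput":           [0.7, 0.6, 0.8],
    "maintenance cost":     [0.5, 0.4, 0.6],
}
ideal = [1.0, 1.0, 1.0]

def closeness(vec, ideal):
    d = math.dist(vec, ideal)   # Euclidean distance to the ideal point
    return 1.0 / (1.0 + d)      # closer -> larger

raw = {k: closeness(v, ideal) for k, v in scores.items()}
total = sum(raw.values())
weights = {k: r / total for k, r in raw.items()}
for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(name, round(w, 3))    # ranked customer requirement weights
```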

  7. Implications of workforce and financing changes for primary care practice utilization, revenue, and cost: a generalizable mathematical model for practice management.

    Science.gov (United States)

    Basu, Sanjay; Landon, Bruce E; Song, Zirui; Bitton, Asaf; Phillips, Russell S

    2015-02-01

    Primary care practice transformations require tools for policymakers and practice managers to understand the financial implications of workforce and reimbursement changes. To create a simulation model to understand how practice utilization, revenues, and expenses may change in the context of workforce and financing changes. We created a simulation model estimating clinic-level utilization, revenues, and expenses using user-specified or public input data detailing practice staffing levels, salaries and overhead expenditures, patient characteristics, clinic workload, and reimbursements. We assessed whether the model could accurately estimate clinic utilization, revenues, and expenses across the nation using labor compensation, medical expenditure, and reimbursements databases, as well as cost and revenue data from independent practices of varying size. We demonstrated the model's utility in a simulation of how utilization, revenue, and expenses would change after hiring a nurse practitioner (NP) compared with hiring a part-time physician. Modeled practice utilization and revenue closely matched independent national utilization and reimbursement data, disaggregated by patient age, sex, race/ethnicity, insurance status, and ICD diagnostic group; the model was able to estimate independent revenue and cost estimates, with highest accuracy among larger practices. A demonstration analysis revealed that hiring an NP to work independently with a subset of patients diagnosed with diabetes or hypertension could increase net revenues, if NP visits involve limited MD consultation or if NP reimbursement rates increase. A model of utilization, revenue, and expenses in primary care practices may help policymakers and managers understand the implications of workforce and financing changes.
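
A toy version of such a practice-finance simulation can be sketched in a few lines. The staffing levels, salaries, visit volumes and reimbursement rates below are invented placeholders, not the paper's calibrated national inputs.

```python
# Clinic-level utilization, revenue, and net income for a staffing scenario.
def practice_net(staff, revenue_per_visit, overhead, working_days=250):
    """staff: list of (visits_per_day, annual_salary), one per clinician."""
    visits = sum(v for v, _ in staff) * working_days
    revenue = visits * revenue_per_visit
    expenses = sum(s for _, s in staff) + overhead
    return visits, revenue, revenue - expenses

mds = [(20, 220_000)] * 3                          # three physicians
base = practice_net(mds, 100, 400_000)
# Scenario: add an NP seeing 16 visits/day at a lower salary,
# with slightly higher overhead for the extra exam room.
with_np = practice_net(mds + [(16, 110_000)], 100, 420_000)
print(base[2], with_np[2])                         # annual net income
```

In this toy scenario the NP addition raises net income; the paper's point is that the sign of such a difference depends on reimbursement rates and how much MD consultation the NP's visits require.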

  8. Dispersion modeling of accidental releases of toxic gases - utility for the fire brigades.

    Science.gov (United States)

    Stenzel, S.; Baumann-Stanzer, K.

    2009-09-01

    Several air dispersion models are available for prediction and simulation of the hazard areas associated with accidental releases of toxic gases. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for effective presentation of results. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, as well as for real-time risk assessment and management. The research project RETOMOD (reference scenario calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Viennese fire brigade, OMV Refining & Marketing GmbH and Synex Ries & Greßlehner GmbH. RETOMOD was funded by the KIRAS safety research program of the Austrian Ministry of Transport, Innovation and Technology (www.kiras.at). The main tasks of this project were: 1. Sensitivity study and optimization of the meteorological input for modeling of the hazard areas (human exposure) during accidental toxic releases. 2. Comparison of several model packages (based on reference scenarios) in order to estimate their utility for the fire brigades. For the purpose of our study the following models were tested and compared: ALOHA (Areal Location of Hazardous Atmospheres, EPA), MEMPLEX (Keudel av-Technik GmbH), Trace (Safer System), Breeze (Trinity Consulting) and SAM (Engineering office Lohmeyer). A set of reference scenarios for chlorine, ammonia, butane and petrol was processed with the models above in order to predict and estimate the human exposure during the event. Furthermore, the application of the observation-based analysis and forecasting system INCA, developed at the Central Institute for Meteorology and Geodynamics (ZAMG), in case of toxic release was

  9. Modeling the utility of binaural cues for underwater sound localization.

    Science.gov (United States)

    Schneider, Jennifer N; Lloyd, David R; Banks, Patchouly N; Mercado, Eduardo

    2014-06-01

    The binaural cues used by terrestrial animals for sound localization in azimuth may not always suffice for accurate sound localization underwater. The purpose of this research was to examine the theoretical limits of interaural timing and level differences available underwater using computational and physical models. A paired-hydrophone system was used to record sounds transmitted underwater and recordings were analyzed using neural networks calibrated to reflect the auditory capabilities of terrestrial mammals. Estimates of source direction based on temporal differences were most accurate for frequencies between 0.5 and 1.75 kHz, with greater resolution toward the midline (2°), and lower resolution toward the periphery (9°). Level cues also changed systematically with source azimuth, even at lower frequencies than expected from theoretical calculations, suggesting that binaural mechanical coupling (e.g., through bone conduction) might, in principle, facilitate underwater sound localization. Overall, the relatively limited ability of the model to estimate source position using temporal and level difference cues underwater suggests that animals such as whales may use additional cues to accurately localize conspecifics and predators at long distances. Copyright © 2014 Elsevier B.V. All rights reserved.
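
The scale of the underwater timing cues can be illustrated with Woodworth's far-field approximation, ITD = (d/2c)(θ + sin θ). The 0.3 m receiver separation is an assumed value for illustration, not the paper's hydrophone spacing.

```python
import math

# Interaural time difference (ITD) versus azimuth for a receiver pair
# of separation d.  Underwater sound speed (~1500 m/s) is roughly 4.4x
# that in air (~343 m/s), so underwater ITDs shrink accordingly.
def itd(az_deg, d=0.3, c=343.0):
    az = math.radians(az_deg)
    return (d / (2 * c)) * (az + math.sin(az))   # Woodworth approximation

for az in (0, 30, 60, 90):
    print(az,
          f"{itd(az) * 1e6:.0f} us in air",
          f"{itd(az, c=1500.0) * 1e6:.0f} us underwater")
```

The compressed range of underwater ITDs is one reason purely temporal cues may not suffice for localization, consistent with the model results above.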

  10. Evaluation of Foreign Exchange Risk Capital Requirement Models

    Directory of Open Access Journals (Sweden)

    Ricardo S. Maia Clemente

    2005-12-01

    Full Text Available This paper examines capital requirement for financial institutions in order to cover market risk stemming from exposure to foreign currencies. The models examined belong to two groups according to the approach involved: standardized and internal models. In the first group, we study the Basel model and the model adopted by the Brazilian legislation. In the second group, we consider the models based on the concept of value at risk (VaR. We analyze the single and the double-window historical model, the exponential smoothing model (EWMA and a hybrid approach that combines features of both models. The results suggest that the Basel model is inadequate to the Brazilian market, exhibiting a large number of exceptions. The model of the Brazilian legislation has no exceptions, though generating higher capital requirements than other internal models based on VaR. In general, VaR-based models perform better and result in less capital allocation than the standardized approach model applied in Brazil.
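
The exponential smoothing (EWMA) model mentioned above updates a variance estimate with exponentially decaying weights; a minimal one-day VaR sketch (RiskMetrics-style decay factor 0.94, invented return series and position size) is:

```python
import math

LAMBDA = 0.94   # decay factor commonly used for daily data
Z_99 = 2.326    # one-sided 99% normal quantile

def ewma_var(returns, position, lam=LAMBDA, z=Z_99):
    """99% one-day value-at-risk of a currency position via EWMA volatility."""
    var_t = returns[0] ** 2                  # seed the variance estimate
    for r in returns[1:]:
        var_t = lam * var_t + (1 - lam) * r ** 2
    return z * math.sqrt(var_t) * position

daily_returns = [0.001, -0.004, 0.002, -0.006, 0.003]
print(round(ewma_var(daily_returns, 1_000_000), 2))  # VaR in position currency
```

The capital requirement under a VaR-based internal model then scales with this figure, which is the sense in which such models can allocate less capital than the standardized approach.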

  11. UTILITY OF MECHANISTIC MODELS FOR DIRECTING ADVANCED SEPARATIONS RESEARCH & DEVELOPMENT ACTIVITIES: Electrochemically Modulated Separation Example

    Energy Technology Data Exchange (ETDEWEB)

    Schwantes, Jon M.

    2009-06-01

    The objective for this work was to demonstrate the utility of mechanistic computer models designed to simulate actinide behavior for use in efficiently and effectively directing advanced laboratory R&D activities associated with developing advanced separations methods.

  12. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors’ aim...... a holistic approach to eliciting, analyzing, and modelling socially-oriented requirements by combining a particular form of ethnographic technique, cultural probes, with Agent Oriented Software Engineering notations to model these requirements. This paper focuses on examining the value of maintaining...... of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio...

  13. Analysis and Flexible Structural Modeling for Oscillating Wing Utilizing Aeroelasticity

    Institute of Scientific and Technical Information of China (English)

    Shao Ke; Wu Zhigang; Yang Chao

    2008-01-01

    Making use of the modal characteristics of the natural vibration of a flexible structure to design an oscillating wing aircraft is proposed. A series of equations concerning the oscillating wing of flexible structures are derived. The kinetic equation for aerodynamic force coupled with elastic movement is set up, and relevant formulae are derived; the unsteady aerodynamic term in these formulae is revised. The design principle, design process and range of application of this oscillating wing analytical method are elaborated. A flexible structural oscillating wing model is set up, and the corresponding time response and frequency response analyses are conducted. The analytical results indicate that adopting the new-type driving approach for the oscillating wing will not incur flutter problems and will be able to produce propulsive force. Furthermore, it will consume much less power than a fixed wing generating the same lift.

  14. A customer satisfaction model for a utility service industry

    Science.gov (United States)

    Jamil, Jastini Mohd; Nawawi, Mohd Kamal Mohd; Ramli, Razamin

    2016-08-01

    This paper explores the effects of Image, Customer Expectation, Perceived Quality and Perceived Value on Customer Satisfaction, and investigates the effects of Image and Customer Satisfaction on Customer Loyalty for a mobile phone provider in Malaysia. The results of this research are based on data gathered online from international students at a public university in Malaysia. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used to analyze the data collected on the international students' perceptions. The results show that Image and Perceived Quality have a significant impact on Customer Satisfaction. Image and Customer Satisfaction were also found to be significantly related to Customer Loyalty. However, no significant impact was found between Customer Expectation and Customer Satisfaction, Perceived Value and Customer Satisfaction, or Customer Expectation and Perceived Value. We hope that the findings may assist the mobile phone provider in the production and promotion of their services.

  15. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use in business process simulation.

  16. Pseudomonas aeruginosa PA1006, which plays a role in molybdenum homeostasis, is required for nitrate utilization, biofilm formation, and virulence.

    Science.gov (United States)

    Filiatrault, Melanie J; Tombline, Gregory; Wagner, Victoria E; Van Alst, Nadine; Rumbaugh, Kendra; Sokol, Pam; Schwingel, Johanna; Iglewski, Barbara H

    2013-01-01

    Pseudomonas aeruginosa (Pae) is a clinically important opportunistic pathogen. Herein, we demonstrate that the PA1006 protein is critical for all nitrate reductase activities, growth as a biofilm in a continuous flow system, as well as virulence in mouse burn and rat lung model systems. Microarray analysis revealed that ΔPA1006 cells displayed extensive alterations in gene expression including nitrate-responsive, quorum sensing (including PQS production), and iron-regulated genes, as well as molybdenum cofactor and Fe-S cluster biosynthesis factors, members of the TCA cycle, and Type VI Secretion System components. Phenotype Microarray™ profiles of ΔPA1006 aerobic cultures using Biolog plates also revealed a reduced ability to utilize a number of TCA cycle intermediates as well as a failure to utilize xanthine as a sole source of nitrogen. As a whole, these data indicate that the loss of PA1006 confers extensive changes in Pae metabolism. Based upon homology of PA1006 to the E. coli YhhP protein and data from the accompanying study, loss of PA1006 persulfuration and/or molybdenum homeostasis are likely the cause of extensive metabolic alterations that impact biofilm development and virulence in the ΔPA1006 mutant.

  18. A Utility-Based Reputation Model for the Internet of Things

    OpenAIRE

    Aziz, Benjamin; Fremantle, Paul; Wei, Rui; Arenas, Alvaro

    2016-01-01

    Part 7: TPM and Internet of Things; International audience; The MQTT protocol has emerged over the past decade as a key protocol for a number of low power and lightweight communication scenarios including machine-to-machine and the Internet of Things. In this paper we develop a utility-based reputation model for MQTT, where we can assign a reputation score to participants in a network based on monitoring their behaviour. We mathematically define the reputation model using utility functions on...
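The record above describes assigning a reputation score to network participants by monitoring their behaviour through utility functions. As a rough illustration of that idea only (this is not the paper's actual MQTT formulation; the function names and the smoothing factor are invented), one simple realization is an exponentially weighted average of per-interaction utilities, so recent behaviour dominates:

```python
# Illustrative sketch of a utility-based reputation score (assumed design,
# not the paper's model): each observed interaction is mapped to a utility
# in [0, 1], and the reputation is an exponentially weighted running average.

def update_reputation(reputation, utility, alpha=0.3):
    """Blend a new interaction's utility into the running reputation."""
    return (1 - alpha) * reputation + alpha * utility

def reputation_from_history(utilities, alpha=0.3, initial=0.5):
    """Fold a whole interaction history into one score, starting neutral."""
    rep = initial
    for u in utilities:
        rep = update_reputation(rep, u, alpha)
    return rep

good_client = reputation_from_history([1.0, 1.0, 0.9, 1.0])
flaky_client = reputation_from_history([1.0, 0.2, 0.1, 0.0])
assert good_client > flaky_client
```

The smoothing factor `alpha` controls how quickly reputation reacts to new evidence; real reputation systems typically also weight utilities by interaction importance.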

  19. Estimation of the maintenance energy requirements, methane emissions and nitrogen utilization efficiency of two suckler cow genotypes.

    Science.gov (United States)

    Zou, C X; Lively, F O; Wylie, A R G; Yan, T

    2016-04-01

    Seventeen non-lactating dairy-bred suckler cows (LF; Limousin×Holstein-Friesian) and 17 non-lactating beef composite breed suckler cows (ST; Stabiliser) were used to study enteric methane emissions and energy and nitrogen (N) utilization from grass silage diets. Cows were housed in cubicle accommodation for 17 days, then moved to individual tie-stalls for an 8-day digestibility balance (including a 2-day adaptation), followed by immediate transfer to indirect, open-circuit respiration calorimeters for 3 days, with gaseous exchange recorded over the last two of these days. Grass silage was offered ad libitum once daily at 0900 h throughout the study. There were no significant differences (P>0.05) between the genotypes in energy intakes, energy outputs or energy use efficiency, in methane emission rates (methane emissions per unit of dry matter intake or energy intake), or in N metabolism characteristics (N intake or N output in faeces or urine). Accordingly, the data for both cow genotypes were pooled and used to develop relationships between inputs and outputs. Regression of energy retention against ME intake (r^2=0.52) gave net energy requirements for maintenance of 0.386, 0.392 and 0.375 MJ/kg^0.75 for LF+ST, LF and ST, respectively. Methane energy output was 0.066 of gross energy intake when the intercept was omitted from the linear equation (r^2=0.59). These relationships can be used to predict energy requirement, methane emission and manure N output for suckler cows; further information is required to evaluate their application in a wide range of suckler production systems.
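The regression step described above can be sketched numerically: with paired observations of ME intake and energy retention (the values below are invented for illustration, not the study's data), the maintenance requirement is the ME intake at which predicted energy retention is zero.

```python
# Sketch of the regression approach: regress energy retention (MJ/kg^0.75)
# on ME intake (MJ/kg^0.75), then solve the fitted line for zero retention.
# All numbers here are illustrative, not the study's measurements.
import numpy as np

me_intake = np.array([0.30, 0.45, 0.60, 0.75, 0.90])   # illustrative
retention = np.array([-0.05, 0.03, 0.11, 0.18, 0.27])  # illustrative

slope, intercept = np.polyfit(me_intake, retention, 1)
me_maintenance = -intercept / slope  # ME intake where retention = 0
print(round(me_maintenance, 3))  # prints 0.395
```

The study's reported maintenance values (0.375-0.392 MJ/kg^0.75) come from exactly this kind of fit on the pooled calorimetry data.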

  20. Economic principles and fundamental model of the sustainable utilization of ecological resources

    Institute of Scientific and Technical Information of China (English)

    Du Jinpei; Li Lin

    2006-01-01

    By analyzing the basic rules and measurement principles of the sustainable utilization of ecological resources and constructing a mathematical model of it, this paper argues that the sustainable utilization of ecological resources amounts, in essence, to applying the double-period model repeatedly for the effective dynamic allocation of ecological resources. It further points out that realizing the sustainable utilization of ecological resources requires following the basic principle of non-decreasing ecological capital, and puts forward corresponding standards, measures, policies and proposals.

  1. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    Energy Technology Data Exchange (ETDEWEB)

    Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cappers, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fadrhonc, Emily Martin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    Many regulators, utilities, customer groups, and other stakeholders are reevaluating existing regulatory models and the roles and financial implications for electric utilities in the context of today’s environment of increasing distributed energy resource (DER) penetrations, forecasts of significant T&D investment, and relatively flat or negative utility sales growth. When this is coupled with predictions about fewer grid-connected customers (i.e., customer defection), there is growing concern about the potential for serious negative impacts on the regulated utility business model. Among states engaged in these issues, the range of topics under consideration is broad. Most of these states are considering whether approaches that have been applied historically to mitigate the impacts of previous “disruptions” to the regulated utility business model (e.g., energy efficiency) as well as to align utility financial interests with increased adoption of such “disruptive technologies” (e.g., shareholder incentive mechanisms, lost revenue mechanisms) are appropriate and effective in the present context. A handful of states are presently considering more fundamental changes to regulatory models and the role of regulated utilities in the ownership, management, and operation of electric delivery systems (e.g., New York “Reforming the Energy Vision” proceeding).

  2. A TRANSFORMATION APPROACH FOR COLLABORATION BASED REQUIREMENT MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed Harbouche

    2012-02-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with this transformation process and suggests an approach to derive the behavior of the individual system components, in the form of distributed finite state machines, from the global system requirements, expressed in an augmented UML Activity Diagram notation. The suggested approach is summarized in three steps: the definition of the appropriate source meta-model (the requirements meta-model), the definition of the target design meta-model, and the definition of the rules that govern the transformation during the derivation process. The derivation process transforms the global system requirements, described as UML activity diagrams (extended with collaborations), into system role behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  3. A practical model for the train-set utilization: The case of Beijing-Tianjin passenger dedicated line in China.

    Science.gov (United States)

    Zhou, Yu; Zhou, Leishan; Wang, Yun; Li, Xiaomeng; Yang, Zhuo

    2017-01-01

    As a sustainable transportation mode, high-speed railway (HSR) has become an efficient way to meet huge travel demand. However, due to high acquisition and maintenance costs, it is impossible to build enough infrastructure and purchase enough train-sets, so great efforts are required to improve the transport capability of HSR. The utilization efficiency of train-sets (the carrying tools of HSR) is one of the most important factors in the transport capacity of HSR. In order to enhance the utilization efficiency of train-sets, this paper proposes a train-set circulation optimization model that minimizes the total connection time. An innovative two-stage approach, comprising segment generation and segment combination, was designed to solve this model. To verify the feasibility of the proposed approach, an experiment was carried out on the Beijing-Tianjin passenger dedicated line to fulfill a train diagram of 174 trips. The results showed that, compared with the traditional Ant Colony Algorithm (ACA), the utilization efficiency of train-sets can be increased from 43.4% (ACA) to 46.9% (two-stage), and one train-set can be saved while fulfilling the same transportation tasks. The approach proposed in this study is faster and more stable than traditional ones; using it, HSR staff can draw up train-set circulation plans more quickly, and the utilization efficiency of the HSR system is also improved.
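The core idea of train-set circulation — chaining trips so that one physical train-set covers several of them with short connection times — can be illustrated with a toy greedy matcher (this is a simplified sketch, not the paper's two-stage segment-generation/combination algorithm; the turnaround time is an assumed parameter):

```python
# Toy sketch of train-set circulation: chain each trip to an available
# train-set at the departure station, preferring the set that has waited
# longest; the number of chains equals the number of train-sets needed.
MIN_TURNAROUND = 20  # minutes; assumed minimum connection time

def count_train_sets(trips):
    """trips: list of (dep_station, dep_time, arr_station, arr_time)."""
    trips = sorted(trips, key=lambda t: t[1])  # by departure time
    chains = []  # each chain ends at (station, time it becomes ready)
    for dep_st, dep_t, arr_st, arr_t in trips:
        best = None
        for i, (st, ready) in enumerate(chains):
            if st == dep_st and ready <= dep_t:
                if best is None or ready < chains[best][1]:
                    best = i  # reuse the set with the longest wait
        if best is None:
            chains.append((arr_st, arr_t + MIN_TURNAROUND))  # new train-set
        else:
            chains[best] = (arr_st, arr_t + MIN_TURNAROUND)
    return len(chains)

# Two out-and-back runs between stations "B" and "T" share one train-set:
trips = [("B", 0, "T", 30), ("T", 60, "B", 90), ("B", 120, "T", 150)]
print(count_train_sets(trips))  # prints 1
```

Minimizing total connection time, as the paper's model does, is a harder optimization than this feasibility-only greedy, but the chaining structure is the same.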

  4. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  5. Effectiveness and Utility of a Case-Based Model for Delivering Engineering Ethics Professional Development Units

    Directory of Open Access Journals (Sweden)

    Heidi Ann Hahn

    2015-04-01

    This article describes an action research project conducted at Los Alamos National Laboratory (LANL) to resolve a problem with the ability of licensed and/or certified engineers to obtain the ethics-related professional development units or hours (PDUs or PDHs) needed to maintain their credentials. Because of the recurring requirement and the static nature of the information, an initial in-depth training followed by annually updated refresher training was proposed. A case model approach, with online delivery, was selected as the optimal pedagogical model for the refresher training. In the first two years, the only data collected were throughput and information retention. Response rates indicated that the approach was effective in helping licensed professional engineers obtain the needed PDUs. The rates of correct responses suggested that knowledge transfer regarding ethical reasoning had occurred in the initial training and had been retained in the refresher. In FY13, after completing the refresher, learners received a survey asking their opinion of the effectiveness and utility of the course, as well as their impressions of the case study format vs. the typical presentation format. Results indicate that the courses have been favorably received and that the case study method supports most of the pedagogical needs of adult learners as well as, if not better than, presentation-based instruction. Future plans for improvement are focused on identifying and evaluating methods for enriching online delivery of the engineering ethics cases.

  6. NRC review of Electric Power Research Institute's advanced light water reactor utility requirements document. Passive plant designs, chapters 2-13, project number 669

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The Electric Power Research Institute (EPRI) is preparing a compendium of technical requirements, referred to as the "Advanced Light Water Reactor [ALWR] Utility Requirements Document", that is applicable to the design of an ALWR power plant. When completed, this document is intended to be a comprehensive statement of utility requirements for the design, construction, and performance of an ALWR power plant for the 1990s and beyond. The Requirements Document consists of three volumes. Volume I, "ALWR Policy and Summary of Top-Tier Requirements", is a management-level synopsis of the Requirements Document, including the design objectives and philosophy, the overall physical configuration and features of a future nuclear plant design, and the steps necessary to take the proposed ALWR design criteria beyond the conceptual design stage to a completed, functioning power plant. Volume II consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant [approximately 1350 megawatts-electric (MWe)]. Volume III contains utility design requirements for nuclear plants whose designs will use passive features (approximately 600 MWe). In April 1992, the staff of the Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, issued Volume 1 and Volume 2 (Parts 1 and 2) of its safety evaluation report (SER) to document the results of its review of Volumes 1 and 2 of the Requirements Document. Volume 1, "NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program Summary", provided a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review.

  7. NRC review of Electric Power Research Institute's advanced light water reactor utility requirements document. Passive plant designs, chapter 1, project number 669

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The Electric Power Research Institute (EPRI) is preparing a compendium of technical requirements, referred to as the "Advanced Light Water Reactor [ALWR] Utility Requirements Document", that is applicable to the design of an ALWR power plant. When completed, this document is intended to be a comprehensive statement of utility requirements for the design, construction, and performance of an ALWR power plant for the 1990s and beyond. The Requirements Document consists of three volumes. Volume I, "ALWR Policy and Summary of Top-Tier Requirements", is a management-level synopsis of the Requirements Document, including the design objectives and philosophy, the overall physical configuration and features of a future nuclear plant design, and the steps necessary to take the proposed ALWR design criteria beyond the conceptual design stage to a completed, functioning power plant. Volume II consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant [approximately 1350 megawatts-electric (MWe)]. Volume III contains utility design requirements for nuclear plants whose designs will use passive features (approximately 600 MWe). In April 1992, the staff of the Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, issued Volume 1 and Volume 2 (Parts 1 and 2) of its safety evaluation report (SER) to document the results of its review of Volumes 1 and 2 of the Requirements Document. Volume 1, "NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program Summary", provided a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review.

  8. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    The requirements engineering process starts with the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the base building block of a software project and has a very high impact on the subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during elicitation is a communication-intensive activity involving both verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and this classification is used as a basis for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  9. Top-level modeling of an ALS system utilizing object-oriented techniques

    Science.gov (United States)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled by the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure used to develop the modeling tool, the vision of the modeling tool, and the current focus of each of the subsystem models.
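The modularity argument above — each subsystem as an independent object behind a common interface — can be sketched in a few lines (class names follow the subsystems listed in the abstract, but the interface, rates, and logic are invented for illustration; the actual model was written in Java):

```python
# Hypothetical sketch of the object-oriented, modular ALS structure:
# every subsystem implements the same step interface, so individual
# subsystem models can start simplistic and be refined independently.
class Subsystem:
    def step(self, resources):
        raise NotImplementedError

class Crew(Subsystem):
    def step(self, resources):
        resources["food"] -= 3.0                               # kg/day, assumed
        resources["waste"] = resources.get("waste", 0.0) + 2.5  # kg/day, assumed
        return resources

class BiomassProduction(Subsystem):
    def step(self, resources):
        resources["food"] = resources.get("food", 0.0) + 4.0   # kg/day, assumed
        return resources

def simulate(subsystems, resources, days):
    """Advance the top-level model one day at a time."""
    for _ in range(days):
        for s in subsystems:
            resources = s.step(resources)
    return resources

state = simulate([BiomassProduction(), Crew()], {"food": 10.0}, days=7)
```

Swapping in a more detailed `BiomassProduction` model would not touch `Crew` or the simulation loop, which is precisely the modularity benefit the abstract describes.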

  10. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions...... interface composed of recognizable artifacts and activities. The presentation of the three publications related to Use Cases is followed by the presentation of a publication formalizing some of the guidelines applied for structuring the CPN requirements models, namely the guidelines that make it possible...... activity. The traces are automatically recorded during execution of the model. The second publication presents a formally specified framework for automating a large part of the tasks related to integrating Problem Frames with CPN. The framework is specified in VDM++, and allows the modeler to automatically...

  11. NRC review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program summary, Project No. 669

    Energy Technology Data Exchange (ETDEWEB)

    1992-08-01

    The staff of the US Nuclear Regulatory Commission has prepared Volume 1 of a safety evaluation report (SER), "NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document -- Program Summary", to document the results of its review of the Electric Power Research Institute's "Advanced Light Water Reactor Utility Requirements Document". This SER provides a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review.

  12. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  13. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    material name (for example, an HY80 steel) plus additional material requirements (heat treatment, etc.), and creation of a more detailed description of the data. (Figure: Typical Stress-Strain Curve for Steel.) The typical structural materials are steel, aluminum and composites; the structural components that make up a global FEA model drive the fidelity of the model.

  14. An Analysis/Synthesis System of Audio Signal with Utilization of an SN Model

    Directory of Open Access Journals (Sweden)

    G. Rozinaj

    2004-12-01

    An SN (sinusoids plus noise) model is a spectral model in which the periodic components of the sound are represented by sinusoids with time-varying frequencies, amplitudes and phases, while the remaining non-periodic components are represented by filtered noise. The sinusoidal model exploits physical properties of musical instruments, and the noise model exploits the human inability to perceive the exact spectral shape or the phase of stochastic signals. SN modeling can be applied to compression, transformation, separation of sounds, etc. The designed system is based on methods used in SN modeling. We have proposed a model that achieves good results in audio perception. Although many systems do not save the phases of the sinusoids, phases are important for better modeling of transients, for the computation of the residual and, last but not least, for stereo signals. One of the fundamental properties of the proposed system is its ability to reconstruct the signal not only in amplitude but also in phase.
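The SN decomposition described above is easy to illustrate on the synthesis side (a minimal sketch, not the paper's system: a crude moving-average filter stands in for the noise-shaping filter, and all parameter values are invented):

```python
# Toy resynthesis in the spirit of the SN model: the deterministic part is
# a sum of sinusoids with explicit amplitudes, frequencies and phases (so
# phase information is preserved), plus filtered noise for the residual.
import math, random

def sn_synthesize(partials, noise_gain, n, sr=8000, smooth=8):
    """partials: list of (amplitude, frequency_hz, phase_rad) triples."""
    random.seed(0)
    noise = [random.uniform(-1, 1) for _ in range(n + smooth)]
    out = []
    for i in range(n):
        t = i / sr
        s = sum(a * math.sin(2 * math.pi * f * t + ph)
                for a, f, ph in partials)
        # smoothed noise models the non-periodic residual component
        s += noise_gain * sum(noise[i:i + smooth]) / smooth
        out.append(s)
    return out

# one 440 Hz partial with explicit phase, plus a weak noise floor
y = sn_synthesize([(1.0, 440.0, 0.0)], noise_gain=0.05, n=256)
```

Because the phase of each partial is an explicit parameter, transients and stereo relationships survive resynthesis — the property the abstract emphasizes.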

  15. Requirements for next-generation global flood inundation models

    Science.gov (United States)

    Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.

    2016-12-01

    In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.

  16. A Buck-Boost Converter Modified to Utilize 600V GaN Power Devices in a PV Application Requiring 1200V Devices

    Directory of Open Access Journals (Sweden)

    SRDIC, S.

    2015-08-01

    This paper presents a buck-boost converter which is modified to utilize new 600 V gallium nitride (GaN) power semiconductor devices in an application requiring 1200 V devices. The presented buck-boost converter is used as part of a dc/dc stage in an all-GaN photovoltaic (PV) inverter, and it provides a negative voltage for the 3-level neutral-point-clamped (NPC) PWM inverter which is connected to the utility grid. Since in this application the transistor and the diode of the buck-boost converter need to block the sum of the PV string voltage (normally in the range of 150 to 350 V) and the dc bus voltage (on the order of 400 V), 1200 V devices or a series connection of 600 V devices must be employed. Currently, 1200 V GaN power semiconductor devices are not commercially available. Therefore, the standard buck-boost converter is modified to enable the use of 600 V GaN devices in this particular application. Based on the proposed converter topology, a PSpice simulation model and a 600 W converter prototype were developed. Both simulation and experimental results show successful operation of the converter.
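The voltage-stress argument above is simple arithmetic worth making explicit: in a standard inverting buck-boost, the switch and diode must block roughly the input voltage plus the output voltage magnitude. A quick check (the derating margin is an assumed, typical value, not from the paper):

```python
# Back-of-the-envelope check of why 600 V devices alone are insufficient:
# in an inverting buck-boost, off-state device stress is about V_in + |V_out|.
def blocking_voltage(v_pv, v_bus, margin=1.25):
    """Required device rating; the 25% derating margin is an assumption."""
    return (v_pv + v_bus) * margin

worst_case = blocking_voltage(v_pv=350.0, v_bus=400.0)
print(worst_case)  # prints 937.5 -> well above a single 600 V device rating
```

Hence the paper's choices: either 1200 V parts (not yet available in GaN) or a modified topology that splits the stress across 600 V GaN devices.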

  17. Using Analytical and Numerical Modeling to Assess the Utility of Groundwater Monitoring Parameters at Carbon Capture, Utilization, and Storage Sites

    Science.gov (United States)

    Porse, S. L.; Hovorka, S. D.; Young, M.; Zeidouni, M.

    2012-12-01

    Carbon capture, utilization, and storage (CCUS) is becoming an important bridge to commercial geologic sequestration (GS) to help reduce anthropogenic CO2 emissions. While CCUS at brownfield sites (i.e., mature oil and gas fields) has operational advantages over GS at greenfield sites (i.e., saline formations), such as the use of existing well infrastructure, previous site activities can add a layer of complexity that must be accounted for when developing groundwater-protection monitoring networks. Extensive work has been done on developing monitoring networks at GS sites for CO2 accounting and groundwater protection; however, the development of appropriate monitoring strategies at commercial brownfield sites continues to evolve. The goal of this research is to address the added monitoring complexity by adapting simple analytical and numerical models to test these approaches using two common subsurface monitoring parameters, pressure and aqueous geochemistry. The analytical pressure model solves for diffusivity in radial coordinates and for the leakage rate derived from Darcy's law. The aqueous geochemical calculation program PHREEQC solves the advection-reaction-dispersion equation for 1-D transport and mixing of fluids. The research was conducted at a CO2 enhanced oil recovery (EOR) field on the Gulf Coast of Texas. We modeled the performance over time of one monitoring well from the EOR field using physical and operational data, including lithology, water chemistry samples, and formation pressure data. We explored, through statistical analyses, the probability of leakage detection using the analytical and numerical methods by varying the monitoring well location spatially and vertically with respect to a leaky fault. Preliminary results indicate that a pressure-based subsurface monitoring system provides a better probability of leakage detection than geochemistry alone, but together these monitoring parameters can improve the chances of leakage detection.
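The Darcy's-law leakage estimate mentioned above reduces to a one-line formula; here is a sketch with purely illustrative parameter values (none of these are the site's actual properties):

```python
# Sketch of a Darcy-law leakage-rate estimate: volumetric flow through a
# leaky pathway driven by the pressure difference across it, Q = kA*dP/(mu*L).
def darcy_leakage_rate(k, area, dp, mu, length):
    """Return Q in m^3/s; all arguments in SI units."""
    return k * area * dp / (mu * length)

q = darcy_leakage_rate(k=1e-13,      # pathway permeability, m^2 (assumed)
                       area=10.0,    # leakage cross-section, m^2 (assumed)
                       dp=2.0e5,     # pressure difference, Pa (assumed)
                       mu=1.0e-3,    # brine viscosity, Pa*s (assumed)
                       length=50.0)  # leakage path length, m (assumed)
```

In a monitoring-design study like the one above, such estimates feed the pressure model: a larger predicted Q produces a larger, earlier pressure anomaly at candidate monitoring-well locations.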

  18. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Protein supply and requirements of ruminants have been studied for more than a century. These studies have accumulated a large body of scientific information about the digestion and metabolism of protein by ruminants, and have characterized dietary protein so as to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most widely known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations and in the modeling nature (e.g., static or dynamic, mechanistic or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two protein fractions (the rumen-undegraded dietary CP, RUP, and the contribution of microbial CP, MCP) as the main sources of MP for the animal. Some models explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System, CNCPS, and the Dutch DVE/OEB system), while others simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models include milk yield and its components in estimating the MP required for lactation.

  19. Stock Selection for Portfolios Using Expected Utility-Entropy Decision Model

    Directory of Open Access Journals (Sweden)

    Jiping Yang

    2017-09-01

    Yang and Qiu proposed, and recently improved, an expected utility-entropy (EU-E) measure of risk and decision model. When segregation holds, Luce et al. derived an expected utility term plus a constant times the Shannon entropy as the representation of risky choices, further demonstrating the reasonableness of the EU-E decision model. In this paper, we apply the EU-E decision model to selecting the set of stocks to be included in portfolios. We first select 7 and 10 stocks from the 30 component stocks of the Dow Jones Industrial Average index, and then derive and compare the efficient portfolios in the mean-variance framework. The conclusions imply that efficient portfolios composed of the 7 (10) stocks selected using the EU-E model with intermediate intervals of the tradeoff coefficient are more efficient than those composed of the sets of stocks selected using the expected utility model. Furthermore, the efficient portfolios of the 7 (10) stocks selected by the EU-E decision model have almost the same efficient frontier as that of the sample of all stocks. This suggests the necessity of incorporating both expected utility and Shannon entropy when making risky decisions, further demonstrating the importance of Shannon entropy as a measure of uncertainty, as well as the applicability of the EU-E model as a decision-making model.
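The combination of expected utility and Shannon entropy can be sketched with a simplified score (this is only in the spirit of the EU-E model; the normalization in Yang and Qiu's actual measure differs, and the linear utility and tradeoff value below are assumptions):

```python
# Simplified EU-E-style score: trade off expected utility of outcomes
# against the Shannon entropy of the outcome distribution, with a
# tradeoff coefficient lam in [0, 1]. Illustrative only.
import math

def eu_e_score(outcomes, probs, utility=lambda x: x, lam=0.5):
    eu = sum(p * utility(x) for x, p in zip(outcomes, probs))
    h = -sum(p * math.log(p) for p in probs if p > 0)  # Shannon entropy
    return lam * eu - (1 - lam) * h

# same expected return, but the spread-out stock carries more uncertainty
steady = eu_e_score([0.015], [1.0])
spread = eu_e_score([-0.10, 0.13], [0.5, 0.5])
assert steady > spread
```

With equal expected utility, the entropy term penalizes the more uncertain alternative, which is the intuition the abstract attributes to including Shannon entropy alongside expected utility.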

  20. National Utility Financial Statement model (NUFS). Volume III of III: software description. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This volume contains a description of the software comprising the National Utility Financial Statement Model (NUFS). This is the third of three volumes describing NUFS provided by ICF Incorporated under contract DEAC-01-79EI-10579. The three volumes are entitled: model overview and description, user's guide, and software guide.

  1. Modeling requirements for in situ vitrification. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  2. Required experimental accuracy to select between supersymmetrical models

    Indian Academy of Sciences (India)

    David Grellscheid

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass-space. Based on the geometrical configuration of these regions in mass-space, it is possible to obtain an estimate of the required accuracy of future sparticle mass measurements to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).

  3. Complications of rhBMP-2 utilization for posterolateral lumbar fusions requiring reoperation: a single practice, retrospective case series report.

    Science.gov (United States)

    Hoffmann, Martin F; Jones, Clifford B; Sietsema, Debra L

    2013-10-01

    Recombinant human bone morphogenetic protein-2 (rhBMP-2) (INFUSE, Medtronic, Memphis, TN, USA) has been used off-label for posterolateral lumbar fusions for many years. The goal of this study was to evaluate the complications requiring reoperation associated with rhBMP-2 application for posterolateral lumbar fusions. During a 7-year period of time (2002-2009), all patients undergoing lumbar posterolateral fusion using rhBMP-2 (INFUSE) were retrospectively evaluated within a large orthopedic surgery private practice. A total of 1,158 consecutive patients were evaluated with 468 (40.4%) males and 690 (59.6%) females. Complications related to rhBMP were defined as reoperation secondary to symptomatic failed fusion (nonunion), symptomatic seroma formation, symptomatic reformation of foraminal bone, and infection. Inclusion criteria were posterolateral fusion with rhBMP-2 implant and age equal to or older than 18 years. Surgical indications and treatment were performed in accordance with the surgeon's best knowledge, discretion, and experience. Patients consented to lumbar decompression and arthrodesis using rhBMP-2. All patients were educated and informed of the off-label utilization of rhBMP-2. Patient follow-up was performed at regular intervals of 2 weeks, 6 weeks, 12 weeks, 6 months, 1 year, and later if required or indicated. Average age was 59.2 years, and body mass index was 30.7 kg/m². Numbers of levels fused were 1 (414, 35.8%), 2 (469, 40.5%), 3 (162, 14.0%), 4 (70, 6.0%), 5 (19, 1.6%), 6 (11, 0.9%), 7 (7, 0.6%), 8 (4, 0.3%), and 9 (2, 0.2%). Patients having complications requiring reoperation were 117 of 1,158 (10.1%): symptomatic nonunion requiring redo fusion and instrumentation 41 (3.5%), seroma with acute neural compression 32 (2.8%), excess bone formation with delayed neural compression 4 (0.3%), and infection requiring debridement 26 (2.2%). Nonunion was related to male sex and previous BMP exposure. Seroma formation was significantly higher in

  4. Thermodynamic models for bounding pressurant mass requirements of cryogenic tanks

    Science.gov (United States)

    Vandresar, Neil T.; Haberbusch, Mark S.

    1994-01-01

    Thermodynamic models have been formulated to predict lower and upper bounds for the mass of pressurant gas required to pressurize a cryogenic tank and then expel liquid from the tank. Limiting conditions are based on either thermal equilibrium or zero energy exchange between the pressurant gas and initial tank contents. The models are independent of gravity level and allow specification of autogenous or non-condensable pressurants. Partial liquid fill levels may be specified for initial and final conditions. Model predictions are shown to successfully bound results from limited normal-gravity tests with condensable and non-condensable pressurant gases. Representative maximum collapse factor maps are presented for liquid hydrogen to show the effects of initial and final fill level on the range of pressurant gas requirements. Maximum collapse factors occur for partial expulsions with large final liquid fill fractions.

  5. Model Waveform Accuracy Requirements for the $\\chi^2$ Discriminator

    CERN Document Server

    Lindblom, Lee

    2016-01-01

    This paper derives accuracy standards for model gravitational waveforms required to ensure proper use of the $\chi^2$ discriminator test in gravitational wave (GW) data analysis. These standards are different from previously established requirements for detection and waveform parameter measurement based on signal-to-noise optimization. We present convenient formulae both for evaluating and interpreting the contribution of model errors to measured $\chi^2$ values. Motivated by these formulae, we also present an enhanced, complexified variant of the standard $\chi^2$ statistic used in GW searches. While our results are not directly relevant to current searches (which use the $\chi^2$ test only to veto signal candidates with extremely high $\chi^2$ values), they could be useful in future GW searches and as figures of merit for model gravitational waveforms.

  6. A commuting generation model requiring only aggregated data

    CERN Document Server

    Lenormand, Maxime; Gargiulo, Floriana

    2011-01-01

    We recently proposed, in (Gargiulo et al., 2011), an innovative stochastic model with only one parameter to calibrate. It reproduces the complete commuting network by an iterative process that stochastically chooses, for each commuter living in a municipality of a region, a workplace in the region. The choice is made considering the job offer in each municipality of the region and the distance to all the possible destinations. The model is quite effective if the region is sufficiently autonomous in terms of job offers. However, calibrating the model, or verifying this autonomy, requires data or expertise which are not necessarily available; moreover, the region may not be autonomous. In the present work, we overcome these limitations by extending the commuters' job-search geographical base to the outside of the region and by changing the form of the deterrence function. We also found a law to calibrate the improved model which does not require data.
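The core choice step, picking a workplace with probability proportional to the destination's job offer discounted by distance, can be sketched as follows (the exponential deterrence form and the `beta` parameter are illustrative, not the calibrated form from the paper):

```python
import math
import random

def choice_probabilities(jobs, distances, beta):
    # Attractiveness of each candidate municipality: job offer times an
    # exponential distance-deterrence term (illustrative functional form).
    w = [j * math.exp(-beta * d) for j, d in zip(jobs, distances)]
    total = sum(w)
    return [x / total for x in w]

def draw_workplace(jobs, distances, beta, rng=random):
    # Stochastically assign one commuter a workplace index.
    probs = choice_probabilities(jobs, distances, beta)
    return rng.choices(range(len(jobs)), weights=probs)[0]
```

Iterating `draw_workplace` over every commuter in the region reproduces a full commuting network; the single parameter `beta` controls how strongly distance deters long commutes.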

  7. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables, and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable, and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.

  8. Mathematical Modeling of Programmatic Requirements for Yaws Eradication

    Science.gov (United States)

    Mitjà, Oriol; Fitzpatrick, Christopher; Asiedu, Kingsley; Solomon, Anthony W.; Mabey, David C.W.; Funk, Sebastian

    2017-01-01

    Yaws is targeted for eradication by 2020. The mainstay of the eradication strategy is mass treatment followed by case finding. Modeling has been used to inform programmatic requirements for other neglected tropical diseases and could provide insights into yaws eradication. We developed a model of yaws transmission varying the coverage and number of rounds of treatment. The estimated number of cases arising from an index case (basic reproduction number [R0]) ranged from 1.08 to 3.32. To have 80% probability of achieving eradication, 8 rounds of treatment with 80% coverage were required at low estimates of R0 (1.45). This requirement increased to 95% at high estimates of R0 (2.47). Extending the treatment interval to 12 months increased requirements at all estimates of R0. At high estimates of R0 with 12 monthly rounds of treatment, no combination of variables achieved eradication. Models should be used to guide the scale-up of yaws eradication. PMID:27983500
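A toy recursion illustrates why coverage and the number of treatment rounds interact with R0 (this is an invented caricature, not the authors' transmission model):

```python
def prevalence_after_rounds(p0, r0, coverage, rounds):
    # Toy dynamics (an assumption, not the paper's model): each round of
    # mass treatment cures the covered fraction of cases, and between
    # rounds each remaining case gives rise to r0 cases on average,
    # capped at 100% prevalence.
    p = p0
    for _ in range(rounds):
        p *= (1.0 - coverage)   # mass treatment round
        p = min(1.0, p * r0)    # transmission until the next round
    return p
```

Elimination succeeds only while the per-round factor (1 - coverage) * R0 stays below 1, which mirrors the finding that higher R0 estimates demand higher coverage.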

  9. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems, using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors' aim of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio-technical systems. Goals kept at a high level of abstraction, ambiguous and open for conversations through the modelling process, add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design.

  10. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  11. Experienced Practitioners’ Beliefs Utilized to Create a Successful Massage Therapist Conceptual Model: a Qualitative Investigation

    Science.gov (United States)

    Kennedy, Anne B.; Munk, Niki

    2017-01-01

    Background The massage therapy profession in the United States has grown exponentially, with 35% of the profession’s practitioners in practice for three years or less. Investigating personal and social factors with regard to the massage therapy profession could help to identify constructs needed to be successful in the field. Purpose This data-gathering exercise explores massage therapists’ perceptions of what makes a successful massage therapist, which will provide guidance for future research. Success is defined as supporting oneself and one’s practice solely through massage therapy and related, revenue-generating field activity. Participants and Setting Ten successful massage therapy practitioners from around the United States who have a minimum of five years of experience. Research Design Semistructured qualitative interviews were used in an analytic induction framework; index cards with preidentified concepts printed on them were utilized to enhance conversation. An iterative process of interview coding and analysis was used to determine themes and subthemes. Results Based on the participants’ input, the categories in which therapists needed to be successful were organized into four main themes: effectively establish therapeutic relationships, develop massage therapy business acumen, seek valuable learning environments and opportunities, and cultivate strong social ties and networks. The four themes operate within specific contexts (e.g., regulation and licensing requirements in the therapists’ state), which may also influence the success of the massage therapist. Conclusions The model needs to be tested to explore which constructs explain variability in success and attrition rate. Limitations and future research implications are discussed. PMID:28690704

  12. Utilizing Gaze Behavior for Inferring Task Transitions Using Abstract Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Daniel Fernando Tello Gamarra

    2016-12-01

    Full Text Available We demonstrate an improved method for utilizing observed gaze behavior and show that it is useful in inferring hand movement intent during goal directed tasks. The task dynamics and the relationship between hand and gaze behavior are learned using an Abstract Hidden Markov Model (AHMM. We show that the predicted hand movement transitions occur consistently earlier in AHMM models with gaze than those models that do not include gaze observations.
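The underlying idea, that fusing a gaze stream into HMM filtering flags task transitions earlier, can be illustrated with a plain two-state forward filter (this is not the abstract-HMM machinery of the paper; all matrices and likelihoods below are invented):

```python
def forward_step(prior, transition, likelihood):
    # One step of HMM filtering: predict through the transition matrix,
    # weight by the observation likelihood, then renormalize.
    n = len(prior)
    pred = [sum(prior[i] * transition[i][j] for i in range(n)) for j in range(n)]
    post = [pred[j] * likelihood[j] for j in range(n)]
    z = sum(post)
    return [x / z for x in post]

# Two hidden task states ("reach", "place") with made-up dynamics.
T = [[0.9, 0.1],
     [0.1, 0.9]]
prior = [0.9, 0.1]       # current belief: the task is still "reach"

hand_lik = [0.5, 0.5]    # hand motion alone is ambiguous between states
gaze_lik = [0.2, 0.8]    # gaze already fixates the "place" target

p_hand = forward_step(prior, T, hand_lik)
p_both = forward_step(prior, T, [h * g for h, g in zip(hand_lik, gaze_lik)])
```

Because gaze typically leads the hand, conditioning on both streams (`p_both`) shifts the posterior toward the upcoming task state sooner than hand observations alone (`p_hand`).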

  13. A Model for Forecasting Enlisted Student IA Billet Requirements

    Science.gov (United States)

    2016-03-01

    Steven W. Belcher, with David L. Reese and Kletus S. Lawler. March 2016. Copyright © 2016 CNA; cleared for public release. This document contains the best opinion of CNA at the time of issue. Student execution in the model depends on time to train (TTT), which includes under-instruction (UI) time, and accounts for students who were promised training and had at least one course failure.

  14. The transparency, reliability and utility of tropical rainforest land-use and land-cover change models.

    Science.gov (United States)

    Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M

    2014-06-01

    Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models in the tropics and the modelling approaches and paradigms in other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers may emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and impact the development of land-use policies based on plausible future scenarios, but, to do that reliably may require further improvements in the

  15. Estimating health state utility values from discrete choice experiments--a QALY space model approach.

    Science.gov (United States)

    Gu, Yuanyuan; Norman, Richard; Viney, Rosalie

    2014-09-01

    Using discrete choice experiments (DCEs) to estimate health state utility values has become an important alternative to the conventional methods of Time Trade-Off and Standard Gamble. Studies using DCEs have typically used the conditional logit to estimate the underlying utility function. The conditional logit is known for several limitations. In this paper, we propose two types of models based on the mixed logit: one using preference space and the other using quality-adjusted life year (QALY) space, a concept adapted from the willingness-to-pay literature. These methods are applied to a dataset collected using the EQ-5D. The results showcase the advantages of using QALY space and demonstrate that the preferred QALY space model provides lower estimates of the utility values than the conditional logit, with the divergence increasing with worsening health states. Copyright © 2014 John Wiley & Sons, Ltd.
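The flavor of the QALY-space idea can be sketched by expressing each health-state decrement directly in QALY units, i.e. rescaled by the marginal utility of a life-year (a post hoc illustration only; the paper estimates mixed logit models jointly in QALY space, and the coefficient names and values below are invented):

```python
def qaly_decrements(level_coefs, duration_coef):
    # Rescale preference-space coefficients into QALY units by dividing by
    # the coefficient on survival duration (a sketch of the idea only).
    return {name: -b / duration_coef for name, b in level_coefs.items()}

def state_utility(decrements, state_levels):
    # Additive EQ-5D-style valuation: full health (1.0) minus the
    # decrements of the problem levels present in the state.
    return 1.0 - sum(decrements[lvl] for lvl in state_levels)
```

A worked example: with hypothetical coefficients of -0.4 (severe mobility problems) and -0.6 (severe pain) and a duration coefficient of 2.0, the decrements are 0.2 and 0.3, giving a state utility of 0.5.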

  16. Comprehensive benefit of flood resources utilization through dynamic successive fuzzy evaluation model: A case study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Taking the flood resources utilization in Baicheng, Jilin during 2002–2007 as the research background, and based on the entropy weight and multi-level & multi-objective fuzzy optimization theory, this research established a multi-level & semi-constructive index system and dynamic successive evaluation model for comprehensive benefit evaluation of regional flood resources utilization. With the year 2002 as the base year, the analyzing results showed that there existed a close positive correlation between flood utilization volume and its benefits, comprehensive evaluation value and its comparison increment. Within the six successive evaluation years, the comprehensive benefit of 2003 was the best, in which the benefit evaluation increment reached 82.8% whereas the year of 2004 was the worst, in which the increment was only 18.2%. Thus the sustainability and correctness of the evaluation were verified by six years successive evaluation and increment comparison. The analyzing results showed that the economic benefits, ecological benefits and social benefits of flood utilization were remarkable, and that the comprehensive benefit could be improved by increasing flood utilization capacity, which would promote the regional sustainable development as well. The established dynamic successive evaluation provides a stable theoretical basis and technical support for further flood utilization.
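The entropy-weight step used in such evaluation models is standard: normalize each indicator column, compute its Shannon entropy, and weight indicators by how much they discriminate among alternatives. A minimal sketch with made-up indicator values:

```python
import math

def entropy_weights(matrix):
    # matrix[i][j]: value of indicator j for alternative i (all positive).
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    weights = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        s = sum(col)
        p = [v / s for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        weights.append(1.0 - e)                                # divergence degree
    total = sum(weights)
    return [w / total for w in weights]
```

An indicator that takes the same value for every alternative carries no information and receives (near-)zero weight; the more an indicator varies across years or alternatives, the larger its weight in the comprehensive benefit score.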

  17. Clinical inferences and decisions--III. Utility assessment and the Bayesian decision model.

    Science.gov (United States)

    Aspinall, P A; Hill, A R

    1984-01-01

    It is accepted that errors of misclassification, however small, can occur in clinical decisions, but it cannot be assumed that the importance associated with false positive errors is the same as that for false negatives. The relative importance of these two types of error is frequently implied by a decision maker in the different weighting factors or utilities he assigns to the alternative consequences of his decisions. Formal procedures are available by which it is possible to make explicit in numerical form the value or worth of the outcome of a decision process. The two principal methods for generating utilities associated with clinical decisions are described. The concept and application of utility is then expanded from a unidimensional to a multidimensional problem where, for example, one variable may be state of health and another monetary assets. When combined with the principles of subjective probability and test criterion selection outlined in Parts I and II of this series, the consequent use of utilities completes the framework upon which the general Bayesian model of clinical decision making is based. The five main stages in this general decision making model are described, and applications of the model are illustrated with clinical examples from the field of ophthalmology. These include examples for unidimensional and multidimensional problems which are worked through in detail to illustrate both the principles and methodology involved in a rationalized normative model of clinical decision making behaviour.
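The normative core of the Bayesian model, choosing the action with the greatest probability-weighted utility, can be sketched for a two-action clinical decision (the utility values and action names below are invented for illustration):

```python
def expected_utility(action, p_disease, utilities):
    # utilities[(action, state)] encodes that a false negative need not
    # cost the same as a false positive.
    return (p_disease * utilities[(action, "disease")]
            + (1 - p_disease) * utilities[(action, "healthy")])

def best_action(p_disease, utilities, actions=("treat", "wait")):
    # Bayesian decision rule: pick the action with maximum expected utility.
    return max(actions, key=lambda a: expected_utility(a, p_disease, utilities))

# Invented utilities on a 0-1 scale: (action, true state) -> outcome worth.
U = {("treat", "disease"): 0.9, ("treat", "healthy"): 0.7,
     ("wait", "disease"): 0.1, ("wait", "healthy"): 1.0}
```

With these numbers the rule switches from "wait" to "treat" once the disease probability exceeds about 0.27, the threshold implied by the relative utilities of the two error types.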

  18. A Family Therapy Model For Preserving Independence in Older Persons: Utilization of the Family of Procreation.

    Science.gov (United States)

    Quinn, William H.; Keller, James F.

    1981-01-01

    Presents a family therapy model that utilizes the Bowen theory systems framework. The framework is adapted to the family of procreation, which takes on increased importance in the lives of the elderly. Family therapy with the aged can create more satisfying intergenerational relationships and preserve independence. (Author)

  19. Utilizing the PREPaRE Model When Multiple Classrooms Witness a Traumatic Event

    Science.gov (United States)

    Bernard, Lisa J.; Rittle, Carrie; Roberts, Kathy

    2011-01-01

    This article presents an account of how the Charleston County School District responded to an event by utilizing the PREPaRE model (Brock, et al., 2009). The acronym, PREPaRE, refers to a range of crisis response activities: P (prevent and prepare for psychological trauma), R (reaffirm physical health and perceptions of security and safety), E…

  20. Modeling Water Utility Investments and Improving Regulatory Policies using Economic Optimisation in England and Wales

    Science.gov (United States)

    Padula, S.; Harou, J. J.

    2012-12-01

    Water utilities in England and Wales are regulated natural monopolies called 'water companies'. Water companies must obtain periodic regulatory approval for all investments (new supply infrastructure or demand management measures). Both water companies and their regulators use results from least economic cost capacity expansion optimisation models to develop or assess water supply investment plans. This presentation first describes the formulation of a flexible supply-demand planning capacity expansion model for water system planning. The model uses a mixed integer linear programming (MILP) formulation to choose the least-cost schedule of future supply schemes (reservoirs, desalination plants, etc.), demand management (DM) measures (leakage reduction, water efficiency and metering options) and bulk transfers. Decisions include what schemes to implement, when to do so, how to size schemes and how much to use each scheme during each year of an n-year long planning horizon (typically 30 years). In addition to capital and operating (fixed and variable) costs, the estimated social and environmental costs of schemes are considered. Each proposed scheme is costed discretely at one or more capacities following regulatory guidelines. The model uses a node-link network structure: water demand nodes are connected to supply and demand management (DM) options (represented as nodes) or to other demand nodes (transfers). Yields from existing and proposed schemes are estimated separately using detailed water resource system simulation models evaluated over the historical period. The model simultaneously considers multiple demand scenarios to ensure demands are met at required reliability levels; use levels of each scheme are evaluated for each demand scenario and weighted by scenario likelihood so that operating costs are accurately evaluated. Multiple interdependency relationships between schemes (pre-requisites, mutual exclusivity, start dates, etc.) can be accounted for by
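The scheme-selection core of such a capacity expansion model can be illustrated with a tiny brute-force stand-in for the MILP (the scheme names, capacities, and costs below are invented; a real planning model optimizes scheme choice, sizing, and timing over a multi-decade horizon with a solver):

```python
from itertools import combinations

def least_cost_plan(schemes, demand_gap):
    """schemes: name -> (capacity, cost). Return (chosen set, total cost) of
    the cheapest subset whose combined capacity covers the supply-demand gap.
    Brute force over subsets; a real model solves this as a MILP over time."""
    best = None
    names = list(schemes)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cap = sum(schemes[n][0] for n in combo)
            cost = sum(schemes[n][1] for n in combo)
            if cap >= demand_gap and (best is None or cost < best[1]):
                best = (set(combo), cost)
    return best

# Invented schemes: capacity (Ml/d) and lifetime cost (arbitrary units).
schemes = {"reservoir": (50, 100), "desal": (30, 90),
           "leakage": (20, 30), "metering": (15, 25)}
```

For a 30 Ml/d gap this example prefers the two cheap demand-management measures over either large supply scheme, the kind of trade-off the full model evaluates across every year, scenario, and interdependency constraint.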

  1. Tools required for efficient management of municipal utilities in the future heat and power market; Zukunftsfaehiges Management von Stadtwerken braucht Instrumente

    Energy Technology Data Exchange (ETDEWEB)

    Estermann, Andre S. [Strategieprojekt stadtwerke-monitor.de, Berlin (Germany)

    2010-10-15

    The key aspect of organizational management is the definition and implementation of goals for the future. This requires continuous reflection of the organization's position at a given moment and of the further strategy required to achieve the targeted goals. Small and medium-sized municipal utilities as a rule are reluctant to take this strategy and promote innovations. The author presents a new online platform that was developed specifically for municipal utilities, offering them the option to find new ways of business management. In the energy markets of the future, participation of municipal utilities will thus no longer be a matter of 'if' but a matter of 'how'. (orig.)

  2. Implications of Model Structure and Detail for Utility Planning: Scenario Case Studies Using the Resource Planning Model

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barrows, Clayton [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, Elaine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-04-01

    In this report, we analyze the impacts of model configuration and detail in capacity expansion models, computational tools used by utility planners looking to find the least cost option for planning the system and by researchers or policy makers attempting to understand the effects of various policy implementations. The present analysis focuses on the importance of model configurations — particularly those related to capacity credit, dispatch modeling, and transmission modeling — to the construction of scenario futures. Our analysis is primarily directed toward advanced tools used for utility planning and is focused on those impacts that are most relevant to decisions with respect to future renewable capacity deployment. To serve this purpose, we develop and employ the NREL Resource Planning Model to conduct a case study analysis that explores 12 separate capacity expansion scenarios of the Western Interconnection through 2030.

  3. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    Science.gov (United States)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data comes from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms-SNPs), and mammographic features in Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for three models. After we assigned utility values for each category of findings (true negative, false positive, false negative and true positive), we pursued optimal operating points on ROC curves to achieve maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under ROC curve (AUC) and MEU. SNPs improved sensitivity of the Gail model (0.276 vs. 0.147) and reduced specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel framework to evaluate prediction models in the realm of radiogenomics.
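Locating the maximum-expected-utility (MEU) operating point on an ROC curve reduces to maximizing expected utility over (sensitivity, specificity) pairs, weighted by disease prevalence. A hedged sketch (the outcome utilities and ROC points below are invented, not those of the study):

```python
def expected_utility(sens, spec, prevalence, u):
    # u: utilities assigned to the four outcomes of a diagnostic call
    # (true positive, false negative, true negative, false positive).
    return (prevalence * (sens * u["TP"] + (1 - sens) * u["FN"])
            + (1 - prevalence) * (spec * u["TN"] + (1 - spec) * u["FP"]))

def max_expected_utility(roc_points, prevalence, u):
    # roc_points: (sensitivity, specificity) pairs along the ROC curve.
    return max(roc_points, key=lambda p: expected_utility(p[0], p[1],
                                                          prevalence, u))
```

Comparing the MEU attained by two models' ROC curves under the same utilities gives a single clinically interpretable figure of merit, which is how the utility framework augments plain AUC comparison.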

  4. Utility of Social Modeling in Assessment of a State’s Propensity for Nuclear Proliferation

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Whitney, Paul D.; Dalton, Angela C.; Olson, Jarrod; White, Amanda M.; Cooley, Scott K.; Youchak, Paul M.; Stafford, Samuel V.

    2011-06-01

    This report is the third and final report in a set of three documenting research for the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA) Office of Nonproliferation Research and Development NA-22 Simulations, Algorithms, and Modeling program, which investigates how social modeling can be used to improve proliferation assessment for informing nuclear security, policy, safeguards, nuclear system design, and research decisions. Social modeling has not, to date, been used to any significant extent in proliferation studies. This report focuses on the utility of social modeling as applied to the assessment of a State's propensity to develop a nuclear weapons program.

  5. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems, such as software-based emergency shutdown systems of nuclear power plants, and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency: the fault tree generated from natural language specifications may contain flaws or errors, while the manual work of safety verification is labor-intensive and time-consuming. In this paper, we propose a new approach to resolve these problems: we generate a fault tree from a symbolic model verifier (SMV) model, rather than from natural language specifications, and verify safety properties automatically, rather than manually, with the SMV model checker. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of the Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  6. Assessing the utility of thermodynamic features for microRNA target prediction under relaxed seed and no conservation requirements.

    Directory of Open Access Journals (Sweden)

    Parawee Lekprasert

    Full Text Available BACKGROUND: Many computational microRNA target prediction tools focus on several key features, including complementarity to the 5' seed of miRNAs and evolutionary conservation. While these features allow for successful target identification, not all miRNA target sites are conserved or adhere to canonical seed complementarity. Several studies have advocated the use of energy features of mRNA:miRNA duplexes as an alternative. However, different independent evaluations reported conflicting results on the reliability of energy-based predictions. Here, we reassess the usefulness of energy features for mammalian target prediction, aiming to relax or eliminate the need for perfect seed matches and the conservation requirement. METHODOLOGY/PRINCIPAL FINDINGS: We detect significant differences of energy features at experimentally supported human miRNA target sites and at genome-wide sites of AGO protein interaction. This trend is confirmed on datasets that assay the effect of miRNAs on mRNA and protein expression changes, and a simple linear regression model leads to significant correlation of predicted versus observed expression change. Compared to 6-mer seed matches as baseline, application of our energy-based model leads to ∼3-5-fold enrichment on highly down-regulated targets, and allows for prediction of strictly imperfect targets with enrichment above baseline. CONCLUSIONS/SIGNIFICANCE: In conclusion, our results indicate significant promise for energy-based miRNA target prediction that includes a broader range of targets without having to use conservation or impose stringent seed match rules.

  7. Research on the Prediction Model of CPU Utilization Based on ARIMA-BP Neural Network

    Directory of Open Access Journals (Sweden)

    Wang Jina

    2016-01-01

    Full Text Available Dynamic deployment of virtual machines is a current focus of cloud computing research. Traditional methods act only after service performance has degraded, and therefore lag behind demand. To solve this problem, a new prediction model based on CPU utilization is constructed in this paper. The new CPU utilization prediction model gives the VM dynamic deployment process a reference that allows deployment to finish before service performance degrades. This method not only ensures quality of service but also improves server performance and resource utilization. The new CPU utilization prediction method based on the ARIMA-BP neural network comprises four parts: preprocessing the collected data, building the ARIMA-BP neural network predictive model, correcting the nonlinear residuals of the time series with the BP prediction algorithm, and obtaining the prediction results through comprehensive analysis of the above data.
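The hybrid idea in this record, a linear time-series model whose residuals are corrected by a backpropagation network, can be sketched with NumPy alone. The sketch below substitutes a simple AR(2) least-squares fit for the full ARIMA step and a hand-rolled one-hidden-layer network for the BP step; the CPU-utilization trace and all hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic CPU-utilization trace (%): linear AR dynamics plus a nonlinearity.
n = 400
u = np.zeros(n)
u[0], u[1] = 50.0, 52.0
for t in range(2, n):
    u[t] = (20 + 0.5 * u[t - 1] + 0.1 * u[t - 2]
            + 5 * np.sin(u[t - 1] / 10)      # component a linear model misses
            + rng.normal(0.0, 1.0))

# Step 1: linear AR(2) fit by least squares (stand-in for the full ARIMA step).
y = u[2:]
X = np.column_stack([np.ones(n - 2), u[1:n - 1], u[0:n - 2]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
lin_pred = X @ coef
resid = y - lin_pred

# Step 2: BP (backpropagation) network learns the residual from the same lags.
Z = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)   # standardized inputs
H, lr = 8, 0.05
W1 = rng.normal(0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, H);      b2 = 0.0
for _ in range(2000):                    # full-batch gradient descent on MSE
    h = np.tanh(Z @ W1 + b1)
    err = (h @ W2 + b2) - resid
    dh = np.outer(err, W2) * (1 - h ** 2)
    W2 -= lr * (h.T @ err) / len(err); b2 -= lr * err.mean()
    W1 -= lr * (Z.T @ dh) / len(err);  b1 -= lr * dh.mean(0)

# Step 3: hybrid prediction = linear prediction + learned residual correction.
hybrid = lin_pred + (np.tanh(Z @ W1 + b1) @ W2 + b2)
rmse_lin = float(np.sqrt(np.mean((y - lin_pred) ** 2)))
rmse_hyb = float(np.sqrt(np.mean((y - hybrid) ** 2)))
print(f"AR-only RMSE: {rmse_lin:.3f}  hybrid RMSE: {rmse_hyb:.3f}")
```

The design choice mirrors the paper's structure: the linear stage captures the bulk of the dynamics, and the network only has to model what the linear stage leaves behind.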

  8. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page, and of page transitions in the system, by directly operating the prototype. We propose a mapping rule in which design information independent of any particular web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  9. A prospective overview of the essential requirements in molecular modeling for nanomedicine design.

    Science.gov (United States)

    Kumar, Pradeep; Khan, Riaz A; Choonara, Yahya E; Pillay, Viness

    2013-05-01

    Nanotechnology has presented many new challenges and opportunities in the area of nanomedicine design. The issues related to nanoconjugation, nanosystem-mediated targeted drug delivery, transitional stability of nanovehicles, the integrity of drug transport, drug-delivery mechanisms and chemical structural design require a pre-estimated and determined course of assumptive actions with property and characteristic estimations for optimal nanomedicine design. Molecular modeling in nanomedicine encompasses these pre-estimations and predictions of pertinent design data via interactive computographic software. Recently, an increasing amount of research has been reported where specialized software is being developed and employed in an attempt to bridge the gap between drug discovery, materials science and biology. This review provides an assimilative and concise incursion into the current and future strategies of molecular-modeling applications in nanomedicine design and aims to describe the utilization of molecular models and theoretical-chemistry computographic techniques for expansive nanomedicine design and development.

  10. Life sciences research in space: The requirement for animal models

    Science.gov (United States)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  11. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  12. THE ROLE OF TECHNICAL CONSUMPTION CALCULATION MODELS ON ACCOUNTING INFORMATION SYSTEMS OF PUBLIC UTILITIES SERVICES OPERATORS

    Directory of Open Access Journals (Sweden)

    GHEORGHE CLAUDIU FEIES

    2012-05-01

    Full Text Available Studying how the operators' management works reveals the influence of the specific activities of public utilities on their financial accounting systems. These systems also exhibit an asymmetry, resulting from their organization and the specific services provided, which implies a close link between the financial accounting system and the specialized technical department. The research methodology consists in observing the specific activities of public utility operators and their influence on the information system, together with an analysis of views presented in work published in several journals. The paper analyses the impact of the technical computing models used by public utility community services on the financial statements, and therefore on the information the accounting information system provides to stakeholders.

  13. 77 FR 25525 - Requirements and Registration for the U.S. DOT Motorcoach Safety Data Utilization Student Challenge

    Science.gov (United States)

    2012-04-30

    ... winning student-developed applications or Web sites for mobile devices will be showcased at a U.S. DOT or... personal computer, a mobile handheld device, console, or any platform broadly accessible on the open... Safety Data Utilization Student Challenge AGENCY: Federal Motor Carrier Safety Administration...

  14. Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms

    Science.gov (United States)

    1990-09-01

    Technical Report 911: Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms (scanned report cover; no abstract available).

  15. Survey review of models for use in market penetration analysis: utility sector focus

    Energy Technology Data Exchange (ETDEWEB)

    Groncki, P.J.; Kydes, A.S.; Lamontagne, J.; Marcuse, W.; Vinjamuri, G.

    1980-11-01

    The ultimate benefits of federal expenditures in research and development for new technologies are dependent upon the degree of acceptance of these technologies. Market penetration considerations are central to the problem of quantifying the potential benefits. These benefits are inputs to the selection process of projects competing for finite R and D funds. Market penetration is the gradual acceptance of a new commodity or technology. The Office of Coal Utilization is concerned with the specialized area of market penetration of new electric power generation technologies for both replacement and new capacity. The common measure of market penetration is the fraction of the market serviced by the challenging technology for each time point considered. The methodologies for estimating market penetration are divided into three generic classes: integrated energy/economy modeling systems, utility capacity expansion models, and technology substitution models. In general, the integrated energy/economy modeling systems have three advantages: they provide internally consistent macro energy-economy scenarios, they account for the effect of prices on demand by fuel form, and they explicitly capture the effects of population growth and the level and structure of economic activity on energy demand. A variety of deficiencies appear in most energy-economy systems models. All of the methodologies may be applied at some level to questions of market penetration of new technologies in the utility sector; choice of methods for a particular analysis must be conditioned by the scope of the analysis, data availability, and the relative cost of alternative analysis.

  16. On the Path to SunShot - Utility Regulatory Business Model Reforms forAddressing the Financial Impacts of Distributed Solar on Utilities

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-05-01

    Net-energy metering (NEM) with volumetric retail electricity pricing has enabled rapid proliferation of distributed photovoltaics (DPV) in the United States. However, this transformation is raising concerns about the potential for higher electricity rates and cost-shifting to non-solar customers, reduced utility shareholder profitability, reduced utility earnings opportunities, and inefficient resource allocation. Although DPV deployment in most utility territories remains too low to produce significant impacts, these concerns have motivated real and proposed reforms to utility regulatory and business models, with profound implications for future DPV deployment. This report explores the challenges and opportunities associated with such reforms in the context of the U.S. Department of Energy’s SunShot Initiative. As such, the report focuses on a subset of a broader range of reforms underway in the electric utility sector. Drawing on original analysis and existing literature, we analyze the significance of DPV’s financial impacts on utilities and non-solar ratepayers under current NEM rules and rate designs, the projected effects of proposed NEM and rate reforms on DPV deployment, and alternative reforms that could address utility and ratepayer concerns while supporting continued DPV growth. We categorize reforms into one or more of four conceptual strategies. Understanding how specific reforms map onto these general strategies can help decision makers identify and prioritize options for addressing specific DPV concerns that balance stakeholder interests.

  17. Modeling of Testability Requirement Based on Generalized Stochastic Petri Nets

    Institute of Scientific and Technical Information of China (English)

    SU Yong-ding; QIU Jing; LIU Guan-jun; QIAN Yan-ling

    2009-01-01

    Testability design is an effective way to realize fault detection and isolation. An important step is determining the testability figures of merit (TFOM). Firstly, some factors influencing TFOMs are analyzed, such as the processes of system operation, maintenance and support, and fault detection and isolation. Secondly, a testability requirement analysis model is built based on a generalized stochastic Petri net (GSPN). Then the system's reachable states are analyzed based on the model, a Markov chain isomorphic with the Petri net is constructed, a state transition matrix is created, and the system's steady-state probability is obtained. The relationship between steady-state availability and testability parameters can then be revealed and reasoned about. Finally, an example shows that the proposed method can determine TFOMs, such as the fault detection rate and fault isolation rate, effectively and reasonably.
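The Markov step of the pipeline above (reachability graph → isomorphic Markov chain → steady-state probabilities → availability) reduces to solving one linear system. The three-state sketch below is a minimal illustration of that step with invented rates, not the paper's model.

```python
import numpy as np

def steady_state(Q):
    """Steady-state distribution pi of a CTMC generator Q: pi @ Q = 0, sum(pi) = 1."""
    n = Q.shape[0]
    A = Q.T.copy()
    A[-1, :] = 1.0                 # replace one balance equation by normalization
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Three-state sketch: 0 = operating, 1 = fault latent (undetected), 2 = under repair.
lam   = 0.01   # failure rate (per hour)                 -- all rates invented
cov   = 0.9    # fraction of faults detected immediately (fault detection rate)
delta = 0.5    # rate at which a latent fault is found by periodic testing
mu    = 0.2    # repair rate

Q = np.array([
    [-lam, lam * (1 - cov), lam * cov],   # failure, detected with probability cov
    [0.0,  -delta,          delta    ],   # latent fault eventually detected
    [mu,   0.0,             -mu      ],   # repair restores operation
])

pi = steady_state(Q)
print(f"steady-state availability = {pi[0]:.4f}")
```

Raising the detection coverage `cov` (or the latent-fault detection rate `delta`) shrinks the probability mass in the latent-fault state and raises availability, which is exactly the TFOM-to-availability relationship the record reasons about.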

  18. The Min system and nucleoid occlusion are not required for identifying the division site in Bacillus subtilis but ensure its efficient utilization.

    Directory of Open Access Journals (Sweden)

    Christopher D A Rodrigues

    Full Text Available Precise temporal and spatial control of cell division is essential for progeny survival. The current general view is that precise positioning of the division site at midcell in rod-shaped bacteria is a result of the combined action of the Min system and nucleoid (chromosome occlusion. Both systems prevent assembly of the cytokinetic Z ring at inappropriate places in the cell, restricting Z rings to the correct site at midcell. Here we show that in the bacterium Bacillus subtilis Z rings are positioned precisely at midcell in the complete absence of both these systems, revealing the existence of a mechanism independent of Min and nucleoid occlusion that identifies midcell in this organism. We further show that Z ring assembly at midcell is delayed in the absence of Min and Noc proteins, while at the same time FtsZ accumulates at other potential division sites. This suggests that a major role for Min and Noc is to ensure efficient utilization of the midcell division site by preventing Z ring assembly at potential division sites, including the cell poles. Our data lead us to propose a model in which spatial regulation of division in B. subtilis involves identification of the division site at midcell that requires Min and nucleoid occlusion to ensure efficient Z ring assembly there and only there, at the right time in the cell cycle.

  19. A Buck-Boost Converter Modified to Utilize 600V GaN Power Devices in a PV Application Requiring 1200V Devices

    OpenAIRE

    2015-01-01

    This paper presents a buck-boost converter which is modified to utilize new 600 V gallium nitride (GaN) power semiconductor devices in an application requiring 1200 V devices. The presented buck-boost converter is used as a part of a dc/dc stage in an all-GaN photovoltaic (PV) inverter and it provides a negative voltage for the 3-level neutral-point-clamped (NPC) PWM inverter which is connected to the utility grid. Since in this application the transistor and the diode of ...

  20. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition.

  1. A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties

    Science.gov (United States)

    1992-05-01

    AL-TR-1992-0006 (AD-A252 322): A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties, Brice M. Stone et al. (scanned report cover; no abstract available).

  2. Federal and State Structures to Support Financing Utility-Scale Solar Projects and the Business Models Designed to Utilize Them

    Energy Technology Data Exchange (ETDEWEB)

    Mendelsohn, M.; Kreycik, C.

    2012-04-01

    Utility-scale solar projects have grown rapidly in number and size over the last few years, driven in part by strong renewable portfolio standards (RPS) and federal incentives designed to stimulate investment in renewable energy technologies. This report provides an overview of such policies, as well as the project financial structures they enable, based on industry literature, publicly available data, and questionnaires conducted by the National Renewable Energy Laboratory (NREL).

  3. Aspen Plus® and economic modeling of equine waste utilization for localized hot water heating via fast pyrolysis.

    Science.gov (United States)

    Hammer, Nicole L; Boateng, Akwasi A; Mullen, Charles A; Wheeler, M Clayton

    2013-10-15

    Aspen Plus(®)-based simulation models have been developed to design a pyrolysis process for on-site production and utilization of pyrolysis oil from equine waste at the Equine Rehabilitation Center at Morrisville State College (MSC). The results indicate that utilizing all the available waste from the site's 41 horses requires a 6 oven dry metric ton per day (ODMTPD) pyrolysis system, while a 15 ODMTPD system would be required for the waste generated by an additional 150 horses in the expanded area including the College and its vicinity. For this, a dual fluidized bed combustion reduction integrated pyrolysis system (CRIPS) developed at USDA's Agricultural Research Service (ARS) was identified as the technology of choice for pyrolysis oil production. The Aspen Plus(®) model was further used to consider combustion of the produced pyrolysis oil (bio-oil) in the existing boilers that generate hot water for space heating at the Equine Center. The model results show the potential for both the equine facility and the College to displace diesel fuel (fossil) with renewable pyrolysis oil and alleviate a costly waste disposal problem. We predict that all the heat required to operate the pyrolyzer could be supplied by the non-condensable gas and about 40% of the biochar co-produced with the bio-oil. Techno-economic analysis shows that neither design is economical at current market conditions; however, the 15 ODMTPD CRIPS design would break even when diesel prices reach $11.40/gal. This can be further improved to $7.50/gal if the design capacity is maintained at 6 ODMTPD but operated at 4950 h per annum.
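Break-even figures like the ones above come from a simple annualized-cost calculation: the fuel price at which annual savings from displaced diesel equal the annualized capital plus operating cost of the pyrolysis system. Every input in this sketch (capital cost, discount rate, lifetime, opex, gallons displaced) is a hypothetical placeholder chosen only to show the arithmetic, not a value from the study.

```python
def capital_recovery_factor(rate, years):
    """Annualize a capital cost over `years` at discount `rate`."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def breakeven_price(capital, crf, annual_opex, gallons_displaced):
    """Diesel price ($/gal) at which annual fuel savings equal annualized cost."""
    return (capital * crf + annual_opex) / gallons_displaced

crf = capital_recovery_factor(0.08, 15)             # 8% discount rate, 15-year life
price = breakeven_price(capital=2_500_000, crf=crf,
                        annual_opex=120_000, gallons_displaced=40_000)
print(f"break-even diesel price: ${price:.2f}/gal")
```

The same arithmetic explains why running a smaller system for more hours per year lowers the break-even price: more gallons displaced spread the same annualized cost more thinly.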

  4. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower-level models to a high-level system model during design space exploration. This complicates the validation of high-level system requirements during

  5. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist which take into consideration various aspects and approaches focused on understanding customer preferences and identifying the customer's priorities for a product on sale. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved quality attributes and customer requirements. The individual quality attributes are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes and should contain as many one-dimensional attributes as possible. If supplementary attractive attributes are also present, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on a potential customer's decision to purchase the product. In this article, we show that the inclusion of a product's individual quality attributes in the mentioned categories depends, among other things, on the life-cycle costs of the product, or respectively on its market price. Findings: In practice, we often encounter the classification of products into different price categories: lower, middle and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry), or is determined by the customer by assessing available market prices. To each of these groups of products different customer expectations can be assigned

  6. Modelling EuroQol health-related utility values for diabetic complications from CODE-2 data.

    Science.gov (United States)

    Bagust, Adrian; Beale, Sophie

    2005-03-01

    Recent research has employed different analytical techniques to estimate the impact of the various long-term complications of type 2 diabetes on health-related utility and health status. However, limited patient numbers or lack of variety of patient experience has limited their power to discriminate between separate complications and grades of severity. In this study alternative statistical model forms were compared to investigate the influence of various factors on self-assessed health status and calculated utility scores, including the presence and severity of complications, and type of diabetes therapy. Responses to the EuroQol EQ-5D questionnaire from 4641 patients with type 2 diabetes in 5 European countries were analysed. Simple multiple regression analysis was used to model both visual analogue scale (VAS) scores and time trade-off index scores (TTO). Also, two complex models were developed for TTO analysis using a structure suggested by the EuroQol calculation algorithm. Both VAS and TTO models achieved greater explanatory power than in earlier studies. Relative weightings for individual complications differed between VAS and TTO scales, reflecting the strong influence of loss of mobility and severe pain in the EuroQol algorithm. Insulin-based therapy was uniformly associated with a detrimental effect equivalent to an additional moderate complication. Evidence was found that TTO values are not responsive in cases where 3 or more multiple complications are present, and therefore may underestimate utility loss for patients most adversely affected by complex chronic diseases like diabetes.

  7. Region-specific study of the electric utility industry: financial history and future power requirements for the VACAR region

    Energy Technology Data Exchange (ETDEWEB)

    Pochan, M.J.

    1985-07-01

    Financial data for the period 1966 to 1981 are presented for the four investor-owned electric utilities in the VACAR (Virginia-Carolinas) region. This region was selected as representative for the purpose of assessing the availability, reliability, and cost of electric power for the future in the United States. The estimated demand for power and planned additions to generating capacity for the region through the year 2000 are also given.

  8. Extending the Utility of the Parabolic Approximation in Medical Ultrasound Using Wide-Angle Diffraction Modeling.

    Science.gov (United States)

    Soneson, Joshua E

    2017-04-01

    Wide-angle parabolic models are commonly used in geophysics and underwater acoustics but have seen little application in medical ultrasound. Here, a wide-angle model for continuous-wave high-intensity ultrasound beams is derived, which approximates the diffraction process more accurately than the commonly used Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation without increasing implementation complexity or computing time. A method for preventing the high spatial frequencies often present in source boundary conditions from corrupting the solution is presented. Simulations of shallowly focused axisymmetric beams using both the wide-angle and standard parabolic models are compared to assess the accuracy with which they model diffraction effects. The wide-angle model proposed here offers improved focusing accuracy and less error throughout the computational domain than the standard parabolic model, offering a facile method for extending the utility of existing KZK codes.

  9. [Utilization of feed energy by growing pigs. 3. Energy requirement for the growth and fattening of pigs].

    Science.gov (United States)

    Hoffmann, L; Schiemann, R; Jentsch, W

    1979-02-01

The test series investigating the energy metabolism of growing pigs of the breeds Large White and Improved Landrace, as well as crosses of the two breeds, comprising a total of 369 metabolism periods (described in the first two papers of this series -- Hoffmann et al., 1977 and Jentsch and Hoffmann, 1977), were statistically analysed to derive the energy requirement for maintenance and the partial energy requirement for growth, and to test the suitability of factorial analysis for deriving energy requirement values for growing pigs. The dependence of the maintenance requirement of growing pigs on live weight (investigations in the 10 to 40 kg live weight range -- see 1st paper -- were made with boars, those in the 30 to 120 kg range with castrated boars, 2nd paper) is best characterised by applying a power exponent of 0.61 or 0.62 to the live weight. A definition of the energetic maintenance requirement of productive livestock and laboratory animals as a conventional value is offered for discussion. The energy requirement values derived from the two-factor statistical analysis fit the measured values satisfactorily with respect to energy intake and the observed growth performance of the test animals. It is concluded that factorial analysis of the energy requirement (maintenance plus partial performances) yields a better estimate of the requirement of growing animals than assessment based only on live weight and live weight gain without characterising the energy requirement for partial performances. This is important for the further development and more precise definition of requirement standards.
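The power-law dependence of maintenance requirement on live weight reported above can be sketched numerically; the exponent (0.62, one of the two values reported) comes from the abstract, while the proportionality coefficient is a hypothetical placeholder:

```python
# Maintenance energy requirement scaling with live weight. The exponent is
# taken from the abstract; the coefficient k is a hypothetical placeholder,
# NOT a value from the study.
def maintenance_energy(live_weight_kg, k=1.0, exponent=0.62):
    """Maintenance requirement proportional to live_weight_kg ** exponent."""
    return k * live_weight_kg ** exponent

# Doubling live weight from 30 kg to 60 kg raises the maintenance
# requirement by a factor of 2 ** 0.62 (about 1.54), not 2.
ratio = maintenance_energy(60.0) / maintenance_energy(30.0)
print(round(ratio, 2))  # → 1.54
```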

  10. Utilizing neural networks in magnetic media modeling and field computation: A review

    Directory of Open Access Journals (Sweden)

    Amr A. Adly

    2014-11-01

Magnetic materials are considered as crucial components for a wide range of products and devices. Usually, complexity of such materials is defined by their permeability classification and coupling extent to non-magnetic properties. Hence, development of models that could accurately simulate the complex nature of these materials becomes crucial to the multi-dimensional field-media interactions and computations. In the past few decades, artificial neural networks (ANNs) have been utilized in many applications to perform miscellaneous tasks such as identification, approximation, optimization, classification and forecasting. The purpose of this review article is to give an account of the utilization of ANNs in modeling as well as field computation involving complex magnetic materials. Mostly used ANN types in magnetics, advantages of this usage, detailed implementation methodologies as well as numerical examples are given in the paper.

  11. Utilizing neural networks in magnetic media modeling and field computation: A review.

    Science.gov (United States)

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2014-11-01

    Magnetic materials are considered as crucial components for a wide range of products and devices. Usually, complexity of such materials is defined by their permeability classification and coupling extent to non-magnetic properties. Hence, development of models that could accurately simulate the complex nature of these materials becomes crucial to the multi-dimensional field-media interactions and computations. In the past few decades, artificial neural networks (ANNs) have been utilized in many applications to perform miscellaneous tasks such as identification, approximation, optimization, classification and forecasting. The purpose of this review article is to give an account of the utilization of ANNs in modeling as well as field computation involving complex magnetic materials. Mostly used ANN types in magnetics, advantages of this usage, detailed implementation methodologies as well as numerical examples are given in the paper.

  12. Piloting Utility Modeling Applications (PUMA): Planning for Climate Change at the Portland Water Bureau

    Science.gov (United States)

    Heyn, K.; Campbell, E.

    2016-12-01

    The Portland Water Bureau has been studying the anticipated effects of climate change on its primary surface water source, the Bull Run Watershed, since the early 2000's. Early efforts by the bureau were almost exclusively reliant on outside expertise from climate modelers and researchers, particularly those at the Climate Impacts Group (CIG) at the University of Washington. Early work products from CIG formed the basis of the bureau's understanding of the most likely and consequential impacts to the watershed from continued GHG-caused warming. However, by mid-decade, as key supply and demand conditions for the bureau changed, it found it lacked the technical capacity and tools to conduct more refined and updated research to build on the outside analysis it had obtained. Beginning in 2010 through its participation in the Pilot Utility Modeling Applications (PUMA) project, the bureau identified and began working to address the holes in its technical and institutional capacity by embarking on a process to assess and select a hydrologic model while obtaining downscaled climate change data to utilize within it. Parallel to the development of these technical elements, the bureau made investments in qualified staff to lead the model selection, development and utilization, while working to establish productive, collegial and collaborative relationships with key climate research staff at the Oregon Climate Change Research Institute (OCCRI), the University of Washington and the University of Idaho. This presentation describes the learning process of a major metropolitan area drinking water utility as its approach to addressing the complex problem of climate change evolves, matures, and begins to influence broader aspects of the organization's planning efforts.

  13. Changes in fibrinogen availability and utilization in an animal model of traumatic coagulopathy

    DEFF Research Database (Denmark)

    Hagemo, Jostein S; Jørgensen, Jørgen; Ostrowski, Sisse R

    2013-01-01

    Impaired haemostasis following shock and tissue trauma is frequently detected in the trauma setting. These changes occur early, and are associated with increased mortality. The mechanism behind trauma-induced coagulopathy (TIC) is not clear. Several studies highlight the crucial role of fibrinogen...... in posttraumatic haemorrhage. This study explores the coagulation changes in a swine model of early TIC, with emphasis on fibrinogen levels and utilization of fibrinogen....

  14. Evaluating the Impact of the Healthy Beginnings System of Care Model on Pediatric Emergency Department Utilization.

    Science.gov (United States)

    Tan, Cheryl H; Gazmararian, Julie

    2017-03-01

    The aim of this study was to evaluate whether enrollment in the Healthy Beginnings System of Care (SOC) model is associated with a decrease in emergency department (ED) visits among children aged 6 months to 5.5 years. A retrospective, longitudinal study of ED utilization was conducted among children enrolled in the Healthy Beginnings SOC model between February 2011 and May 2013. Using medical records obtained from a children's hospital in Atlanta, the rate of ED visits per quarter was examined as the main outcome. A multilevel, multivariate Poisson model, with family- and child-level random effects, compared ED utilization rates before and after enrollment. Adjusted rate ratios and 95% confidence intervals were calculated after controlling for sociodemographic confounders. The effect of SOC enrollment on the rate of ED visits differed by income level of the primary parent. The rate of ED visits after enrollment was not significantly different than the rate of ED visits before enrollment for children whose primary parent had an annual income of less than $5000 (P = 0.298), $20,000 to $29,999 (P = 0.199), or $30,000 or more (P = 0.117). However, for the children whose primary parent's annual income was $5000 to $19,999, the rate of ED visits after enrollment was significantly higher than the rate of ED visits before enrollment (adjusted rate ratio, 1.48; 95% confidence interval, 1.17-1.87). Enrollment in the SOC model does not appear to decrease the rate of ED visits among enrolled children. Additional strategies, such as education sessions on ED utilization, are needed to reduce the rate of ED utilization among SOC-enrolled children.
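The rate ratios above come from a multilevel Poisson model with covariate adjustment. As a much simpler sketch of the underlying quantity, an unadjusted rate ratio and its Wald 95% CI can be computed from event counts and person-time; the counts below are invented, not data from the study.

```python
import math

def rate_ratio_ci(events_post, time_post, events_pre, time_pre, z=1.96):
    """Unadjusted rate ratio (post vs pre) with a Wald CI on the log scale."""
    rr = (events_post / time_post) / (events_pre / time_pre)
    se_log = math.sqrt(1 / events_post + 1 / events_pre)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 60 ED visits over 200 child-quarters after enrollment
# versus 50 visits over 200 child-quarters before.
rr, lo, hi = rate_ratio_ci(60, 200, 50, 200)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Because the CI here crosses 1, this toy comparison would not be statistically significant; the study's adjusted estimates additionally account for family- and child-level random effects, which this calculation does not reproduce.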

  15. The utility of comparative models and the local model quality for protein crystal structure determination by Molecular Replacement

    Directory of Open Access Journals (Sweden)

    Pawlowski Marcin

    2012-11-01

Background: Computational models of protein structures have been proved useful as search models in Molecular Replacement (MR), a common method to solve the phase problem faced by macromolecular crystallography. The success of MR depends on the accuracy of a search model. Unfortunately, this parameter remains unknown until the final structure of the target protein is determined. During the last few years, several Model Quality Assessment Programs (MQAPs) that predict the local accuracy of theoretical models have been developed. In this article, we analyze whether the application of MQAPs improves the utility of theoretical models in MR. Results: For our dataset of 615 search models, the real local accuracy of a model increases the MR success ratio by 101% compared to corresponding polyalanine templates. On the contrary, when local model quality is not utilized in MR, the computational models solved only 4.5% more MR searches than polyalanine templates. For the same dataset of 615 models, a workflow combining MR with the predicted local accuracy of a model found 45% more correct solutions than polyalanine templates. To predict this accuracy, MetaMQAPclust, a “clustering MQAP”, was used. Conclusions: Using comparative models only marginally increases the MR success ratio in comparison to polyalanine structures of templates. However, the situation changes dramatically once comparative models are used together with their predicted local accuracy. A new functionality was added to the GeneSilico Fold Prediction Metaserver in order to build models that are more useful for MR searches. Additionally, we have developed a simple method, AmIgoMR (Am I good for MR?), to predict whether an MR search with a template-based model for a given template is likely to find the correct solution.

  16. P Voltage Control of DFIG with Two-Mass-Shaft Turbine Model Under Utility Voltage Disturbance

    Directory of Open Access Journals (Sweden)

    Hengameh Kojooyan Jafari

    2016-06-01

Doubly fed induction generators (DFIGs) are used as variable-speed induction generators in wind power plants, connected to the grid through flexible controllers. One of the most important issues in wind farms today is controlling the output power delivered to the grid under utility disturbances. In this paper, a doubly fed induction generator with an external rotor resistance and a power-converter model, represented as an external voltage source with adjustable phase and amplitude, is combined with a turbine modelled with both a one-mass and a two-mass shaft, and a P voltage controller is used to regulate the output active power for typical high and low wind speeds under two utility-disturbance conditions: a short disturbance, during which the magnitude of the external rotor voltage source is unchanged, and a long disturbance, during which the magnitude of the external rotor voltage decreases. Simulation results show that the P voltage controller can maintain the output active power under a 27% stator voltage drop at typical low wind speed and an 11% stator voltage drop at typical high wind speed during a long disturbance, and under an 80% drop in the external rotor voltage magnitude during a short utility disturbance.
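A proportional (P) controller of the kind named above can be sketched in a few lines; the gain and the first-order "plant" below are hypothetical toys, not the DFIG/turbine model from the paper.

```python
# Minimal proportional (P) control loop: adjust a control voltage so the
# measured output power tracks a reference. Gain and plant response are
# hypothetical illustration values, not the paper's DFIG model.
def p_controller(p_ref, p_meas, kp):
    """Control action proportional to the tracking error."""
    return kp * (p_ref - p_meas)

p, v = 0.0, 0.0          # output power and control voltage (per unit)
for _ in range(50):
    v += p_controller(1.0, p, kp=0.5)   # integrate the P action into v
    p += 0.2 * (v - p)                  # toy first-order plant response

print(round(p, 3))       # settles near the 1.0 p.u. reference
```

With these toy dynamics the loop is stable and oscillates toward the reference; in the paper the equivalent role is played by adjusting the rotor-side voltage source.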

  17. [Research practices of conversion efficiency of resources utilization model of castoff from Chinese material medica industrialization].

    Science.gov (United States)

    Duan, Jin-Ao; Su, Shu-Lan; Guo, Sheng; Liu, Pei; Qian, Da-Wei; Jiang, Shu; Zhu, Hua-Xu; Tang, Yu-Ping; Wu, Qi-Nan

    2013-12-01

The industrial chains and products formed in producing medicinal materials, prepared drugs in pieces, and deep-processed products from Chinese materia medica (CMM) resources have generated large social and economic benefits. However, large amounts of herb-medicine castoff -- "non-medicinal parts" and "rejected materials" -- are inevitably produced during the production and processing of Chinese medicinal resources, and residues, waste water, and waste gas are produced during the manufacture of deep-processed CMM products. These lead to wasted resources and environmental pollution. Our previous research proposed "three utilization strategies" and "three types of resources models" for herb-medicine castoff according to the differing physicochemical properties, resource potential, and utility value of its constituents. This article focuses on the conversion-efficiency resources model and analyses the ways, technologies, practices, and applications of this model for herb-medicine castoff, based on recycling-economy theory and the resource chemistry of CMM. This work may help resolve key problems that have long limited the industrialization of Chinese materia medica and promote the utilization of herb-medicine castoff resources.

  18. Business model innovation for Local Energy Management: a perspective from Swiss utilities

    Directory of Open Access Journals (Sweden)

    Emanuele Facchinetti

    2016-08-01

The successful deployment of the energy transition relies on a deep reorganization of the energy market. Business model innovation is recognized as a key driver of this process. This work contributes to this topic by providing potential Local Energy Management stakeholders and policy makers with a conceptual framework guiding Local Energy Management business model innovation. The main determinants characterizing Local Energy Management concepts and impacting its business model innovation are identified through literature reviews on distributed generation typologies and customer/investor preferences related to new business opportunities emerging with the energy transition. Afterwards, the relation between the identified determinants and the Local Energy Management business model solution space is analyzed based on semi-structured interviews with managers of Swiss utility companies. The collected managers' preferences serve as explorative indicators supporting the business model innovation process and provide insights to policy makers on challenges and opportunities related to Local Energy Management.

  19. Case series demonstrating the clinical utility of dual energy computed tomography in patients requiring stents for urinary calculi.

    Science.gov (United States)

    Jepperson, Maria A; Thiel, David D; Cernigliaro, Joseph G; Broderick, Gregory A; Haley, William E

    2014-02-01

Dual energy computed tomography (DECT) utilizes the material change in attenuation when imaged at two different energies to determine the composition of urinary calculi as uric acid or non-uric acid. We discuss a series of case reports illustrating DECT's ability to provide immediate determination of uric acid versus non-uric acid calculi and facilitate more informed clinical decision-making. Further, these cases demonstrate a unique population of patients with ureteral stents and percutaneous nephrostomy tubes who benefit from DECT's ability to create a virtual color contrast between an indwelling device and the stone material, thereby significantly impacting patient morbidity.

  20. NAD-independent L-lactate dehydrogenase is required for L-lactate utilization in Pseudomonas stutzeri SDM.

    Directory of Open Access Journals (Sweden)

    Chao Gao

BACKGROUND: Various Pseudomonas strains can use L-lactate as their sole carbon source for growth. However, the L-lactate-utilizing enzymes in Pseudomonas have never been identified and further studied. METHODOLOGY/PRINCIPAL FINDINGS: An NAD-independent L-lactate dehydrogenase (L-iLDH) was purified from the membrane fraction of Pseudomonas stutzeri SDM. The enzyme catalyzes the oxidation of L-lactate to pyruvate by using FMN as cofactor. After cloning its encoding gene (lldD), L-iLDH was successfully expressed, purified from a recombinant Escherichia coli strain, and characterized. An lldD mutant of P. stutzeri SDM was constructed by gene knockout technology. This mutant was unable to grow on L-lactate, but retained the ability to grow on pyruvate. CONCLUSIONS/SIGNIFICANCE: It is proposed that L-iLDH plays an indispensable function in Pseudomonas L-lactate utilization by catalyzing the conversion of L-lactate into pyruvate.

  1. LSST camera heat requirements using CFD and thermal seeing modeling

    Science.gov (United States)

    Sebag, Jacques; Vogiatzis, Konstantinos

    2010-07-01

The LSST camera is located above the LSST primary/tertiary mirror and in front of the secondary mirror in the shadow of its central obscuration. Due to this position within the optical path, heat released from the camera has a potential impact on the seeing degradation that is larger than traditionally estimated for Cassegrain or Nasmyth telescope configurations. This paper presents the results of thermal seeing modeling combined with Computational Fluid Dynamics (CFD) analyses to define the thermal requirements on the LSST camera. Camera power output fluxes are applied to the CFD model as boundary conditions to calculate the steady-state temperature distribution on the camera and the air inside the enclosure. Using a previously presented post-processing analysis to calculate the optical seeing based on the mechanical turbulence and temperature variations along the optical path, the optical performance resulting from the seeing is determined. The CFD simulations are repeated for different wind speeds and orientations to identify the worst case scenario and generate an estimate of seeing contribution as a function of camera-air temperature difference. Finally, after comparing with the corresponding error budget term, a maximum allowable temperature for the camera is selected.

  2. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

First, to overcome the shortcomings of using AD or TRIZ alone and to solve the problems that currently exist in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ, and FA. We then construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model, and a requirement plan optimization model. Finally, we construct the computer-aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  3. Low-versus high-glycemic index diets in women: effects on caloric requirement, substrate utilization and insulin sensitivity.

    Science.gov (United States)

    Clapp, James F; Lopez, Beth

    2007-09-01

    Lowering dietary glycemic index appears to have positive health effects in obese and/or insulin resistant individuals. However, detailed studies in lean young men show no effect. This study was designed to test the null hypothesis that a diet rich in low-glycemic carbohydrate has no effect on lipid profile, caloric requirements, fat oxidation, or insulin sensitivity in adult women when compared to one rich in high-glycemic carbohydrate. The metabolic feeding protocol used was conducted in both a free-living and in-patient setting using a randomized crossover design. Seven women were studied on each of 2 diets in which 60% of the calories were from either high- or low-glycemic carbohydrate sources. Each diet lasted 20 days with measurements of caloric requirement, resting metabolic rate, glucose and insulin responses to diet and activity, insulin sensitivity, and lipid profile over the last 7 days. Caloric requirement was determined by bomb calorimetry. Other techniques included indirect calorimetry, hydrodensitometry, stable isotope tracers, and the euglycemic clamp. On the low-glycemic index diet the women's caloric requirements were 11% +/- 1% higher, fat oxidation at fasted rest supplied an average of 45% +/- 4% versus 28% +/- 5% of oxidative requirements, average glucose and insulin levels were approximately 40% lower, low density lipoproteins (LDL) and leptin concentrations were lower, and various indices of insulin sensitivity were > 20% higher. In this group of adult women, a diet that lowered glycemic index well below that typically found in western diets increased both daily caloric requirement and fat oxidation, decreased insulin and glucose concentrations and increased insulin sensitivity.

  4. Clinical utility of the DSM-5 alternative model of personality disorders: six cases from practice.

    Science.gov (United States)

    Bach, Bo; Markon, Kristian; Simonsen, Erik; Krueger, Robert F

    2015-01-01

    In Section III, Emerging Measures and Models, DSM-5 presents an Alternative Model of Personality Disorders, which is an empirically based model of personality pathology measured with the Level of Personality Functioning Scale (LPFS) and the Personality Inventory for DSM-5 (PID-5). These novel instruments assess level of personality impairment and pathological traits. Objective. A number of studies have supported the psychometric qualities of the LPFS and the PID-5, but the utility of these instruments in clinical assessment and treatment has not been extensively evaluated. The goal of this study was to evaluate the clinical utility of this alternative model of personality disorders. Method. We administered the LPFS and the PID-5 to psychiatric outpatients diagnosed with personality disorders and other nonpsychotic disorders. The personality profiles of six characteristic patients were inspected (involving a comparison of presenting problems, history, and diagnoses) and used to formulate treatment considerations. We also considered 6 specific personality disorder types that could be derived from the profiles as defined in the DSM-5 Section III criteria. Results. Using the LPFS and PID-5, we were able to characterize the 6 cases in a meaningful and useful manner with regard to understanding and treatment of the individual patient and to match the cases with 6 relevant personality disorder types. Implications for ease of use, communication, and psychotherapy are discussed. Conclusion. Our evaluation generally supported the utility for clinical purposes of the Alternative Model for Personality Disorders in Section III of the DSM-5, although it also identified some areas for refinement. (Journal of Psychiatric Practice 2015;21:3-25).

  5. Modeling the hydrologic and economic efficacy of stormwater utility credit programs for US single family residences.

    Science.gov (United States)

    Kertesz, Ruben; Green, Olivia Odom; Shuster, William D

    2014-01-01

    As regulatory pressure to reduce the environmental impact of urban stormwater intensifies, US municipalities increasingly seek a dedicated source of funding for stormwater programs, such as a stormwater utility. In rare instances, single family residences are eligible for utility discounts for installing green infrastructure. This study examined the hydrologic and economic efficacy of four such programs at the parcel scale: Cleveland (OH), Portland (OR), Fort Myers (FL), and Lynchburg (VA). Simulations were performed to model the reduction in stormwater runoff by implementing bioretention on a typical residential property according to extant administrative rules. The EPA National Stormwater Calculator was used to perform pre- vs post-retrofit comparisons and to demonstrate its ease of use for possible use by other cities in utility planning. Although surface slope, soil type and infiltration rate, impervious area, and bioretention parameters were different across cities, our results suggest that modeled runoff volume was most sensitive to percent of total impervious area that drained to the bioretention cell, with soil type the next most important factor. Findings also indicate a persistent gap between the percentage of annual runoff reduced and the percentage of fee reduced.
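The sensitivity result above (the percent of impervious area routed to the cell dominating runoff reduction) can be illustrated with a first-order volume balance. Every parameter value here is a hypothetical placeholder and does not come from the EPA National Stormwater Calculator or any of the four city programs.

```python
# First-order sketch of parcel-scale runoff reduction from bioretention.
# All values (rainfall, areas, coefficients) are hypothetical.
def annual_runoff_m3(rain_m, area_m2, runoff_coeff):
    """Annual runoff volume from an area with a bulk runoff coefficient."""
    return rain_m * area_m2 * runoff_coeff

def runoff_with_bioretention(rain_m, imperv_m2, frac_routed, capture_eff,
                             runoff_coeff=0.95):
    """Impervious-area runoff when a fraction of that area drains to a
    bioretention cell capturing a share of the routed volume."""
    base = annual_runoff_m3(rain_m, imperv_m2, runoff_coeff)
    captured = base * frac_routed * capture_eff
    return base - captured

base = annual_runoff_m3(1.0, 200.0, 0.95)                # 190 m^3/yr
post = runoff_with_bioretention(1.0, 200.0, 0.5, 0.8)    # half the area routed
print(f"reduction: {100 * (base - post) / base:.0f}%")   # → reduction: 40%
```

In this simplification the reduction equals `frac_routed * capture_eff`, which is why the routed fraction is the first-order driver, consistent with the modeled sensitivity reported above.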

  6. Requirements-Driven Deployment: Customizing the Requirements Model for the Host Environment

    NARCIS (Netherlands)

    Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo

    2014-01-01

Deployment is a major development phase that configures software to be ready for use in a certain environment. The ultimate goal of deployment is to enable users to achieve their requirements while using the deployed software. However, requirements are not uniform and differ between deployment environments.

  7. Evaluating components of dental care utilization among adults with diabetes and matched controls via hurdle models

    Directory of Open Access Journals (Sweden)

    Chaudhari Monica

    2012-07-01

Background About one-third of adults with diabetes have severe oral complications. However, limited previous research has investigated dental care utilization associated with diabetes. This project had two purposes: to develop a methodology to estimate dental care utilization using claims data and to use this methodology to compare utilization of dental care between adults with and without diabetes. Methods Data included secondary enrollment and demographic data from Washington Dental Service (WDS) and Group Health Cooperative (GH), clinical data from GH, and dental-utilization data from WDS claims during 2002–2006. Dental and medical records from WDS and GH were linked for enrolees continuously and dually insured during the study. We employed hurdle models in a quasi-experimental setting to assess differences between adults with and without diabetes in 5-year cumulative utilization of dental services. Propensity score matching adjusted for differences in baseline covariates between the two groups. Results We found that adults with diabetes had lower odds of visiting a dentist (OR = 0.74, p < 0.001). Among those with a dental visit, diabetes patients had lower odds of receiving prophylaxes (OR = 0.77), fillings (OR = 0.80) and crowns (OR = 0.84) (p < 0.005 for all) and higher odds of receiving periodontal maintenance (OR = 1.24), non-surgical periodontal procedures (OR = 1.30), extractions (OR = 1.38) and removable prosthetics (OR = 1.36) (p < 0.005 for all). Conclusions Patients with diabetes are less likely to use dental services. Those who do are less likely to use preventive care and more likely to receive periodontal care and tooth-extractions. Future research should address the possible effectiveness of additional prevention in reducing subsequent severe oral disease in patients with diabetes.
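A hurdle model separates the decision to use any care from the amount of care among users. A minimal numerical illustration of that two-part decomposition, with invented visit counts rather than WDS/GH data:

```python
# Two-part (hurdle) view of utilization: the overall mean number of visits
# factors into the probability of any visit times the mean count among
# those with at least one visit. Visit counts below are invented.
visits = [0, 0, 0, 2, 1, 0, 3, 0, 1, 3]

n = len(visits)
n_users = sum(1 for v in visits if v > 0)
p_any = n_users / n                                  # the "hurdle" part
mean_pos = sum(v for v in visits if v > 0) / n_users # the count part
overall_mean = sum(visits) / n

# The decomposition holds exactly: E[Y] = P(Y > 0) * E[Y | Y > 0]
print(p_any, mean_pos, overall_mean)  # → 0.5 2.0 1.0
```

A fitted hurdle model replaces these sample fractions with a binary regression for `p_any` and a zero-truncated count regression for `mean_pos`, each with its own covariates, which is what lets the study report separate odds ratios for visiting at all versus receiving particular procedures.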

  8. Clinical Utility of the DSM-5 Alternative Model of Personality Disorders

    DEFF Research Database (Denmark)

    Bach, Bo; Markon, Kristian; Simonsen, Erik

    2015-01-01

    In Section III, Emerging Measures and Models, DSM-5 presents an Alternative Model of Personality Disorders, which is an empirically based model of personality pathology measured with the Level of Personality Functioning Scale (LPFS) and the Personality Inventory for DSM-5 (PID-5). These novel...... (involving a comparison of presenting problems, history, and diagnoses) and used to formulate treatment considerations. We also considered 6 specific personality disorder types that could be derived from the profiles as defined in the DSM-5 Section III criteria. Results. Using the LPFS and PID-5, we were...... evaluation generally supported the utility for clinical purposes of the Alternative Model for Personality Disorders in Section III of the DSM-5, although it also identified some areas for refinement....

  9. Utilization of Model Predictive Control to Balance Power Absorption Against Load Accumulation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Abbas, Nikhar; Tom, Nathan

    2017-09-01

    Wave energy converter (WEC) control strategies have been primarily focused on maximizing power absorption. The use of model predictive control strategies allows for a finite-horizon, multiterm objective function to be solved. This work utilizes a multiterm objective function to maximize power absorption while minimizing the structural loads on the WEC system. Furthermore, a Kalman filter and autoregressive model were used to estimate and forecast the wave exciting force and predict the future dynamics of the WEC. The WEC's power-take-off time-averaged power and structural loads under a perfect forecast assumption in irregular waves were compared against results obtained from the Kalman filter and autoregressive model to evaluate model predictive control performance.
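The forecasting component described above pairs a Kalman filter with an autoregressive model of the wave exciting force. As a self-contained sketch of just the AR part, and not the authors' implementation, a least-squares AR(p) fit with recursive multi-step forecasting:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: x[t] ≈ a[0]*x[t-1] + ... + a[p-1]*x[t-p]."""
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])  # lags 1..p
    coeffs, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coeffs

def forecast(x, coeffs, steps):
    """Recursively roll the fitted model forward `steps` samples."""
    hist = list(x)
    for _ in range(steps):
        lags = hist[: -len(coeffs) - 1 : -1]  # [x[t-1], ..., x[t-p]]
        hist.append(float(np.dot(coeffs, lags)))
    return hist[len(x):]

# A sampled sinusoid obeys an exact AR(2) recurrence, so the fit is
# near-perfect; a measured wave-excitation record would be noisier.
t = np.arange(200)
x = np.sin(0.3 * t)
a = fit_ar(x, 2)
pred = forecast(x, a, 5)
truth = np.sin(0.3 * (200 + np.arange(5)))
print(np.max(np.abs(np.array(pred) - truth)) < 1e-4)  # → True
```

In the paper's setting, the Kalman filter first estimates the unmeasured excitation force from the WEC's motion, and a fit of this kind supplies the finite-horizon forecast that the model predictive controller optimizes over.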

  10. Utility of multi temporal satellite images for crop water requirements estimation and irrigation management in the Jordan Valley

    Science.gov (United States)

    Identifying the spatial and temporal distribution of crop water requirements is a key for successful management of water resources in the dry areas. Climatic data were obtained from three automated weather stations to estimate reference evapotranspiration (ETO) in the Jordan Valley according to the...

  11. Impact of Copayment Requirements on Therapy Utilization for Children with Developmental Disabilities in Israeli Jewish and Arab Bedouin Populations

    Science.gov (United States)

    Lubetzky, Hasia; Shvarts, Shifra; Galil, Aharon; Tesler, Hedva; Vardi, Gideon; Merrick, Joav

    2004-01-01

    Under Israeli law, national health insurance covers basic health care for all of the nation's residents, but health services users have to copay for medications and therapy. This study examined whether the requirement to copay for therapy services among certain subpopulations influences their compliance with meeting rehabilitation therapy…

  12. Clinical utility of current-generation dipole modelling of scalp EEG.

    Science.gov (United States)

    Plummer, C; Litewka, L; Farish, S; Harvey, A S; Cook, M J

    2007-11-01

    To investigate the clinical utility of current-generation dipole modelling of scalp EEG in focal epilepsies seen commonly in clinical practice. Scalp EEG recordings from 10 patients with focal epilepsy, five with Benign Focal Epilepsy of Childhood (BFEC) and five with Mesial Temporal Lobe Epilepsy (MTLE), were used for interictal spike dipole modelling using Scan 4.3 and CURRY 5.0. Optimum modelling parameters for EEG source localisation (ESL) were sought by the step-wise application of various volume conductor (forward) and dipole (inverse) models. Best-fit ESL solutions (highest explained forward-fit to measured data variance) were used to characterise best-fit forward and inverse models, regularisation effect, additional electrode effect, single-to-single spike and single-to-averaged spike variability, and intra- and inter-operator concordance. Inter-parameter relationships were examined. Computation times and interface problems were recorded. For both BFEC and MTLE, the best-fit forward model was the finite element method interpolated (FEMi) model, while the best-fit single dipole models were the rotating non-regularised and the moving regularised models. When combined, these forward-inverse models appeared to offer clinically meaningful ESL results when referenced to an averaged cortex overlay, best-fit dipoles localising to the central fissure region in BFEC and to the basolateral temporal region in MTLE. Single-to-single spike and single-to-averaged spike measures of concordance for dipole location and orientation were stronger for BFEC versus MTLE. The use of an additional pair of inferior temporal electrodes in MTLE directed best-fit dipoles towards the basomesial temporal region. Inverse correlations were noted between unexplained variance (RD) and dipole strength (Amp), RD and signal to noise ratio (SNR), and SNR and confidence ellipsoid (CE) volume. 
Intra- and inter-operator levels of agreement were relatively robust for dipole location and orientation.

  13. Research on Water Utility Revenue Model and Compensation Policy under Uncertain Demand

    Directory of Open Access Journals (Sweden)

    Shou-Kui He

    2014-03-01

Full Text Available With the diversification of both water utility investment and property-right structures, it is necessary to establish a scientific compensation mechanism for water conservancy benefits that balances the interests of investors, water users, and the pertinent sectors that suffer losses. This paper analyzes the compensation policies that the water management authority imposes on water supply enterprises under uncertain demand, establishes a compensation model incorporating risk preference, and uses numerical analysis to explain how risk preference shapes the decision-making behavior of water supply enterprises, thereby providing a basis for the water management department to formulate reasonable water resource charge standards and compensation policies. Finally, the paper discusses how to implement the water compensation policies according to the characteristics of rural water utilities.
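The paper's compensation model is not reproduced in the abstract, but the general recipe it describes — evaluate a supplier's payoff over uncertain demand and penalize payoff variance according to a risk-preference parameter — can be sketched as follows (all names and numbers below are hypothetical, not the paper's):

```python
import numpy as np

# Toy mean-variance sketch of a risk-preference model under uncertain demand.
rng = np.random.default_rng(0)
demand = rng.normal(loc=100.0, scale=15.0, size=10_000)  # uncertain demand (units)
price, unit_cost, subsidy = 2.0, 1.4, 0.2  # per-unit tariff, cost, compensation

profit = (price - unit_cost + subsidy) * demand          # per-scenario profit
risk_aversion = 0.01                                     # lambda: risk preference
utility = profit.mean() - risk_aversion * profit.var()   # mean-variance utility

print(round(profit.mean(), 1), round(utility, 1))
```

A larger `risk_aversion` makes the enterprise value the same expected profit less, which is how a compensation policy can be tuned against demand risk.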

  14. Energy Utilization Evaluation of Carbon Performance in Public Projects by FAHP and Cloud Model

    Directory of Open Access Journals (Sweden)

    Lin Li

    2016-07-01

Full Text Available With the low-carbon economy advocated all over the world, how to use energy reasonably and efficiently in public projects has become a major issue. This raises several open questions, including which method is more reasonable for evaluating the energy utilization of carbon performance in public projects when the evaluation information is fuzzy; whether an indicator system can be constructed; and which indicators have the greatest impact on carbon performance. This article aims to address these problems. We propose a new carbon performance evaluation system for energy utilization based on project phases (design, construction, and operation). The Fuzzy Analytic Hierarchy Process (FAHP) is used to aggregate the indicator weights, and the cloud model is incorporated when the indicator values are fuzzy. Finally, we apply our indicator system to a case study of the Xiangjiang River project in China, which demonstrates the applicability and efficiency of our method.
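One common FAHP recipe (not necessarily the exact variant used in the paper) defuzzifies triangular fuzzy pairwise judgments (l, m, u) by their centroid and then takes geometric-mean AHP weights. The judgment values below are hypothetical:

```python
import numpy as np

# Triangular fuzzy pairwise comparisons for 3 indicators
# (design, construction, operation phases); values are illustrative only.
fuzzy = np.array([
    [[1, 1, 1],       [1, 2, 3],     [2, 3, 4]],
    [[1/3, 1/2, 1],   [1, 1, 1],     [1, 2, 3]],
    [[1/4, 1/3, 1/2], [1/3, 1/2, 1], [1, 1, 1]],
])
crisp = fuzzy.mean(axis=2)              # centroid defuzzification: (l+m+u)/3
gm = np.prod(crisp, axis=1) ** (1 / 3)  # row geometric means
weights = gm / gm.sum()                 # normalized priority weights
print(weights.round(3))
```

The resulting vector sums to one and ranks the indicators by relative importance.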

  15. The episodic random utility model unifies time trade-off and discrete choice approaches in health state valuation

    Directory of Open Access Journals (Sweden)

    Craig Benjamin M

    2009-01-01

Full Text Available Abstract Background To present an episodic random utility model that unifies time trade-off and discrete choice approaches in health state valuation. Methods First, we introduce two alternative random utility models (RUMs) for health preferences: the episodic RUM and the more common instant RUM. For the interpretation of time trade-off (TTO) responses, we show that the episodic model implies a coefficient estimator, and the instant model implies a mean slope estimator. Second, we demonstrate these estimators and the differences between the estimates for 42 health states using TTO responses from the seminal Measurement and Valuation in Health (MVH) study conducted in the United Kingdom. Mean slopes are estimated with and without Dolan's transformation of worse-than-death (WTD) responses. Finally, we demonstrate an exploded probit estimator, an extension of the coefficient estimator for discrete choice data that accommodates both TTO and rank responses. Results By construction, mean slopes are less than or equal to coefficients, because slopes are fractions and, therefore, magnify downward errors in WTD responses. The Dolan transformation of WTD responses causes mean slopes to increase in similarity to coefficient estimates, yet they are not equivalent (i.e., absolute mean difference = 0.179). Unlike mean slopes, coefficient estimates demonstrate strong concordance with rank-based predictions (Lin's rho = 0.91). Combining TTO and rank responses under the exploded probit model improves the identification of health state values, decreasing the average width of confidence intervals from 0.057 to 0.041 compared to TTO-only results. Conclusion The episodic RUM expands upon the theoretical framework underlying health state valuation and contributes to health econometrics by motivating the selection of coefficient and exploded probit estimators for the analysis of TTO and rank responses. 
In future MVH surveys, sample size requirements may be reduced through

  16. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", which is precise, easy to use for predicting energy consumption, and takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable: differences are on average less than 5% and independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
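The static energy-balance idea behind models of this kind can be illustrated with a toy calculation: period heating demand is the envelope loss minus the useful solar gain. The coefficients below are assumed for illustration, not those of the published HORTICERN model:

```python
# Minimal static greenhouse energy-balance sketch (hypothetical coefficients).
u_value = 6.0        # W/m2K, single glazing (assumed)
area_cover = 1200.0  # m2 glazing area (assumed)
area_ground = 800.0  # m2 ground area (assumed)
t_in, t_out = 18.0, 5.0   # degC, inside setpoint / outside mean
solar = 80.0              # W/m2 mean solar irradiance on the ground area
tau, eta = 0.7, 0.6       # cover transmissivity, solar utilisation factor

loss = u_value * area_cover * (t_in - t_out)  # W, envelope heat loss
gain = tau * eta * solar * area_ground        # W, useful solar gain
heating_demand = max(loss - gain, 0.0)        # W, net heating requirement
print(heating_demand)
```

A dynamic model would resolve these terms hour by hour; the static approach applies them to period-averaged temperatures and irradiance.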

  17. Modeling and Optimizing Energy Utilization of Steel Production Process: A Hybrid Petri Net Approach

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2013-01-01

Full Text Available The steel industry is responsible for nearly 9% of anthropogenic energy utilization in the world. Under heavy pressure to reduce energy consumption and CO2 emissions, it is urgent to reduce the total energy utilization of the steel industry. Meanwhile, steel manufacturing is a typical continuous-discrete process coupling multiple procedures, objects, constraints, and machines, which makes energy management rather difficult. In order to study the energy flow within the real steel production process, this paper presents a new modeling and optimization method for the process based on Hybrid Petri Nets (HPN). Firstly, we give a detailed description of HPN. Then the real steel production process from one typical integrated steel plant is transformed into a Hybrid Petri Net model as a case study, from which we obtain a series of constraints for our optimization model. In consideration of the real process situation, we pick steel production, energy efficiency, and self-made gas surplus as the main optimization goals. Afterwards, a fuzzy linear programming method is used to obtain the multiobjective optimization results. Finally, some measures are suggested to improve this low-efficiency, high-cost process structure.
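The fuzzy multiobjective step can be sketched in miniature: give each objective a linear membership degree between its worst and best attainable values, then pick the feasible operating point that maximizes the smallest membership (the max-min rule). The objective shapes and numbers below are hypothetical, chosen only to illustrate the recipe, not taken from the plant model:

```python
import numpy as np

# Max-min fuzzy scalarization over a 1-D grid of candidate operating points.
x = np.linspace(0, 100, 1001)        # candidate steel output, t/h (hypothetical)
gas_surplus = 50.0 - 0.3 * x         # self-made gas surplus falls with output
energy_eff = 0.4 + 0.004 * x         # efficiency improves with utilisation
feasible = gas_surplus >= 0          # simple feasibility constraint

def membership(f):
    # linear membership: 0 at the worst value, 1 at the best
    return (f - f.min()) / (f.max() - f.min())

mu = np.minimum.reduce([membership(x), membership(gas_surplus),
                        membership(energy_eff)])
mu[~feasible] = -np.inf              # exclude infeasible points
best = x[np.argmax(mu)]              # max-min compromise solution
print(round(best, 1))
```

The compromise lands where the competing memberships cross, balancing output against gas surplus rather than maximizing either alone.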

  18. Forecasting Paratransit Utility by Using Multinomial Logit Model: A Case Study

    Directory of Open Access Journals (Sweden)

    Waikhom Victory

    2016-10-01

Full Text Available Paratransit plays an important role in the urban passenger transportation systems of developing countries. Three cities, viz. Imphal East, Imphal West and Silchar in India, were selected for the study. Household surveys and traffic surveys were employed to collect data on paratransit users. Modelling techniques and tools were then used to forecast the utility of paratransit in the region. For this purpose, a Multinomial Logit Model (MNL) was used. A total of seven variables were considered in the model estimation, of which three are quantitative, i.e. trip length (km), travel cost (rupees) and travel time (minutes), and four are qualitative, i.e. reliability, comfort, road condition and convenience.
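In an MNL, each mode's utility is linear in the trip attributes and choice probabilities take the softmax (logit) form. A minimal sketch, with hypothetical coefficients and attributes rather than the study's estimates:

```python
import numpy as np

# Hypothetical MNL coefficients (negative: longer/costlier trips lower utility).
beta = {"trip_km": -0.08, "cost_rs": -0.05, "time_min": -0.03, "asc": 0.5}

def utility(km, cost, minutes, asc=0.0):
    return (asc + beta["trip_km"] * km + beta["cost_rs"] * cost
            + beta["time_min"] * minutes)

# Two alternatives for one traveller: paratransit vs bus (attributes assumed).
v = np.array([
    utility(5.0, 20.0, 15.0, asc=beta["asc"]),  # paratransit: dearer but faster
    utility(5.0, 10.0, 30.0),                   # bus: cheaper but slower
])
p = np.exp(v) / np.exp(v).sum()  # MNL choice probabilities
print(p.round(3))
```

Estimation would fit the betas to the survey choices by maximum likelihood; the probabilities then forecast mode shares under changed attributes.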

  19. On the use of prior information in modelling metabolic utilization of energy in growing pigs

    DEFF Research Database (Denmark)

    Strathe, Anders Bjerring; Jørgensen, Henry; Fernández, José Adalberto

    2011-01-01

Construction of models that provide a realistic representation of metabolic utilization of energy in growing animals tends to be over-parameterized because data generated from individual metabolic studies are often sparse. In the Bayesian framework, prior information can enter the data analysis...... through formal statements of probability, because model parameters are random variables and hence are assigned probability distributions (Gelman et al. 2004). The objective of the study was to introduce prior information in modelling metabolizable energy (ME) intake, protein (PD) and lipid deposition (LD......) curves, resulting from a metabolism study on growing pigs of high genetic potential. A total of 17 crossbred pigs of three genders (barrows, boars and gilts) were used. Pigs were fed four diets based on barley, wheat and soybean meal supplemented with crystalline amino acids to meet Danish nutrient...

  20. Utilizing Soize's Approach to Identify Parameter and Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Bonney, Matthew S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Univ. of Wisconsin, Madison, WI (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-10-01

Quantifying uncertainty in model parameters is a challenging task for analysts. Soize has derived a method that is able to characterize model and parameter uncertainty independently. This method, explained here under the assumption that some experimental data are available, is divided into seven steps. Monte Carlo analyses are performed to select the dispersion parameter that best matches the experimental data. In addition to the nominal approach, an alternative distribution can be used, with corrections, to expand the scope of the method. This is one of very few methods that can quantify uncertainty in the model form independently of the input parameters. Two examples are provided to illustrate the methodology, and example code is provided in the Appendix.
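The dispersion-selection step can be shown schematically: sample the random model's output for several dispersion values and keep the one whose Monte Carlo statistics best match the measured data. The Gaussian surrogate below stands in for Soize's random-matrix ensemble, and the data are synthetic:

```python
import numpy as np

# Schematic dispersion-parameter selection via Monte Carlo (toy surrogate).
rng = np.random.default_rng(1)
experiment = rng.normal(100.0, 8.0, size=200)  # synthetic "measured" responses

def mc_spread(delta, n=5000):
    # surrogate random model: output spread grows with dispersion delta
    return np.std(rng.normal(100.0, 100.0 * delta, size=n))

deltas = np.linspace(0.02, 0.2, 10)
errors = [abs(mc_spread(d) - experiment.std()) for d in deltas]
best_delta = deltas[int(np.argmin(errors))]
print(round(best_delta, 2))
```

In the actual method the "spread" is a discrepancy between random-matrix model responses and measured frequency-response data, but the selection loop has this shape.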

  1. From Ambiguity Aversion to a Generalized Expected Utility. Modeling Preferences in a Quantum Probabilistic Framework

    CERN Document Server

    Aerts, Diederik

    2015-01-01

    Ambiguity and ambiguity aversion have been widely studied in decision theory and economics both at a theoretical and an experimental level. After Ellsberg's seminal studies challenging subjective expected utility theory (SEUT), several (mainly normative) approaches have been put forward to reproduce ambiguity aversion and Ellsberg-type preferences. However, Machina and other authors have pointed out some fundamental difficulties of these generalizations of SEUT to cope with some variants of Ellsberg's thought experiments, which has recently been experimentally confirmed. Starting from our quantum modeling approach to human cognition, we develop here a general probabilistic framework to model human decisions under uncertainty. We show that our quantum theoretical model faithfully represents different sets of data collected on both the Ellsberg and the Machina paradox situations, and is flexible enough to describe different subjective attitudes with respect to ambiguity. Our approach opens the way toward a quan...

  2. Assessing the effect of the VHA PCMH model on utilization patterns among veterans with PTSD.

    Science.gov (United States)

    Randall, Ian; Maynard, Charles; Chan, Gary; Devine, Beth; Johnson, Chris

    2017-05-01

    The Veterans Health Administration (VHA) implemented a patient-centered medical home (PCMH)-based Patient Aligned Care Teams (PACT) model in 2010. We examined its effects on the utilization of health services among US veterans with posttraumatic stress disorder (PTSD). We analyzed VHA clinical and administrative data to conduct an interrupted time series study. Encounter-level data were obtained for the period of April 1, 2005, through March 31, 2014. We identified 642,660 veterans with PTSD who were assigned to either a high- or low-PCMH implementation group using a validated VHA PCMH measurement instrument. We measured the effect of high-PCMH implementation on the count of hospitalizations and primary care, specialty care, specialty mental health, emergency department (ED), and urgent care encounters compared with low-PCMH implementation. We fit a multilevel, mixed-effects, negative binomial regression model and estimated average marginal effects and incidence rate ratios. Compared with patients in low-PCMH implementation clinics, patients who received care in high-PCMH implementation clinics experienced a decrease in hospitalizations (incremental effect [IE], -0.036; 95% confidence interval [CI], -0.0371 to -0.0342), a decrease in specialty mental health encounters (IE, -0.009; 95% CI, -0.009 to -0.008), a decrease in urgent care encounters (IE, -0.210; 95% CI, -0.212 to -0.207), and a decrease in ED encounters (IE, -0.056; 95% CI, -0.057 to -0.054). High PCMH implementation positively affected utilization patterns by reducing downstream use of high-cost inpatient and specialty services. Future research should investigate whether a reduction in utilization of health services indeed results in higher levels of virtual and non-face-to-face access, or if the PACT model has reduced necessary access to care.
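For readers interpreting the reported effect sizes: in a negative binomial (count) model, a coefficient on the high-PCMH indicator maps to an incidence rate ratio via IRR = exp(beta). The coefficient below is hypothetical, used only to show the conversion:

```python
import math

# Convert an assumed log-rate coefficient into an incidence rate ratio (IRR).
beta_high_pcmh = -0.06            # hypothetical effect of high implementation
irr = math.exp(beta_high_pcmh)    # rate ratio vs. low-implementation clinics
pct_change = (irr - 1.0) * 100.0  # percent change in expected encounter count
print(round(irr, 3), round(pct_change, 1))
```

An IRR below 1 corresponds to the reductions in hospitalizations and ED encounters the study reports.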

  3. Pathologists' roles in clinical utilization management. A financing model for managed care.

    Science.gov (United States)

    Zhao, J J; Liberman, A

    2000-03-01

In ancillary or laboratory utilization management, the roles of pathologists have not been fully explored in managed care systems. Two possible reasons may account for this: pathologists' potential contributions have not been clearly defined, and effective measurement of, and reasonable compensation for, the pathologist's contribution remains vague. The responsibilities of pathologists in clinical practice may include clinical pathology and laboratory services (long well defined and compensated through a coding system based on the resource-based relative value scale), laboratory administration, clinical utilization management, and clinical research. Although laboratory administration services have been compensated through mechanisms such as a percentage of total service revenue or a fixed salary, the involvement of pathologists seems smaller today than in the past, owing to increased clinical workload and time constraints in an expanding managed care environment, especially in community hospital settings. The lack of financial incentives or appropriate compensation mechanisms for these services likely accounts for the current situation. Furthermore, the importance of pathologist-driven utilization management in laboratory services lacks recognition among hospital administrators, managed care executives, and pathologists themselves, despite its potential benefits for reducing cost and enhancing quality of care. We propose a financial compensation model for such services and summarize its advantages.

  4. Analysis of biomarker utility using a PBPK/PD model for carbaryl

    Directory of Open Access Journals (Sweden)

    Martin Blake Phillips

    2014-11-01

Full Text Available There are many types of biomarkers; the two common ones are biomarkers of exposure and biomarkers of effect. The utility of a biomarker for estimating exposures or predicting risks depends on the strength of the correlation between biomarker concentrations and exposure/effects. In the current study, a combined exposure and physiologically-based pharmacokinetic/pharmacodynamic (PBPK/PD) model of carbaryl was used to demonstrate the use of computational modeling for providing insight into the selection of biomarkers for different purposes. The Cumulative and Aggregate Risk Evaluation System (CARES) was used to generate exposure profiles, including magnitude and timing, for use as inputs to the PBPK/PD model. The PBPK/PD model was then used to predict blood concentrations of carbaryl and urine concentrations of its principal metabolite, 1-naphthol (1-N), as biomarkers of exposure. The PBPK/PD model also predicted acetylcholinesterase (AChE) inhibition in red blood cells (RBC) as a biomarker of effect. The correlations of these simulated biomarker concentrations with intake doses or brain AChE inhibition (as a surrogate of effects) were analyzed using a linear regression model. Results showed that 1-N in urine is a better biomarker of exposure than carbaryl in blood, and that 1-N in urine is correlated with the dose averaged over the last two days of the simulation. They also showed that RBC AChE inhibition is an appropriate biomarker of effect. This computational approach can be applied to a wide variety of chemicals to facilitate quantitative analysis of biomarker utility.
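The utility analysis reduces to a simple criterion: regress simulated biomarker levels against intake dose and rank biomarkers by the strength of correlation. A sketch with a synthetic, hypothetical dose-biomarker relationship (not the PBPK/PD model's outputs):

```python
import numpy as np

# Rank two synthetic biomarkers by how tightly they track intake dose.
rng = np.random.default_rng(42)
dose = rng.uniform(0.1, 2.0, size=100)  # hypothetical intakes, mg/kg/day
urinary_1n = 3.0 * dose + rng.normal(0, 0.3, size=100)      # tight biomarker
blood_carbaryl = 0.5 * dose + rng.normal(0, 0.6, size=100)  # noisy biomarker

def r_squared(x, y):
    # squared Pearson correlation as the "strength of correlation" criterion
    return np.corrcoef(x, y)[0, 1] ** 2

print(round(r_squared(dose, urinary_1n), 2),
      round(r_squared(dose, blood_carbaryl), 2))
```

The biomarker with the higher R² against dose is the better biomarker of exposure, mirroring the study's 1-N-versus-blood-carbaryl comparison.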

  5. Icodextrin enhances survival in an intraperitoneal ovarian cancer murine model utilizing gene therapy.

    Science.gov (United States)

    Rocconi, Rodney P; Numnum, Michael T; Zhu, Zeng B; Lu, Baogen; Wang, Minghui; Rivera, Angel A; Stoff-Khalili, Mariam; Alvarez, Ronald D; Curiel, David T; Makhija, Sharmila

    2006-12-01

Icodextrin, a novel glucose polymer solution utilized for peritoneal dialysis, has been demonstrated to prolong the retention of intraperitoneal (IP) instillation volumes in comparison to standard PBS solutions. In an animal model of ovarian cancer, we explored whether a survival advantage exists utilizing icodextrin rather than PBS as a delivery solution for an infectivity-enhanced virotherapy approach. Initial experiments evaluated whether icodextrin would adversely affect replication of a clinical-grade infectivity-enhanced conditionally replicative adenovirus (Delta24-RGD). Virus was added to prepared blinded solutions of PBS or icodextrin (20%) and then evaluated in vitro in various human ovarian cancer cell lines (SKOV3.ip1, PA-1, and Hey) and in vivo in a SKOV3.ip1 human ovarian cancer IP murine model. Viral replication was measured by detecting adenovirus E4 gene levels utilizing QRT-PCR. Survival was subsequently evaluated in a separate SKOV3.ip1 ovarian cancer IP murine model. Cohorts of mice were treated in blinded fashion with PBS alone, icodextrin alone, PBS+Delta24-RGD, or icodextrin+Delta24-RGD. Survival data were plotted on Kaplan-Meier curves and statistical calculations were performed using the log-rank test. There was no adverse effect of icodextrin on vector replication in the ovarian cancer cell lines or the murine model tumor samples evaluated. Median survival in the IP-treated animal cohorts was 23 days for the PBS group, 40 days for the icodextrin group, 65 days for the PBS+Delta24-RGD group, and 105 days for icodextrin+Delta24-RGD (p=0.023). Of note, 5 of the 10 mice in the icodextrin+Delta24-RGD group were alive at the end of the study period, all without evidence of tumor (120 days). These experiments suggest that the use of dialysates such as icodextrin may further enhance the therapeutic effects of novel IP virotherapy and other gene therapy strategies for ovarian cancer. Phase I studies utilizing icodextrin-based virotherapy for ovarian cancer are

  6. On the Path to SunShot. Utility Regulatory and Business Model Reforms for Addressing the Financial Impacts of Distributed Solar on Utilities

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Miller, John [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sigrin, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States); Reiter, Emerson [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cory, Karlynn [National Renewable Energy Lab. (NREL), Golden, CO (United States); McLaren, Joyce [National Renewable Energy Lab. (NREL), Golden, CO (United States); Seel, Joachim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Darghouth, Naim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    Net-energy metering (NEM) has helped drive the rapid growth of distributed PV (DPV) but has raised concerns about electricity cost shifts, utility financial losses, and inefficient resource allocation. These concerns have motivated real and proposed reforms to utility regulatory and business models. This report explores the challenges and opportunities associated with such reforms in the context of the U.S. Department of Energy's SunShot Initiative. Most of the reforms to date address NEM concerns by reducing the benefits provided to DPV customers and thus constraining DPV deployment. Eliminating NEM nationwide, by compensating exports of PV electricity at wholesale rather than retail rates, could cut cumulative DPV deployment by 20% in 2050 compared with a continuation of current policies. This would slow the PV cost reductions that arise from larger scale and market certainty. It could also thwart achievement of the SunShot deployment goals even if the initiative's cost targets are achieved. This undesirable prospect is stimulating the development of alternative reform strategies that address concerns about distributed PV compensation without inordinately harming PV economics and growth. These alternatives fall into the categories of facilitating higher-value DPV deployment, broadening customer access to solar, and aligning utility profits and earnings with DPV. Specific strategies include utility ownership and financing of DPV, community solar, distribution network operators, services-driven utilities, performance-based incentives, enhanced utility system planning, pricing structures that incentivize high-value DPV configurations, and decoupling and other ratemaking reforms that reduce regulatory lag. These approaches represent near- and long-term solutions for preserving the legacy of the SunShot Initiative.

  7. Tfp1 is required for ion homeostasis, fluconazole resistance and N-Acetylglucosamine utilization in Candida albicans.

    Science.gov (United States)

    Jia, Chang; Zhang, Kai; Yu, Qilin; Zhang, Bing; Xiao, Chenpeng; Dong, Yijie; Chen, Yulu; Zhang, Biao; Xing, Laijun; Li, Mingchun

    2015-10-01

The vacuolar-type H+-ATPase (V-ATPase) is crucial for the maintenance of ion homeostasis. Dysregulation of ion homeostasis affects various aspects of cellular processes. However, the importance of V-ATPase in Candida albicans is not fully clear. In this study, we demonstrated the essential roles of V-ATPase through Tfp1, a putative V-ATPase subunit. Deletion of TFP1 led to generation of an iron starvation signal and reduced total iron content, which was associated with mislocalization of Fet34p, ultimately due to disorders in copper homeostasis. Furthermore, the tfp1∆/∆ mutant exhibited weaker growth and lower aconitase activity on nonfermentable carbon sources, and iron or copper addition partially rescued the growth defect. In addition, the tfp1∆/∆ mutant also showed elevated cytosolic calcium levels in normal or low-calcium medium that reflected calcium release from the vacuole. Kinetics of the cytosolic calcium response to an alkaline pulse and VCX1 overexpression assays (VCX1 encodes a putative vacuolar Ca2+/H+ exchanger) indicated that the cytosolic calcium status was related to Vcx1 activity. Spot assays and concentration-kill curves demonstrated that the tfp1∆/∆ mutant was hypersensitive to fluconazole, which was attributed to reduced ergosterol biosynthesis, reduced CDR1 efflux pump activity, and iron/calcium dysregulation. Interestingly, carbon source utilization tests found the tfp1∆/∆ mutant was defective for growth on N-Acetylglucosamine (GlcNAc) plates, which was associated with ATP depletion due to the decreased ability to catabolize GlcNAc. Taken together, our study gives new insights into the functions of Tfp1 and highlights the potential of V-ATPase as an antifungal target.

  8. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

Full Text Available The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs with an initial body weight (BW) of 14.77 ± 1.26 kg at approximately two months of age were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for estimating the empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22 and 2.60 Mcal/kg DM) and six replicates. The requirement of metabolizable energy for maintenance was 78.53 kcal/kg EBW^0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. We conclude that the values of km and kf are consistent with those observed in several studies with lambs raised in the tropics. The dietary requirements of TDN and MP of Santa Ines lambs for different BW and ADG are approximately 42% and 24% lower, respectively, than those suggested by the American system for the evaluation of food and nutrient requirements of small ruminants. The SRNS model was sensitive in predicting DMI in Santa Ines lambs; however, for ADG, more studies are needed, since the model underestimated the response of the animals in this study.

  9. Development of an innovative spacer grid model utilizing computational fluid dynamics within a subchannel analysis tool

    Science.gov (United States)

    Avramova, Maria

In the past few decades the need for improved nuclear reactor safety analyses has led to a rapid development of advanced methods for multidimensional thermal-hydraulic analyses. These methods have become progressively more complex in order to account for the many physical phenomena anticipated during steady-state and transient Light Water Reactor (LWR) conditions. The advanced thermal-hydraulic subchannel code COBRA-TF (Thurgood, M. J. et al., 1983) is used worldwide for best-estimate evaluations of nuclear reactor safety margins. In the framework of a joint research project between the Pennsylvania State University (PSU) and AREVA NP GmbH, the theoretical models and numerics of COBRA-TF have been improved. Under the name F-COBRA-TF, the code has been subjected to an extensive verification and validation program and has been applied to a variety of LWR steady-state and transient simulations. To enable F-COBRA-TF for industrial applications, including safety-margin evaluations and design analyses, the code's spacer grid models were revised and substantially improved. The state of the art in modeling the effects of spacer grids on thermal-hydraulic performance in rod bundles employs numerical experiments performed with computational fluid dynamics (CFD) calculations. Because of the computational cost involved, CFD codes cannot yet be used for full-bundle predictions, but their capabilities can be utilized to develop more advanced and sophisticated models for subchannel-level analyses. A subchannel code equipped with improved physical models can then be a powerful tool for LWR safety and design evaluations. The unique contributions of this PhD research are the development, implementation, and qualification of an innovative spacer grid model that utilizes CFD results within the framework of a subchannel analysis code. Usually, spacer grid models are mostly related to modeling of the entrainment and deposition phenomena and the heat

  10. A Comparative Study of Systems of Utility Model Patent Search Report in Mainland China and Taiwan Region

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

I. An overview In the patent systems of civil law countries, the utility model patent, also known as the "petty invention", is granted to protect minor inventions that are not highly inventive but are very useful. Provisions concerning utility model patents can be found, for example, in the patent systems of Germany and Japan. After the patent system was launched in mainland China on 1 April 1985, the utility model patent was well received by industry thanks to the adoption of the "preliminary examinat...

  11. Transaction-based building controls framework, Volume 2: Platform descriptive model and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Carpenter, Brandon J. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Lutes, Robert G. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Hernandez, George [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2015-07-31

    Transaction-based Building Controls (TBC) offer a control systems platform that provides an agent execution environment that meets the growing requirements for security, resource utilization, and reliability. This report outlines the requirements for a platform to meet these needs and describes an illustrative/exemplary implementation.

  12. INVESTIGATION OF QUANTIFICATION OF FLOOD CONTROL AND WATER UTILIZATION EFFECT OF RAINFALL INFILTRATION FACILITY BY USING WATER BALANCE ANALYSIS MODEL

    OpenAIRE

    文, 勇起; BUN, Yuki

    2013-01-01

In recent years, much flood damage and drought attributable to urbanization have occurred. At present, infiltration facilities are suggested as a solution to these problems. Against this background, the purpose of this study is to quantify the flood control and water utilization effects of rainfall infiltration facilities by using a water balance analysis model. Key Words: flood control, water utilization, rainfall infiltration facility

  13. Modeling and design of light powered biomimicry micropump utilizing transporter proteins

    Science.gov (United States)

    Liu, Jin; Sze, Tsun-Kay Jackie; Dutta, Prashanta

    2014-11-01

The creation of compact micropumps that provide steady flow has been an ongoing challenge in the field of microfluidics. We present a mathematical model for a micropump utilizing Bacteriorhodopsin and sugar transporter proteins. This micropump utilizes transporter proteins as a method to drive fluid flow by converting light energy into chemical potential. The fluid flow through a microchannel is simulated using the Nernst-Planck, Navier-Stokes, and continuity equations. Numerical results show that the micropump is capable of generating usable pressure. Design parameters influencing the performance of the micropump are investigated, including membrane fraction, lipid proton permeability, illumination, and channel height. The results show that there is a substantial membrane-fraction region at which fluid flow is maximized. The use of lipids with low membrane proton permeability allows illumination to be used as a method to turn the pump on and off. This capability allows the micropump to be activated and shut off remotely without bulky support equipment. This modeling work provides new insights on mechanisms potentially useful for fluidic pumping in self-sustained biomimetic microfluidic pumps. This work is supported in part by the National Science Foundation Grant CBET-1250107.

  14. Utility-Scale Lithium-Ion Storage Cost Projections for Use in Capacity Expansion Models

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley J.; Marcy, Cara; Krishnan, Venkat K.; Margolis, Robert

    2016-11-21

    This work presents U.S. utility-scale battery storage cost projections for use in capacity expansion models. We create battery cost projections based on a survey of literature cost projections of battery packs and balance of system costs, with a focus on lithium-ion batteries. Low, mid, and high cost trajectories are created for the overnight capital costs and the operating and maintenance costs. We then demonstrate the impact of these cost projections in the Regional Energy Deployment System (ReEDS) capacity expansion model. We find that under reference scenario conditions, lower battery costs can lead to increased penetration of variable renewable energy, with solar photovoltaics (PV) seeing the largest increase. We also find that additional storage can reduce renewable energy curtailment, although that comes at the expense of additional storage losses.
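The low/mid/high trajectory construction described above can be sketched as piecewise-linear interpolation between survey anchor years; the anchor years and $/kWh values below are hypothetical placeholders for illustration, not figures from the report.

```python
import numpy as np

# Hypothetical anchor-year overnight capital costs ($/kWh) for low, mid,
# and high trajectories; illustrative placeholders, not report values.
years = np.array([2016, 2030, 2050])
low = np.array([350.0, 100.0, 70.0])
mid = np.array([350.0, 150.0, 110.0])
high = np.array([350.0, 220.0, 180.0])

def trajectory(anchor_years, anchor_costs, query_years):
    """Piecewise-linear interpolation of costs between anchor years."""
    return np.interp(query_years, anchor_years, anchor_costs)

query = np.arange(2016, 2051)
low_t = trajectory(years, low, query)
mid_t = trajectory(years, mid, query)
high_t = trajectory(years, high, query)
```

A capacity expansion model such as ReEDS would then read one of these annual cost series per scenario.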

  15. Utilization of building information modeling in infrastructure’s design and construction

    Science.gov (United States)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction, and maintenance of buildings in the Czech Republic in recent years. This paper describes the usage, applications, and potential benefits and disadvantages connected with implementing BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, and there is a review of several virtual design and construction practices in the country. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experience with new technologies gained from the application of BIM-related workflows. The focus is on utilization of the BIM model for machine control systems on site, quality assurance, quality management, and construction management.

  16. Vitamin D Signaling in the Bovine Immune System: A Model for Understanding Human Vitamin D Requirements

    Directory of Open Access Journals (Sweden)

    Corwin D. Nelson

    2012-03-01

    The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates intracrine and paracrine vitamin D signaling mechanisms in the immune system that regulate innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D-mediated suppression of infection. Epidemiological evidence indicates that circulating 25-hydroxyvitamin D concentrations above 32 ng/mL are necessary for optimal vitamin D signaling in the immune system, but experimental evidence for that value is lacking. Experiments in cattle can provide that evidence, as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.
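The 32 ng/mL threshold mentioned above is often reported in nmol/L elsewhere in the clinical literature; a minimal conversion sketch, using the standard molar mass of 25-hydroxyvitamin D (about 400.64 g/mol):

```python
# 25-hydroxyvitamin D unit conversion: 1 ng/mL = 1 ug/L, so dividing by
# the molar mass (g/mol) and rescaling gives nmol/L = ng/mL * 1000 / M.
MOLAR_MASS_25OHD = 400.64  # g/mol

def ng_per_ml_to_nmol_per_l(ng_per_ml):
    """Convert a 25-hydroxyvitamin D concentration from ng/mL to nmol/L."""
    return ng_per_ml * 1000.0 / MOLAR_MASS_25OHD

threshold_nmol = ng_per_ml_to_nmol_per_l(32.0)  # roughly 80 nmol/L
```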

  17. Vitamin D signaling in the bovine immune system: a model for understanding human vitamin D requirements.

    Science.gov (United States)

    Nelson, Corwin D; Reinhardt, Timothy A; Lippolis, John D; Sacco, Randy E; Nonnecke, Brian J

    2012-03-01

    The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates intracrine and paracrine vitamin D signaling mechanisms in the immune system that regulate innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D-mediated suppression of infection. Epidemiological evidence indicates that circulating 25-hydroxyvitamin D concentrations above 32 ng/mL are necessary for optimal vitamin D signaling in the immune system, but experimental evidence for that value is lacking. Experiments in cattle can provide that evidence, as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.

  18. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed loop analytic model, incorporating a model for the human pilot, (namely, the optimal control model) that would allow certain simulation design tradeoffs to be evaluated quantitatively was developed. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  19. FadD Is Required for Utilization of Endogenous Fatty Acids Released from Membrane Lipids

    Science.gov (United States)

    Pech-Canul, Ángel; Nogales, Joaquina; Miranda-Molina, Alfonso; Álvarez, Laura; Geiger, Otto; Soto, María José; López-Lara, Isabel M.

    2011-01-01

    FadD is an acyl coenzyme A (CoA) synthetase responsible for the activation of exogenous long-chain fatty acids (LCFA) into acyl-CoAs. Mutation of fadD in the symbiotic nitrogen-fixing bacterium Sinorhizobium meliloti promotes swarming motility and leads to defects in nodulation of alfalfa plants. In this study, we found that S. meliloti fadD mutants accumulated a mixture of free fatty acids during the stationary phase of growth. The composition of the free fatty acid pool and the results obtained after specific labeling of esterified fatty acids with a Δ5-desaturase (Δ5-Des) were in agreement with membrane phospholipids being the origin of the released fatty acids. Escherichia coli fadD mutants also accumulated free fatty acids released from membrane lipids in the stationary phase. This phenomenon did not occur in a mutant of E. coli with a deficient FadL fatty acid transporter, suggesting that the accumulation of fatty acids in fadD mutants occurs inside the cell. Our results indicate that, besides the activation of exogenous LCFA, in bacteria FadD plays a major role in the activation of endogenous fatty acids released from membrane lipids. Furthermore, expression analysis performed with S. meliloti revealed that a functional FadD is required for the upregulation of genes involved in fatty acid degradation and suggested that in the wild-type strain, the fatty acids released from membrane lipids are degraded by β-oxidation in the stationary phase of growth. PMID:21926226

  20. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health

    Directory of Open Access Journals (Sweden)

    Ward Paul R

    2011-08-01

    Background: In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory and findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Methods: Data were collected using a national random postal survey of 1044 respondents in September 2009. Multivariate logistic regression analysis was conducted. Results: Statistical analysis revealed that people on lower incomes (less than $45,000) experience worse social quality across all four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion), and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion), and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although they also experienced more discrimination (lower social inclusion). Conclusions: Applying social quality theory allows
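The multivariate logistic regression step can be illustrated with a minimal sketch on synthetic data; only the sample size (1044) is taken from the abstract, while the predictors, coefficients, and data-generating process below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1044  # matches the survey's sample size; the data here are synthetic

# Hypothetical binary predictors and outcome: low income is assumed to
# lower the odds of reporting high socio-economic security, echoing the
# direction of the paper's findings; the coefficients are invented.
low_income = rng.integers(0, 2, n)
over_65 = rng.integers(0, 2, n)
logit_true = 0.5 - 1.2 * low_income - 0.6 * over_65
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

# Plain gradient ascent on the logistic log-likelihood (no dependencies).
X = np.column_stack([np.ones(n), low_income, over_65])
beta = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n  # mean log-likelihood gradient step

odds_ratios = np.exp(beta[1:])  # OR < 1 means reduced odds of the outcome
```

An odds ratio below 1 for the low-income indicator reproduces the "worse social quality on lower incomes" pattern in this synthetic setting.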

  1. Effects of atmospheric variability on energy utilization and conservation. [Space heating energy demand modeling; Program HEATLOAD

    Energy Technology Data Exchange (ETDEWEB)

    Reiter, E.R.; Johnson, G.R.; Somervell, W.L. Jr.; Sparling, E.W.; Dreiseitly, E.; Macdonald, B.C.; McGuirk, J.P.; Starr, A.M.

    1976-11-01

    Research conducted between 1 July 1975 and 31 October 1976 is reported. A "physical-adaptive" model of the space-conditioning demand for energy and its response to changes in weather regimes was developed. This model includes parameters pertaining to engineering factors of building construction, to weather-related factors, and to socio-economic factors. Preliminary testing of several components of the model on the city of Greeley, Colorado, yielded most encouraging results. Other components, especially those pertaining to socio-economic factors, are still under development. Expansion of model applications to different types of structures and larger regions is presently underway. A CRT-display model for energy demand within the conterminous United States also has passed preliminary tests. A major effort was expended to obtain disaggregated data on energy use from utility companies throughout the United States. The study of atmospheric variability revealed that the 22- to 26-day vacillation in the potential and kinetic energy modes of the Northern Hemisphere is related to the behavior of the planetary long-waves, and that the midwinter dip in zonal available potential energy is reflected in the development of blocking highs. Attempts to classify weather patterns over the eastern and central United States have proceeded satisfactorily to the point where testing of our method for longer time periods appears desirable.

  2. Brain in flames – animal models of psychosis: utility and limitations

    Directory of Open Access Journals (Sweden)

    Mattei D

    2015-05-01

    Daniele Mattei, Regina Schweibold, Susanne A Wolf (Department of Cellular Neuroscience, Max-Delbrueck-Center for Molecular Medicine, Berlin, Germany; Department of Neurosurgery, Helios Clinics, Berlin, Germany). Abstract: The neurodevelopmental hypothesis of schizophrenia posits that schizophrenia is a psychopathological condition resulting from aberrations in neurodevelopmental processes caused by a combination of environmental and genetic factors which proceed long before the onset of clinical symptoms. Many studies discuss an immunological component in the onset and progression of schizophrenia. Here we review studies utilizing animal models of schizophrenia with manipulations of genetic, pharmacologic, and immunological origin. We focus on the immunological component to bridge the studies in terms of evaluation and treatment options for negative, positive, and cognitive symptoms. Throughout the review we link certain aspects of each model to the situation in human schizophrenic patients. In conclusion, we suggest a combination of existing models to better represent the human situation. Moreover, we emphasize that animal models represent defined single or multiple symptoms or hallmarks of a given disease. Keywords: inflammation, schizophrenia, microglia, animal models

  3. On the utility of land surface models for agricultural drought monitoring

    Directory of Open Access Journals (Sweden)

    W. T. Crow

    2012-09-01

    The lagged rank cross-correlation between model-derived root-zone soil moisture estimates and remotely sensed vegetation indices (VI) is examined between January 2000 and December 2010 to quantify the skill of various soil moisture models for agricultural drought monitoring. Examined modeling strategies range from a simple antecedent precipitation index to the application of modern land surface models (LSMs) based on complex water and energy balance formulations. A quasi-global evaluation of lagged VI/soil moisture cross-correlation suggests that, when globally averaged across the entire annual cycle, soil moisture estimates obtained from complex LSMs provide little added skill (< 5% in relative terms) in anticipating variations in vegetation condition relative to a simplified water accounting procedure based solely on observed precipitation. However, larger amounts of added skill (5-15% in relative terms) can be identified when focusing exclusively on the extra-tropical growing season and/or utilizing soil moisture values acquired by averaging across a multi-model ensemble.
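The lagged rank cross-correlation diagnostic can be sketched as follows; the synthetic series and the assumed 2-step vegetation response delay are illustrative, not data from the study.

```python
import numpy as np

def rankdata(x):
    """Simple ranking for continuous (tie-free) data."""
    ranks = np.empty(len(x))
    ranks[np.argsort(x)] = np.arange(1, len(x) + 1)
    return ranks

def lagged_rank_xcorr(soil_moisture, vi, max_lag):
    """Spearman rank correlation with the vegetation index lagged
    0..max_lag steps behind soil moisture (a positive lag means soil
    moisture anomalies lead the vegetation response)."""
    out = {}
    for lag in range(max_lag + 1):
        a = soil_moisture[:len(soil_moisture) - lag] if lag > 0 else soil_moisture
        b = vi[lag:]
        out[lag] = float(np.corrcoef(rankdata(a), rankdata(b))[0, 1])
    return out

# Synthetic demo: the VI responds to soil moisture with a 2-step delay.
rng = np.random.default_rng(1)
sm = rng.normal(size=500)
vi = np.concatenate([np.zeros(2), sm[:-2]]) + 0.1 * rng.normal(size=500)
corr = lagged_rank_xcorr(sm, vi, max_lag=4)
```

The lag at which the correlation peaks indicates how far soil moisture leads vegetation condition, which is the skill measure the abstract describes.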

  4. Utilizing Land:Water Isopleths for Storm Surge Model Development in Coastal Louisiana

    Science.gov (United States)

    Siverd, C. G.; Hagen, S. C.; Bilskie, M. V.; Braud, D.; Peele, H.; Twilley, R.

    2016-12-01

    In the Mississippi River Delta (MRD) Land:Water (L:W) isopleths (Gagliano et al., 1970, 1971) can be used to better understand coastal flood risk from hurricanes than simple estimates of land loss (Twilley et al., 2016). The major goal of this study is to develop a methodology that utilizes L:W isopleths to simplify a detailed present day storm surge model of coastal Louisiana. A secondary goal is to represent marsh fragmentation via L:W isopleths for modeling (for example) storm surge. Isopleths of L:W were derived for the year 2010 and include 1%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, 99% (1% being mostly water and 99% being mostly land). Thirty-six models were developed via permutations of two isopleths selected with no repetition between 1% and 99%. The selected two isopleths result in three polygons which represent "open water/transition", "marsh", and "land". The ADvanced CIRCulation (ADCIRC) code (Luettich and Westerink, 2006) was used to perform storm surge simulations. Hydrologic basins, specifically Hydrologic Unit Code 12 (HUC12s), were used to quantify the water surface elevation, depth, volume, area and retention time across south Louisiana for each storm simulation and to provide a basin by basin comparison for the detailed model vs. simplified model results. This methodology aids in identifying the simplified model that most closely resembles the detailed model. It can also be used to develop comparable storm surge models for historical eras prior to the advent of modern remote sensing technology for the purpose of storm surge analysis throughout time.
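The count of thirty-six models is consistent with choosing two of the nine interior isopleths (strictly between the 1% and 99% bounds) without repetition; a sketch of that enumeration and of the resulting three-polygon classification, with an arbitrary example threshold pair:

```python
from itertools import combinations

# Interior L:W isopleths (percent land), excluding the 1% and 99% bounds;
# choosing two without repetition yields the abstract's 36 models.
interior = [10, 20, 30, 40, 50, 60, 70, 80, 90]
model_pairs = list(combinations(interior, 2))

def classify(lw_percent, lower, upper):
    """Assign a location's Land:Water percentage to one of the three
    polygons induced by a chosen (lower, upper) isopleth pair."""
    if lw_percent < lower:
        return "open water/transition"
    if lw_percent < upper:
        return "marsh"
    return "land"
```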

  5. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  6. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Background: Bayesian Networks (BN) are a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results: We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses a naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge; in this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation, a Markov chain Monte Carlo sampling algorithm is adopted that samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including data from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false-positive rate. In contrast, without the prior knowledge, BN modeling is not always better than random selection, demonstrating the necessity in network modeling of supplementing the gene expression data with additional information. Conclusions: Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves performance.
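The candidate-edge reservoir can be sketched as follows; the gene names and likelihood values are hypothetical, and the `scale` factor is an illustrative choice for turning likelihoods into integer copy numbers.

```python
import random
from collections import Counter

# Hypothetical prior likelihoods of functional linkage between gene pairs
# (as would come from the naive Bayesian classifier over prior knowledge).
prior_likelihood = {
    ("TF1", "geneA"): 0.8,
    ("TF1", "geneB"): 0.4,
    ("TF2", "geneA"): 0.1,
}

def build_reservoir(likelihoods, scale=10):
    """Candidate-edge reservoir: each edge appears with copy number
    proportional to its estimated likelihood, so uniform draws from the
    reservoir propose well-supported edges more often."""
    reservoir = []
    for edge, likelihood in likelihoods.items():
        reservoir.extend([edge] * max(1, round(likelihood * scale)))
    return reservoir

reservoir = build_reservoir(prior_likelihood)
copies = Counter(reservoir)

random.seed(0)
proposal = random.choice(reservoir)  # one edge proposal for an MCMC step
```

Sampling edge proposals this way biases the MCMC network search toward edges favored by the prior knowledge without forbidding any edge outright.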

  7. Memory Elicited by Courtship Conditioning Requires Mushroom Body Neuronal Subsets Similar to Those Utilized in Appetitive Memory

    Science.gov (United States)

    Montague, Shelby A.; Baker, Bruce S.

    2016-01-01

    An animal’s ability to learn and to form memories is essential for its survival. The fruit fly has proven to be a valuable model system for studies of learning and memory. One learned behavior in fruit flies is courtship conditioning. In Drosophila courtship conditioning, male flies learn not to court females during training with an unreceptive female. He retains a memory of this training and for several hours decreases courtship when subsequently paired with any female. Courtship conditioning is a unique learning paradigm; it uses a positive-valence stimulus, a female fly, to teach a male to decrease an innate behavior, courtship of the female. As such, courtship conditioning is not clearly categorized as either appetitive or aversive conditioning. The mushroom body (MB) region in the fruit fly brain is important for several types of memory; however, the precise subsets of intrinsic and extrinsic MB neurons necessary for courtship conditioning are unknown. Here, we disrupted synaptic signaling by driving a shibire(ts) effector in precise subsets of MB neurons, defined by a collection of split-GAL4 drivers. Out of 75 lines tested, 32 showed defects in courtship conditioning memory. Surprisingly, we did not have any hits in the γ lobe Kenyon cells, a region previously implicated in courtship conditioning memory. We did find that several γ lobe extrinsic neurons were necessary for courtship conditioning memory. Overall, our memory hits in the dopaminergic neurons (DANs) and the mushroom body output neurons were more consistent with results from appetitive memory assays than aversive memory assays. For example, protocerebral anterior medial DANs were necessary for courtship memory, similar to appetitive memory, while protocerebral posterior lateral 1 (PPL1) DANs, important for aversive memory, were not needed. Overall, our results indicate that the MB circuits necessary for courtship conditioning memory coincide with circuits necessary for appetitive memory. PMID

  8. Utilization of Expert Knowledge in a Multi-Objective Hydrologic Model Automatic Calibration Process

    Science.gov (United States)

    Quebbeman, J.; Park, G. H.; Carney, S.; Day, G. N.; Micheletty, P. D.

    2016-12-01

    Spatially distributed continuous simulation hydrologic models have a large number of parameters for potential adjustment during the calibration process. Traditional manual calibration of such a modeling system is extremely laborious, which has historically motivated the use of automatic calibration procedures. With a large selection of model parameters, achieving high degrees of objective space fitness - measured with typical metrics such as Nash-Sutcliffe, Kling-Gupta, RMSE, etc. - can easily be achieved using a range of evolutionary algorithms. A concern with this approach is the high degree of compensatory calibration, with many similarly performing solutions yet grossly varying parameter sets. To help alleviate this concern, and to mimic manual calibration processes, expert knowledge is proposed for inclusion within the multi-objective functions that evaluate the parameter decision space. As a result, Pareto solutions are identified with high degrees of fitness, but the parameter sets also maintain and utilize available expert knowledge, yielding more realistic and consistent solutions. This process was tested using the joint SNOW-17 and Sacramento Soil Moisture Accounting (SAC-SMA) models within the Animas River basin in Colorado. Three different elevation zones, each with a range of parameters, resulted in over 35 model parameters being calibrated simultaneously. As a result, high degrees of fitness were achieved, in addition to the development of more realistic and consistent parameter sets such as those typically achieved during manual calibration.
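One way expert knowledge can enter a multi-objective function is as a range penalty added to a fitness metric such as Nash-Sutcliffe efficiency; a minimal sketch, in which the parameter name, plausible range, and penalty weight are illustrative assumptions rather than values from the study:

```python
def composite_objective(sim, obs, params, expert_ranges, weight=0.1):
    """Calibration objective: Nash-Sutcliffe efficiency (higher is better)
    minus a penalty when parameters leave expert-judged plausible ranges."""
    obs_mean = sum(obs) / len(obs)
    nse = 1.0 - sum((s - o) ** 2 for s, o in zip(sim, obs)) / \
                sum((o - obs_mean) ** 2 for o in obs)
    penalty = 0.0
    for name, value in params.items():
        lo, hi = expert_ranges[name]
        if value < lo:
            penalty += (lo - value) / (hi - lo)  # normalised undershoot
        elif value > hi:
            penalty += (value - hi) / (hi - lo)  # normalised overshoot
    return nse - weight * penalty

# A perfect simulation scores NSE = 1; an implausible parameter is
# penalised even when the hydrograph fit is unchanged.
obs = [1.0, 2.0, 3.0, 2.0]
good = composite_objective(obs, obs, {"melt_factor": 2.0},
                           {"melt_factor": (1.0, 3.0)})
bad = composite_objective(obs, obs, {"melt_factor": 4.0},
                          {"melt_factor": (1.0, 3.0)})
```

This is the sense in which compensatory parameter sets with identical hydrograph fitness can still be separated on the Pareto front.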

  9. Mechanistic modeling study on process optimization and precursor utilization with atmospheric spatial atomic layer deposition

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Zhang; He, Wenjie; Duan, Chenlong [State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China); Chen, Rong, E-mail: rongchen@mail.hust.edu.cn [State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China); Shan, Bin [State Key Laboratory of Material Processing and Die & Mould Technology, School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China)

    2016-01-15

    Spatial atomic layer deposition (SALD) is a promising technology that aims to combine the excellent uniformity and conformality of temporal atomic layer deposition (ALD) with an industrially scalable and continuous process. In this manuscript, a combined experimental and numerical model of an atmospheric SALD system is presented. To establish the connection between the process parameters and the growth efficiency, a quantitative model of reactant isolation, throughput, and precursor utilization is developed based on the separation gas flow rate, carrier gas flow rate, and precursor mass fraction. The simulation results based on this model show an inverse relation between precursor usage and carrier gas flow rate. With constant carrier gas flow, the relationship between precursor usage and precursor mass fraction follows a monotonic function. The precursor concentration, regardless of gas velocity, is the determinant factor of the minimal residence time. The narrow gap between the precursor injecting heads and the substrate surface in a typical SALD system leads to a low Péclet number. In this situation, gas diffusion, rather than convection, plays the leading role in precursor transport within the small gap. The fluid kinetics from the numerical model is independent of the specific structure, which is instructive for SALD geometry design as well as process optimization.
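The diffusion-dominated regime noted above follows from a low Péclet number; a minimal sketch with illustrative atmospheric-pressure SALD values (the velocity, gap, and diffusivity are assumptions, not figures from the paper):

```python
def peclet(velocity_m_s, gap_m, diffusivity_m2_s):
    """Peclet number Pe = u * L / D. Pe << 1 means diffusion, not
    convection, dominates precursor transport across the gap."""
    return velocity_m_s * gap_m / diffusivity_m2_s

# Illustrative values: ~1 cm/s gas velocity in a 0.5 mm head-substrate
# gap, with a typical gas-phase precursor diffusivity of ~1e-5 m2/s.
pe = peclet(0.01, 0.5e-3, 1.0e-5)  # Pe < 1: diffusion-dominated transport
```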

  10. Model of sustainable utilization of organic solids waste in Cundinamarca, Colombia

    Directory of Open Access Journals (Sweden)

    Solanyi Castañeda Torres

    2017-05-01

    Introduction: This article proposes a model for the utilization of organic solid waste in the department of Cundinamarca, Colombia, responding to the need for a tool to support decision-making in the planning and management of organic solid waste. Objective: To develop an approximation of a conceptual, technical, and mathematical optimization model to support decision-making in order to minimize environmental impacts. Materials and methods: A descriptive, quasi-experimental study was applied, since fundamental characteristics of the homogeneous phenomenon under study are presented. The calculation of the model for plants in the department is based on three axes (environmental, economic, and social) that are present in the general optimization equation. Results: A model is obtained for utilizing organic solid waste via the biological treatment techniques of aerobic composting and vermiculture, optimizing the system through savings in greenhouse gas emissions released into the atmosphere and through reduction of the overall cost of final disposal of organic solid waste in sanitary landfills. Based on the economic principle of utility, which determines the environmental feasibility and sustainability of the department's organic solid waste utilization plants, organic fertilizers such as compost and humus capture carbon and nitrogen, reducing tons of CO2.

  11. Generation, validation, and utilization of a three-dimensional pharmacophore model for EP3 antagonists.

    Science.gov (United States)

    Mishra, Rama K; Singh, Jasbir

    2010-08-23

    Studies reported here aim to investigate the important structural features that characterize human EP(3) antagonists. Based on knowledge of the low-energy conformation of the endogenous ligand, the initial hit analogs were prepared. Subsequently, a ligand-based lead optimization approach using pharmacophore model generation was utilized. A 5-point pharmacophore was constructed using the HypoGen module of Catalyst with a training set of 19 compounds spanning IC(50) data over a 4-log-order range. Following pharmacophore customization using a linear structure-activity regression equation, a six-feature three-dimensional predictive pharmacophore model, P6, was built, which resulted in improved predictive power. The P6 model was validated using a test set of 11 compounds, providing a correlation coefficient (R(2)) of 0.90 for predicted versus experimental EP(3) IC(50) values. This pharmacophore model has been expanded to include diverse chemotypes, and the predictive ability of the customized pharmacophore has been tested.

  12. Practical whole-tooth restoration utilizing autologous bioengineered tooth germ transplantation in a postnatal canine model

    Science.gov (United States)

    Ono, Mitsuaki; Oshima, Masamitsu; Ogawa, Miho; Sonoyama, Wataru; Hara, Emilio Satoshi; Oida, Yasutaka; Shinkawa, Shigehiko; Nakajima, Ryu; Mine, Atsushi; Hayano, Satoru; Fukumoto, Satoshi; Kasugai, Shohei; Yamaguchi, Akira; Tsuji, Takashi; Kuboki, Takuo

    2017-01-01

    Whole-organ regeneration has great potential for the replacement of dysfunctional organs through the reconstruction of a fully functional bioengineered organ using three-dimensional cell manipulation in vitro. Recently, many basic studies of whole-tooth replacement using three-dimensional cell manipulation have been conducted in a mouse model. Further evidence of the practical application to human medicine is required to demonstrate tooth restoration by reconstructing bioengineered tooth germ using a postnatal large-animal model. Herein, we demonstrate functional tooth restoration through the autologous transplantation of bioengineered tooth germ in a postnatal canine model. The bioengineered tooth, which was reconstructed using permanent tooth germ cells, erupted into the jawbone after autologous transplantation and achieved physiological function equivalent to that of a natural tooth. This study represents a substantial advancement in whole-organ replacement therapy through the transplantation of bioengineered organ germ as a practical model for future clinical regenerative medicine. PMID:28300208

  13. Neuro-fuzzy inverse model control structure of robotic manipulators utilized for physiotherapy applications

    Directory of Open Access Journals (Sweden)

    A.A. Fahmy

    2013-12-01

    This paper presents a new neuro-fuzzy controller for robot manipulators. First, an inductive learning technique is applied to generate the required inverse modeling rules from input/output data recorded in the off-line structure learning phase. Second, a fully differentiable fuzzy neural network is developed to construct the inverse dynamics part of the controller for the online parameter learning phase. Finally, a fuzzy-PID-like incremental controller is employed as the feedback servo controller. The proposed control system was tested using the dynamic model of a six-axis industrial robot and showed good results compared to a conventional PID individual-joint controller.
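A velocity-form (incremental) PID of the general kind used for such a feedback servo loop can be sketched as follows; the gains and the crude integrator joint model are illustrative assumptions, not the paper's controller.

```python
class IncrementalPID:
    """Velocity-form (incremental) PID: each call returns a control
    increment computed from the last three tracking errors."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = self.e2 = 0.0  # two most recent past errors

    def step(self, error):
        du = (self.kp * (error - self.e1)
              + self.ki * error
              + self.kd * (error - 2.0 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, error
        return du

# Drive a unit setpoint on a crude integrator joint model; the increments
# accumulate into the control signal u, and the joint position y follows.
pid = IncrementalPID(kp=1.2, ki=0.5, kd=0.1)
u = y = 0.0
for _ in range(200):
    e = 1.0 - y
    u += pid.step(e)
    y += 0.1 * u  # simple integrator plant (illustrative joint response)
```

The incremental form is convenient here because the fuzzy logic can shape the increment directly, and switching gains online causes no bump in the accumulated control signal.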

  14. A framework to utilize turbulent flux measurements for mesoscale models and remote sensing applications

    Directory of Open Access Journals (Sweden)

    W. Babel

    2011-05-01

    Full Text Available Meteorologically measured fluxes of energy and matter between the surface and the atmosphere originate from a source area of certain extent, located in the upwind sector of the device. The spatial representativeness of such measurements is strongly influenced by the heterogeneity of the landscape. The footprint concept is capable of linking observed data with spatial heterogeneity. This study aims at upscaling eddy covariance derived fluxes to a grid size of 1 km edge length, which is typical for mesoscale models or low resolution remote sensing data.

Here an upscaling strategy is presented, utilizing footprint modelling and SVAT modelling as well as observations from a target land-use area. The general idea of this scheme is to model fluxes from adjacent land-use types and combine them with the measured flux data to yield a grid-representative flux according to the land-use distribution within the grid cell. The performance of the upscaling routine is evaluated with real datasets, which are considered to be land-use-specific fluxes in a grid cell. The measurements above rye and maize fields stem from the LITFASS experiment 2003 in Lindenberg, Germany, and the respective modelled time series were derived with the SVAT model SEWAB. Contributions from each land-use type to the observations are estimated using a forward Lagrangian stochastic model. A representation error is defined as the error in flux estimates made when accepting the measurements unchanged as the grid-representative flux, ignoring flux contributions from other land-use types within the respective grid cell.

Results show that this representation error can be reduced by up to 56 % when applying the spatial integration. This shows the potential for further application of this strategy, although the absolute differences between the flux observations from rye and maize were so small that the spatial integration would be rejected in a real situation. Corresponding thresholds for
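The core of the upscaling scheme, combining the measured flux of the target land-use type with SVAT-modelled fluxes of the other types according to their area fractions, can be sketched as follows (the fractions and flux values are invented for illustration, not LITFASS data):

```python
# Hypothetical land-use fractions within a 1 km grid cell and
# sensible-heat fluxes (W m^-2): an eddy-covariance measurement for
# the target land use plus SVAT-modelled fluxes for the other types.
landuse_fraction = {"rye": 0.45, "maize": 0.35, "forest": 0.20}
flux = {"rye": 120.0,      # eddy-covariance measurement (target type)
        "maize": 105.0,    # SVAT-modelled
        "forest": 80.0}    # SVAT-modelled

# Grid-representative flux: land-use-weighted combination.
grid_flux = sum(landuse_fraction[lu] * flux[lu] for lu in landuse_fraction)

# Representation error: treating the tower measurement alone as the
# grid flux ignores contributions from the other land-use types.
repr_error = abs(flux["rye"] - grid_flux) / grid_flux

print(round(grid_flux, 1), round(100 * repr_error, 1))
```

With these illustrative numbers the weighted grid flux is 106.75 W m^-2, so accepting the 120 W m^-2 tower value unchanged would carry a representation error of about 12 %.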

  15. Effects of the Affordable Care Act's contraceptive coverage requirement on the utilization and out-of-pocket costs of prescribed oral contraceptives.

    Science.gov (United States)

    Kim, Nam Hyo; Look, Kevin A

    2017-06-17

    The Affordable Care Act (ACA) mandated that private health insurance plans cover prescribed contraceptive services for women, including oral contraceptives (OCs), without charging a patient any cost-sharing beginning in August 2012. To evaluate the effects of the ACA's contraceptive coverage requirement on the utilization and out-of-pocket costs of prescribed OCs after two years of implementation. A retrospective, cross-sectional study was designed using data from the 2010 to 2014 waves of the Medical Expenditure Panel Survey. The sample consisted of reproductive-aged women who have either private health insurance or Medicaid. Utilization of OCs was evaluated using 1) the proportion of women who purchased any OCs and 2) the mean annual number of cycles prescribed per woman. Out-of-pocket costs for OCs were evaluated using 1) the proportion of women who had any OC purchase with $0 out-of-pocket costs, 2) the mean annual out-of-pocket costs per woman, and 3) the mean out-of-pocket costs per cycle. Descriptive analyses and a difference-in-difference linear regression approach were used. No substantial changes were seen in the utilization of OCs after the ACA requirement became effective. The difference-in-difference regression showed that the proportion of women who had any OC purchase with $0 out-of-pocket costs increased significantly by 54.0 percentage points after the ACA requirement in the private insurance group relative to the Medicaid group. Mean annual out-of-pocket costs in the private insurance group dropped by 37% in the first year and an additional 52% decrease was found in the second year of the policy. Mean out-of-pocket costs per cycle also decreased substantially in the private insurance group by 39% in the first year and an additional decrease of 44% was seen in the second year. The ACA's contraceptive coverage requirement markedly reduced out-of-pocket costs of prescribed OCs for women with private health insurance. Copyright © 2017 Elsevier Inc
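The difference-in-difference logic of the study can be illustrated with a minimal 2x2 computation. The shares below are invented, chosen only to reproduce an effect of the reported magnitude; the actual analysis used regression on MEPS microdata with covariates:

```python
# Illustrative (not actual MEPS) shares of women with a $0 out-of-pocket
# OC purchase, before and after the ACA mandate, by insurance group.
share = {
    ("private",  "pre"):  0.15, ("private",  "post"): 0.72,
    ("medicaid", "pre"):  0.60, ("medicaid", "post"): 0.63,
}

# 2x2 difference-in-difference estimate: the change in the treated
# (private) group net of the change in the comparison (Medicaid) group.
did = ((share[("private", "post")] - share[("private", "pre")])
       - (share[("medicaid", "post")] - share[("medicaid", "pre")]))

print(round(did, 2))  # 0.57 - 0.03 = 0.54, i.e. 54 percentage points
```

The Medicaid group serves as the comparison because its contraceptive coverage was unaffected by the mandate, so its pre/post change proxies for secular trends.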

  16. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of Steps 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  17. Structural modelling of thrust zones utilizing photogrammetry: Western Champsaur basin, SE France

    Science.gov (United States)

    Totake, Yukitsugu; Butler, Rob; Bond, Clare

    2016-04-01

Recent advances in photogrammetric technologies allow geoscientists to easily obtain high-resolution 3D geospatial data across multiple scales, from rock specimen to landscape. Although the resolution and accuracy of photogrammetry models depend on various factors (quality of the photographs, number of overlapping images, distance to targets, etc.), modern photogrammetry techniques can provide data resolution comparable to laser scanning for modelling various geological objects. Further advantages of photogrammetry, its high portability and low infrastructure cost, make the techniques easy to incorporate into conventional geological surveys, and they therefore have great potential to enhance the performance of such surveys. We present a workflow for building basin-scale 3D structural models utilizing ground-based photogrammetry along with field observations. The workflow is applied to model thrust zones in Eocene-Oligocene turbidite sequences called the Champsaur Sandstone (Gres du Champsaur), filling an Alpine foredeep basin, the Western Champsaur basin, in southeastern France. The study area is located ca. 20 km northeast of Gap and extends approximately 10 km from east to west and 6 km from north to south. During a two-week fieldwork campaign, over 9400 photographs were taken at 133 locations with a handheld digital camera from the ground and were georeferenced with a handheld GPS. The images were processed in the software PhotoScan to build a 3D photogrammetric model, which was then imported into the software Move so that faults and geological layers could be mapped along with georeferenced field data to produce geological cross sections and 3D surfaces. The workflow produced a detailed topography and landscape texture at ~1 m resolution and enabled characterization of the thrust systems in the study area at bed-scale resolution.
Three-dimensionally characterized architectures of thrust zones at high
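As a rough sanity check on the resolutions quoted above, the per-pixel ground sample distance (GSD) of a ground-based survey follows directly from sensor pixel pitch, focal length, and range. The camera values below are hypothetical, not those used in the study:

```python
# Back-of-envelope ground sample distance for ground-based photogrammetry:
# GSD = pixel pitch x distance / focal length.
pixel_pitch_m = 4.3e-6      # 4.3 um pixel pitch (hypothetical camera)
focal_length_m = 0.05       # 50 mm lens
distance_m = 2000.0         # outcrop roughly 2 km away

gsd_m = pixel_pitch_m * distance_m / focal_length_m
print(round(gsd_m, 3))      # metres per pixel on the target
```

A sub-metre per-pixel footprint at kilometre range is consistent with building a landscape model at ~1 m resolution, since the reconstructed surface is typically coarser than the raw per-pixel GSD.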

  18. Accessing and Utilizing Remote Sensing Data for Vectorborne Infectious Diseases Surveillance and Modeling

    Science.gov (United States)

    Kiang, Richard; Adimi, Farida; Kempler, Steven

    2008-01-01

Background: The transmission of vectorborne infectious diseases is often influenced by environmental, meteorological and climatic parameters, because the vector life cycle depends on these factors. For example, the geophysical parameters relevant to malaria transmission include precipitation, surface temperature, humidity, elevation, and vegetation type. Because these parameters are routinely measured by satellites, remote sensing is an important technological tool for predicting, preventing, and containing a number of vectorborne infectious diseases, such as malaria, dengue, West Nile virus, etc. Methods: A variety of NASA remote sensing data can be used for modeling vectorborne infectious disease transmission. We will discuss both the well known and less known remote sensing data, including Landsat, AVHRR (Advanced Very High Resolution Radiometer), MODIS (Moderate Resolution Imaging Spectroradiometer), TRMM (Tropical Rainfall Measuring Mission), ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), EO-1 (Earth Observing One) ALI (Advanced Land Imager), and the SIESIP (Seasonal to Interannual Earth Science Information Partner) dataset. Giovanni is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center. It provides a simple and intuitive way to visualize, analyze, and access vast amounts of Earth science remote sensing data. After the remote sensing data are obtained, a variety of techniques, including generalized linear models and artificial-intelligence-oriented methods, can be used to model the dependency of disease transmission on these parameters. Results: The processes of accessing, visualizing and utilizing precipitation data using Giovanni, and acquiring other data at additional websites, are illustrated.
Malaria incidence time series for some parts of Thailand and Indonesia are used to demonstrate that malaria incidences are reasonably well modeled with generalized linear models and artificial
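A generalized linear model of incidence against a remotely sensed driver, as mentioned above, can be sketched as a Poisson regression with a log link, fitted by iteratively reweighted least squares (IRLS). The data below are synthetic; the actual study used real incidence series for Thailand and Indonesia:

```python
import numpy as np

# Toy monthly malaria case counts modelled against satellite-derived
# precipitation (values are synthetic, for illustration only).
rng = np.random.default_rng(0)
precip = rng.uniform(0.0, 10.0, size=200)      # cm / month
true_rate = np.exp(0.5 + 0.2 * precip)         # log-linear dependence
cases = rng.poisson(true_rate)

# Poisson GLM (log link) fitted by IRLS, the standard algorithm
# behind generalized linear models.
X = np.column_stack([np.ones_like(precip), precip])
mu = cases + 0.5                               # standard safe starting point
eta = np.log(mu)
for _ in range(25):
    z = eta + (cases - mu) / mu                # working response
    W = mu                                     # IRLS weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    eta = X @ beta
    mu = np.exp(eta)

print(np.round(beta, 2))   # should be close to the true (0.5, 0.2)
```

The same loop generalizes to several covariates (temperature, humidity, vegetation index) by adding columns to `X`.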

  19. Modeling menopause: The utility of rodents in translational behavioral endocrinology research.

    Science.gov (United States)

    Koebele, Stephanie V; Bimonte-Nelson, Heather A

    2016-05-01

The human menopause transition and aging are each associated with an increase in a variety of health risk factors including, but not limited to, cardiovascular disease, osteoporosis, cancer, diabetes, stroke, sexual dysfunction, affective disorders, sleep disturbances, and cognitive decline. It is challenging to systematically evaluate the biological underpinnings associated with the menopause transition in the human population. For this reason, rodent models have been invaluable tools for studying the impact of gonadal hormone fluctuations and eventual decline on a variety of body systems. While it is essential to keep in mind that some of the mechanisms associated with aging and the transition into a reproductively senescent state can differ when translating from one species to another, animal models provide researchers with opportunities to gain a fundamental understanding of the key elements underlying reproduction and aging processes, paving the way to explore novel pathways for intervention associated with known health risks. Here, we discuss the utility of several rodent models used in the laboratory for translational menopause research, examining the benefits and drawbacks in helping us to better understand aging and the menopause transition in women. The rodent models discussed are the ovary-intact, ovariectomy, and 4-vinylcyclohexene diepoxide models of the menopause transition. We then describe how these models may be implemented in the laboratory, particularly in the context of cognition. Ultimately, we aim to use these animal models to elucidate novel perspectives and interventions for maintaining a high quality of life in women, and to potentially prevent or postpone the onset of negative health consequences associated with these significant life changes during aging. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Utility of population pharmacokinetic modeling in the assessment of therapeutic protein-drug interactions.

    Science.gov (United States)

    Chow, Andrew T; Earp, Justin C; Gupta, Manish; Hanley, William; Hu, Chuanpu; Wang, Diane D; Zajic, Stefan; Zhu, Min

    2014-05-01

Assessment of pharmacokinetic (PK)-based drug-drug interactions (DDI) is essential for ensuring patient safety and drug efficacy. With the substantial increase in therapeutic proteins (TP) entering the market and drug development, evaluation of TP-drug interactions (TPDI) has become increasingly important. Unlike for small molecule (e.g., chemical-based) drugs, conducting TPDI studies often presents logistical challenges, while population PK (PPK) modeling may be a viable approach to addressing these issues. A working group was formed with members from the pharmaceutical industry and the FDA to assess the utility of PPK-based TPDI assessment, including study designs, data analysis methods, and implementation strategy. This paper summarizes key issues for consideration as well as a proposed strategy, with a focus on (1) the PPK approach for exploratory assessment; (2) the PPK approach for confirmatory assessment; (3) the importance of data quality; (4) implementation strategy; and (5) potential regulatory implications. Advantages and limitations of the approach are also discussed.
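The structural backbone of such a PPK analysis can be illustrated with a one-compartment oral-absorption model in which a hypothetical interaction covariate scales clearance across a simulated population. All parameter values, the covariate effect, and the variability magnitude are invented for illustration and are not from the paper:

```python
import numpy as np

def conc(t, dose, ka, cl, v):
    """Plasma concentration for a one-compartment, first-order-absorption model."""
    ke = cl / v                                   # elimination rate constant
    return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(1)
t = np.linspace(0.5, 24.0, 48)                    # sampling times, h
cl_pop, v, ka, dose = 5.0, 50.0, 1.2, 100.0       # typical-value parameters
interaction_effect = 0.7                          # hypothetical comedication lowers CL by 30%

auc = []
for _ in range(500):                              # simulated subjects
    cl_i = cl_pop * np.exp(rng.normal(0.0, 0.3))  # log-normal between-subject CL
    c = conc(t, dose, ka, cl_i * interaction_effect, v)
    auc.append(float(np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))))  # trapezoid AUC

print(round(float(np.mean(auc)), 1))              # mean AUC(0.5-24 h) under interaction
```

In an actual PPK TPDI assessment the interaction term would be estimated from pooled study data (e.g., comedication as a covariate on clearance) rather than fixed as here.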

  1. Mathematical model of a utility firm. Final technical report, Part I

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

Utility companies are in the predicament of having to make forecasts and draw up plans for the future in an increasingly fluid and volatile socio-economic environment. The project reported here is intended to contribute to an understanding of the economic and behavioral processes that take place within a firm and outside it. Three main topics are treated. The first is the representation of the characteristics of the members of an organization, to the extent that those characteristics seem pertinent to the processes of interest. The second is the appropriate management of processes of change by an organization. The third deals with the competitive striving toward an economic equilibrium among the members of a society at large, on the theory that this process might be modeled in a way similar to the intra-organizational ones. This volume covers mainly the first topic.

  2. In-House Communication Support System Based on the Information Propagation Model Utilizes Social Network

    Science.gov (United States)

    Takeuchi, Susumu; Teranishi, Yuuichi; Harumoto, Kaname; Shimojo, Shinji

Almost all companies are now utilizing computer networks to support speedier and more effective in-house information-sharing and communication. However, existing systems are designed to support communications only within the same department. Therefore, in our research, we propose an in-house communication support system based on the “Information Propagation Model (IPM).” The IPM is proposed to realize word-of-mouth communication in a social network and to support information-sharing on the network. By applying the system in a real company, we found that information could be exchanged between different and unrelated departments, and that such exchanges could help build new relationships between users who are distant from each other on the social network.
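Word-of-mouth propagation over a social network, the core idea of the IPM, amounts to a graph traversal along personal ties. A minimal sketch follows; the network and the simple reach-everyone rule are illustrative (the real model is more elaborate, e.g., propagation need not be certain along every tie):

```python
from collections import deque

# Toy social network: edges are personal ties that may cross department
# boundaries, as in word-of-mouth information propagation.
ties = {
    "alice@sales": ["bob@sales", "carol@eng"],
    "bob@sales":   ["alice@sales"],
    "carol@eng":   ["alice@sales", "dave@eng", "erin@hr"],
    "dave@eng":    ["carol@eng"],
    "erin@hr":     ["carol@eng"],
}

def propagate(origin):
    """Breadth-first word-of-mouth spread from the posting user."""
    reached, queue = {origin}, deque([origin])
    while queue:
        user = queue.popleft()
        for friend in ties[user]:
            if friend not in reached:
                reached.add(friend)
                queue.append(friend)
    return reached

# Information posted in sales reaches engineering and HR through ties.
print(sorted(propagate("alice@sales")))
```

The cross-department reach happens because Carol bridges sales, engineering, and HR, which is exactly the kind of inter-department exchange the abstract reports.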

  3. Achieving a System Operational Availability Requirement (ASOAR) Model

    Science.gov (United States)

    1992-07-01

ASOAR requires only system- and end-item-level input data, not Line Replaceable Unit (LRU) input data. ASOAR usage provides concepts for major logistics...the Corps/Theater ADP Service Center II (CTASC II) to a system operational availability goal. The CTASC II system configuration had many redundant types

  4. Modeling and optimization of processes for clean and efficient pulverized coal combustion in utility boilers

    Directory of Open Access Journals (Sweden)

    Belošević Srđan V.

    2016-01-01

Full Text Available Pulverized coal-fired power plants should provide higher efficiency of energy conversion, flexibility in terms of boiler loads and fuel characteristics, and reduced emission of pollutants such as nitrogen oxides. Modification of the combustion process is a cost-effective technology for NOx control. For optimization of complex processes, such as turbulent reactive flow in coal-fired furnaces, mathematical modeling is regularly used. The NOx emission reduction by combustion modifications in the 350 MWe Kostolac B boiler furnace, tangentially fired by pulverized Serbian lignite, is investigated in this paper. Numerical experiments were done with an in-house-developed three-dimensional differential comprehensive combustion code, with a fuel- and thermal-NO formation/destruction reaction model. The code was developed to be easily used by engineering staff for process analysis in boiler units. A broad range of operating conditions was examined, such as fuel and preheated-air distribution over the burners and tiers, operation mode of the burners, grinding fineness and quality of coal, boiler loads, cold air ingress, recirculation of flue gases, water-wall ash deposition, and the combined effect of different parameters. The predictions show that a NOx emission reduction of up to 30% can be achieved by proper combustion organization in the case-study furnace, with flame-position control. The impact of combustion modifications on boiler operation was evaluated by boiler thermal calculations, suggesting that the facility must be controlled within narrow limits of its operating parameters. Such a complex approach to pollutant control enables evaluating alternative solutions to achieve efficient and low-emission operation of utility boiler units. [Project of the Serbian Ministry of Science, no. TR-33018: Increase in energy and ecology efficiency of processes in pulverized coal-fired furnace and optimization of utility steam boiler air preheater by using in

  5. Utilization of isolated marine mussel cells as an in vitro model to assess xenobiotics induced genotoxicity.

    Science.gov (United States)

    Zhang, Y F; Chen, S Y; Qu, M J; Adeleye, A O; Di, Y N

    2017-10-01

Freshly isolated cells are used as an ideal experimental model in in vitro toxicology analysis, especially for the detection of genotoxic effects induced by diverse xenobiotics. In the present study, heavy metals (Zn, Cu, Cd, Pb) and PCBs were selected as representative xenobiotics to verify the ability of an in vitro model to assess genotoxic effects in cells of marine mussels (Mytilus galloprovincialis). DNA damage and chromosome aberration were assessed in freshly isolated cells from haemolymph, gill, and digestive gland by single-cell gel electrophoresis and the micronucleus assay, respectively. Gill cells were the most sensitive of the three cell types to Zn exposure, indicating tissue-specific genotoxicity. Cu induced significantly more DNA aberrations in haemocytes than Cd or Pb, indicating chemical-specific genotoxicity. An additive effect was detected after combined heavy metal and PCB exposure, suggesting interaction of the selected xenobiotics. To our knowledge, this is the first attempt to study the complex effects of organic and/or inorganic contaminants using freshly isolated cells from marine mussels. Genetic responses are shown to occur and be maintained in vitro in response to short-term xenobiotic-induced stresses. The in vitro model could provide a rapid tool to investigate comprehensive toxic effects in marine invertebrates and to monitor environmental health. Copyright © 2017. Published by Elsevier Ltd.

  6. Decision Support for Test Trench Location Selection with 3D Semantic Subsurface Utility Models

    NARCIS (Netherlands)

    Racz, Paulina; Syfuss, Lars; Schultz, Carl; van Buiten, Marinus; olde Scholtenhuis, Léon Luc; Vahdatikhaki, Faridaddin; Doree, Andries G.; Lin, Ken-Yu; El-Gohary, Nora; Tang, Pingbo

    Subsurface utility construction work often involves repositioning of, and working between, existing buried networks. While the amount of utilities in modern cities grows, excavation work becomes more prone to incidents. To prevent such incidents, excavation workers request existing 2D utility maps,

  7. Modeling Crop Water Requirement at Regional Scales in the Context of Integrated Hydrology

    Science.gov (United States)

    Dogrul, E. C.; Kadir, T.; Brush, C. F.; Chung, F. I.

    2009-12-01

In developed watersheds, the stresses on surface and subsurface water resources are generally created by groundwater pumping and stream-flow diversions to satisfy agricultural and urban water requirements. The application of pumping and diversion to meet these requirements also affects the surface and subsurface water system through recharge of the aquifer and surface runoff back into the streams. The agricultural crop water requirement is a function of climate, soil, and land-surface physical properties, as well as land-use management practices, which are spatially distributed and evolve in time. In almost all modeling studies, pumping and diversions are specified as predefined stresses and are not included in the simulation as an integral and dynamic component of the hydrologic cycle that depends on the other hydrologic components. To address this issue, the California Department of Water Resources has been developing a new root zone module that can either be used as a stand-alone modeling tool or be linked to other stream and aquifer modeling tools. The tool, named Integrated Water Flow Model Demand Calculator (IDC), computes crop water requirements under user-specified climatic, land-use, and irrigation management settings at regional scales, and routes the precipitation and irrigation water through the root zone using physically based methods. In calculating the crop water requirement, IDC uses an irrigation-scheduling type approach where irrigation is triggered when the soil moisture falls below a user-specified level. Water demands for managed wetlands, urban areas, and agricultural crops, including rice, can either be computed by IDC or specified by the user, depending on the requirements and available data for the modeling project. For areas covered with native vegetation, water demand is not computed and only precipitation is routed through the root zone. Many irrigation practices, such as irrigation for leaching, re-use of irrigation return flow, flooding and
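The irrigation-scheduling approach that IDC uses, triggering irrigation when soil moisture falls below a user-specified level, can be sketched with a single-bucket root-zone water balance. All values below (capacity, trigger, refill target, crop ET) are illustrative, not IDC parameters:

```python
# Minimal soil-moisture bucket with an irrigation-scheduling trigger:
# irrigation is applied whenever root-zone storage falls below a
# user-specified fraction of capacity, then refills to a target level.
capacity = 150.0        # mm, root-zone water holding capacity
trigger = 0.5           # irrigate when storage < 50% of capacity
target = 0.9            # refill to 90% of capacity
et_daily = 6.0          # mm/day crop evapotranspiration
rain = [0.0] * 30       # a dry month
storage, applied = 140.0, 0.0

for p in rain:
    storage = min(capacity, storage + p) - et_daily   # daily water balance
    storage = max(storage, 0.0)
    if storage < trigger * capacity:                  # scheduling rule
        applied += target * capacity - storage        # irrigation demand
        storage = target * capacity

print(round(applied, 1))  # seasonal irrigation requirement, mm
```

Summed over all fields in a region, this per-field demand is the kind of quantity that can then drive pumping and diversion dynamically inside an integrated hydrologic model instead of being prescribed.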

  8. ECUT: Energy Conversion and Utilization Technologies program. Heterogeneous catalysis modeling program concept

    Science.gov (United States)

    Voecks, G. E.

    1983-01-01

    Insufficient theoretical definition of heterogeneous catalysts is the major difficulty confronting industrial suppliers who seek catalyst systems which are more active, selective, and stable than those currently available. In contrast, progress was made in tailoring homogeneous catalysts to specific reactions because more is known about the reaction intermediates promoted and/or stabilized by these catalysts during the course of reaction. However, modeling heterogeneous catalysts on a microscopic scale requires compiling and verifying complex information on reaction intermediates and pathways. This can be achieved by adapting homogeneous catalyzed reaction intermediate species, applying theoretical quantum chemistry and computer technology, and developing a better understanding of heterogeneous catalyst system environments. Research in microscopic reaction modeling is now at a stage where computer modeling, supported by physical experimental verification, could provide information about the dynamics of the reactions that will lead to designing supported catalysts with improved selectivity and stability.

  9. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    Science.gov (United States)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission from commercial space tourism, to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting edge rocketry with the assumption that the astronauts could be trained and will adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. 
While the model is the primary tangible product from this research, the more interesting outcome of

  10. Modelling of landfill gas adsorption with bottom ash for utilization of renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Chen

    2011-10-06

Energy crisis, environmental pollution, and climate change are serious challenges for people worldwide. In the 21st century, research has turned to new renewable energy technologies, so as to slow global warming and develop society in an environmentally sustainable way. Landfill gas, produced by biodegradable municipal solid waste in landfills, is a renewable energy source. In this work, landfill gas utilization for energy generation is introduced. Landfill gas can produce hydrogen by steam reforming reactions, and fuel cell systems include a steam reformer. A sewage plant in Cologne, Germany has successfully run a phosphoric acid fuel cell power station on biogas for more than 50,000 hours. Landfill gas thus may be used as fuel for electricity generation via fuel cell systems. To assess the feasibility of landfill gas utilization via fuel cells, the thermodynamics of landfill gas steam reforming are examined by simulation. In practice, methane-rich gas can be obtained by landfill gas purification and upgrading. This work experimentally investigates a new upgrading method: landfill gas adsorption with bottom ash. Bottom ash is a by-product of municipal solid waste incineration, and some of its physical and chemical properties are analysed in this work. The landfill gas adsorption experimental data show that bottom ash can be used as a potential adsorbent for landfill gas adsorption to remove CO{sub 2}. In addition, the alkalinity of bottom ash eluate can be reduced in these adsorption processes. The interactions between landfill gas and bottom ash can therefore be explained by a series of reactions. Furthermore, a conceptual model of landfill gas adsorption with bottom ash is developed. In this thesis, the parameters of the landfill gas adsorption equilibrium equations are obtained by fitting the experimental data.
On the other hand, these functions can also be deduced through a theoretical approach
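Fitting adsorption equilibrium parameters to experimental data, the final step described above, is commonly done with a Langmuir isotherm. The sketch below recovers the parameters from synthetic, noise-free equilibrium data via the linearised form; the thesis does not specify this particular isotherm, so treat it as an assumed example:

```python
import numpy as np

# Hypothetical CO2-on-bottom-ash equilibrium data and a Langmuir fit,
# q = qmax * b * p / (1 + b * p). The linearised form
# p/q = p/qmax + 1/(qmax*b) lets both parameters be obtained by
# ordinary least squares on (p, p/q).
p = np.array([5., 10., 20., 40., 80., 120.])       # equilibrium pressure, kPa
qmax_true, b_true = 2.0, 0.05                      # mmol/g, 1/kPa
q = qmax_true * b_true * p / (1 + b_true * p)      # synthetic, noise-free loading

slope, intercept = np.polyfit(p, p / q, 1)         # p/q versus p is linear
qmax = 1.0 / slope                                 # saturation capacity
b = slope / intercept                              # affinity constant

print(round(qmax, 3), round(b, 3))   # recovers 2.0 and 0.05
```

With real, noisy data a direct nonlinear least-squares fit of the Langmuir equation is usually preferred, since the linearisation distorts the error structure.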

  11. Model Predictive Control of A Matrix-Converter Based Solid State Transformer for Utility Grid Interaction

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Yaosuo [ORNL

    2016-01-01

The matrix converter solid state transformer (MC-SST), formed from the back-to-back connection of two three-to-single-phase matrix converters, is studied for use in the interconnection of two ac grids. The matrix converter topology provides light-weight, low-volume, single-stage bidirectional ac-ac power conversion without the need for a dc link. Thus, the lifetime limitations of dc-bus storage capacitors are avoided. However, space vector modulation of this type of MC-SST requires computing switching vectors for each of the two MCs, which must be carefully coordinated to avoid commutation failure. An additional controller is also required to control power exchange between the two ac grids. In this paper, model predictive control (MPC) is proposed for an MC-SST connecting two different ac power grids. The proposed MPC predicts the circuit variables based on a discrete model of the MC-SST system, and the cost function is formulated so that the optimal switch vector for the next sample period is selected, thereby generating the required grid currents for the SST. Simulation and experimental studies are carried out to demonstrate the effectiveness and simplicity of the proposed MPC for such MC-SST-based grid-interfacing systems.
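The essence of finite-control-set MPC as described, predicting the next state for every admissible switch vector and applying the cost-minimising one, can be sketched on a toy single-phase RL load. This is not the MC-SST model; the topology, numeric values, and the tracking-only cost function are all illustrative:

```python
import numpy as np

# Finite-control-set MPC on a toy RL load: at each sample, predict the
# next-step current for every admissible output voltage (the "switch
# vectors") and apply the one that minimises the cost.
R, L, dt = 0.5, 5e-3, 50e-6                 # plant and sample period
v_levels = np.array([-400., 0., 400.])      # admissible output voltages
i, i_ref = 0.0, 10.0                        # state and current reference

for _ in range(400):
    # Euler-discretised prediction i+ = i + dt/L * (v - R*i) per vector.
    i_pred = i + dt / L * (v_levels - R * i)
    cost = (i_pred - i_ref) ** 2            # cost: current tracking error
    v_opt = v_levels[np.argmin(cost)]       # optimal switch vector
    i = i + dt / L * (v_opt - R * i)        # apply it to the plant

print(round(i, 2))
```

With only three voltage levels the current settles into a small limit cycle around the reference; real converter MPC adds further cost terms (e.g., switching effort, commutation constraints) to the same enumerate-and-minimise loop.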

  12. Evaluation of Drogue Parachute Damping Effects Utilizing the Apollo Legacy Parachute Model

    Science.gov (United States)

    Currin, Kelly M.; Gamble, Joe D.; Matz, Daniel A.; Bretz, David R.

    2011-01-01

    Drogue parachute damping is required to dampen the Orion Multi Purpose Crew Vehicle (MPCV) crew module (CM) oscillations prior to deployment of the main parachutes. During the Apollo program, drogue parachute damping was modeled on the premise that the drogue parachute force vector aligns with the resultant velocity of the parachute attach point on the CM. Equivalent Cm(sub q) and Cm(sub alpha) equations for drogue parachute damping resulting from the Apollo legacy parachute damping model premise have recently been developed. The MPCV computer simulations ANTARES and Osiris have implemented high fidelity two-body parachute damping models. However, high-fidelity model-based damping motion predictions do not match the damping observed during wind tunnel and full-scale free-flight oscillatory motion. This paper will present the methodology for comparing and contrasting the Apollo legacy parachute damping model with full-scale free-flight oscillatory motion. The analysis shows an agreement between the Apollo legacy parachute damping model and full-scale free-flight oscillatory motion.

  13. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring results to select an appropriate requirements negotiation model. Finally, the results are presented with the help of statistical pie charts. On the basis of these results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside external parameters such as MBTI and opportunity analysis.
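    A first-tier weighted scoring model of the kind described can be sketched as follows; the criteria, weights, candidate names, and scores are hypothetical illustrations, not the paper's data:

```python
def weighted_score(weights, scores):
    """Score each candidate negotiation model as sum(weight_i * score_i)."""
    return {name: sum(w * s for w, s in zip(weights, vals))
            for name, vals in scores.items()}

weights = [0.4, 0.35, 0.25]  # e.g. stakeholder fit, tool support, cost
scores = {
    "Model A": [4, 3, 5],
    "Model B": [5, 4, 3],
    "Model C": [3, 5, 4],
}
totals = weighted_score(weights, scores)
best = max(totals, key=totals.get)
print(best, round(totals[best], 2))
```

    The second tier would then build SWOT matrices for the top-scoring candidates before the final selection.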

  14. COMPLEAT (Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies): A planning tool for publicly owned electric utilities. [Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies (Compleat)

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    COMPLEAT takes its name, as an acronym, from Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies. It is an electric utility planning model designed for use principally by publicly owned electric utilities and agencies serving such utilities. As a model, COMPLEAT is significantly more full-featured and complex than called out in APPA's original plan and proposal to DOE. The additional complexity grew out of a series of discussions early in the development schedule, in which it became clear to APPA staff and advisors that the simplicity characterizing the original plan, while highly desirable in terms of utility applications, was not achievable if practical utility problems were to be addressed. The project teams settled on Energy 20/20, an existing model developed by Dr. George Backus of Policy Assessment Associates, as the best candidate for the kinds of modifications and extensions that would be required. The remainder of the project effort was devoted to designing specific input data files, output files, and user screens and to writing and testing the computer programs that would properly implement the desired features around Energy 20/20 as a core program. This report presents, in outline form, the features and user interface of COMPLEAT.

  15. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also, the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  16. The Utilization of Standard Deviational Ellipse (SDE) Model for the Analysis of Dengue Fever Cases in Banjar City 2013

    Directory of Open Access Journals (Sweden)

    Martya Rahmaniati

    2014-06-01

    Dengue fever is still regarded as an endemic disease in Banjar City. Information is still required to map dengue fever case distribution, the mean center of case distribution, and the direction of dengue fever case dispersion in order to support the surveillance program in relation to the vast area covered by the dengue fever control program. The objective of the research is to obtain information regarding the area of dengue fever distribution in Banjar City by utilizing the Standard Deviational Ellipse (SDE) model. The research is an observational study with Exploratory Spatial Data Analysis (ESDA). Data analysis uses the SDE model with the scope of the entire subdistrict area in Banjar City. The data analyzed are dengue fever cases from the 2007-2013 period, with a sample of 315 cases. The social-demographic overview of dengue fever patients in Banjar City shows that most of the patients are within the productive age range, with 39.7% of school age and 45.7% of working age. Most of the dengue fever patients are men (58.1%). The distribution of dengue fever cases from 2007 until 2012 mostly occurred at 25-37.5 meters above sea level (MASL) (55.8%). The SDE models of dengue fever cases in Banjar City generally form dispersion patterns following the x-axis and clustered by physiographic boundaries. The SDE model can be used to discover dispersion patterns and directions of dengue fever cases; therefore, the dengue fever control program can be conducted based on local-specific information, in order to support health decisions.
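    The standard deviational ellipse itself is a compact computation: mean center, principal orientation, and standard deviations along the rotated axes. The sketch below uses a textbook principal-axis formulation on made-up coordinates; it is not the study's GIS workflow:

```python
import math

def standard_deviational_ellipse(xs, ys):
    """Return the mean center, orientation (radians from the x-axis),
    and standard deviations along the major/minor axes of the SDE."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    dx = [x - mx for x in xs]
    dy = [y - my for y in ys]
    sxx = sum(d * d for d in dx)
    syy = sum(d * d for d in dy)
    sxy = sum(a * b for a, b in zip(dx, dy))
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)  # ellipse rotation
    c, s = math.cos(theta), math.sin(theta)
    sig_major = math.sqrt(sum((a * c + b * s) ** 2 for a, b in zip(dx, dy)) / n)
    sig_minor = math.sqrt(sum((b * c - a * s) ** 2 for a, b in zip(dx, dy)) / n)
    return (mx, my), theta, sig_major, sig_minor

# Perfectly collinear (hypothetical) case coordinates: the ellipse
# degenerates to a line oriented at 45 degrees.
center, angle, major, minor = standard_deviational_ellipse(
    [0, 1, 2, 3, 4], [0, 1, 2, 3, 4])
print(center, round(angle, 4), round(major, 4), round(minor, 10))
```

    Plotting the returned ellipse over a case map gives the dispersion direction described in the abstract.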

  17. Modeling strategy to identify patients with primary immunodeficiency utilizing risk management and outcome measurement.

    Science.gov (United States)

    Modell, Vicki; Quinn, Jessica; Ginsberg, Grant; Gladue, Ron; Orange, Jordan; Modell, Fred

    2017-06-01

    This study seeks to generate analytic insights into risk management and the probability of an identifiable primary immunodeficiency defect. The Jeffrey Modell Centers Network database, Jeffrey Modell Foundation's 10 Warning Signs, the 4 Stages of Testing Algorithm, physician-reported clinical outcomes, programs of physician education and public awareness, the SPIRIT® Analyzer, and newborn screening, taken together, generate P values of less than 0.05. This indicates that the data results do not occur by chance, and that there is a better than 95% probability that the data are valid. The objectives are to improve patients' quality of life, while generating significant reduction of costs. The advances of the world's experts aligned with these JMF programs can generate analytic insights as to risk management and probability of an identifiable primary immunodeficiency defect. This strategy reduces the uncertainties related to primary immunodeficiency risks, as we can screen, test, identify, and treat undiagnosed patients. We can also address regional differences and prevalence, age, gender, treatment modalities, and sites of care, as well as economic benefits. These tools support high net benefits, substantial financial savings, and significant reduction of costs. All stakeholders, including patients, clinicians, pharmaceutical companies, third party payers, and government healthcare agencies, must address the earliest possible precise diagnosis, appropriate intervention and treatment, as well as stringent control of healthcare costs through risk assessment and outcome measurement. An affected patient is entitled to nothing less, and stakeholders are responsible to utilize tools currently available. Implementation offers a significant challenge to the entire primary immunodeficiency community.

  18. Identifying damage locations under ambient vibrations utilizing vector autoregressive models and Mahalanobis distances

    Science.gov (United States)

    Mosavi, A. A.; Dickey, D.; Seracino, R.; Rizkalla, S.

    2012-01-01

    This paper presents a study for identifying damage locations in an idealized steel bridge girder using ambient vibration measurements. A sensitive damage feature is proposed in the context of statistical pattern recognition to address the damage detection problem. The study utilizes an experimental program that consists of a two-span continuous steel beam subjected to ambient vibrations. The vibration responses of the beam are measured along its length under simulated ambient vibrations and different healthy/damaged conditions of the beam. The ambient vibration is simulated using a hydraulic actuator, and damage is induced by cutting portions of the flange at two locations. Multivariate vector autoregressive models were fitted to the vibration response time histories measured at the multiple sensor locations. A sensitive damage feature is proposed for identifying the damage location by applying Mahalanobis distances to the coefficients of the vector autoregressive models. A linear discriminant criterion was used to evaluate the amount of variation in the damage features obtained for different sensor locations with respect to the healthy condition of the beam. The analyses indicate that the highest variations in the damage features coincided with the sensors located closest to the damage. The presented method showed promising sensitivity in identifying the damage location even when the induced damage was very small.
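    The core of the damage feature — autoregressive coefficients compared by Mahalanobis distance — can be sketched as below. For brevity the sketch fits a scalar AR model per sensor rather than the paper's multivariate VAR, and all names and data are illustrative:

```python
import numpy as np

def ar_coeffs(x, order=4):
    """Least-squares AR(order) coefficients of one sensor record:
    x[t] is regressed on x[t-1], ..., x[t-order]."""
    X = np.column_stack(
        [x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def mahalanobis(v, mean, cov):
    """Distance of a coefficient vector from the healthy-state baseline."""
    d = v - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))
```

    In the full procedure, coefficient vectors from many healthy-state records define the baseline `mean` and `cov` for each sensor; a large distance for a new record flags the sensors nearest the damage.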

  19. Impact of the health services utilization and improvement model (HUIM) on self efficacy and satisfaction among a head start population.

    Science.gov (United States)

    Tataw, David B; Bazargan-Hejazi, Shahrzad

    2010-01-01

    The aim of this paper is to evaluate and report the impact of the Health Services Utilization Improvement Model (HUIM) on utilization and satisfaction with care, as well as knowledge regarding prevention, detection, and treatment of asthma, diabetes, tuberculosis, and child injury among low-income health services consumers. HUIM outcomes data show that the coupling of parental education and ecological factors (service linkage and provider orientation) impacts the health services utilization experience of low-income consumers, as evidenced by improved self-efficacy (knowledge and voice) and satisfaction with care from a child's regular provider. Participation in HUIM activities also improved the low-income consumer's knowledge of disease identification, self-care, and prevention.

  20. A Weapon Target Assignment Model Based on Weapon Utility

    Institute of Scientific and Technical Information of China (English)

    王金山; 李伟兵

    2015-01-01

    In order to overcome two problems in optimal weapon assignment, a weapon-target assignment model based on weapon utility is proposed. Through a utility analysis of two types of weapons, reaching the expected kill probability of the target is taken as the starting point of maximum weapon utility, and utility functions are set for the two weapon types. With maximum weapon utility as the criterion, a linear integer programming model for weapon assignment is established, and the results of the two models are compared. Practical results show that the new model solves the assignment quickly, meets battlefield requirements, and yields more reasonable results.
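    For small instances, the assignment that maximizes total weapon utility can be found by exhaustive search, which makes the structure of the integer program easy to see. The utility matrix below is hypothetical, and real solvers use integer programming rather than enumeration:

```python
from itertools import permutations

def best_assignment(utility):
    """Exhaustive search for the one-weapon-per-target assignment that
    maximizes total utility; utility[w][t] is the (made-up) utility of
    weapon w engaging target t."""
    n = len(utility)
    best, best_val = None, float("-inf")
    for perm in permutations(range(n)):
        val = sum(utility[w][t] for w, t in enumerate(perm))
        if val > best_val:
            best, best_val = perm, val
    return best, best_val

u = [[0.8, 0.2, 0.5],
     [0.3, 0.9, 0.4],
     [0.6, 0.7, 0.9]]
assignment, total = best_assignment(u)
print(assignment, round(total, 2))
```

    A linear integer programming formulation replaces the permutation loop with binary decision variables x[w][t] and one-weapon/one-target constraints, which scales far beyond the factorial search shown here.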

  1. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2010. DRDC Toronto CR 2010...externalize their mental model of the assumed solution for critique and correction by others, and whether or not this would assist in ensuring that

  2. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  3. Thermal Modeling and Feedback Requirements for LIFE Neutronic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, J E

    2009-07-15

    An initial study is performed to determine how temperature considerations affect LIFE neutronic simulations. Among other figures of merit, the isotopic mass accumulation, thermal power, tritium breeding, and criticality are analyzed. Possible fidelities of thermal modeling and degrees of coupling are explored. Lessons learned from switching and modifying nuclear datasets are communicated.

  4. Balancing energy development and conservation: A method utilizing species distribution models

    Science.gov (United States)

    Jarnevich, C.S.; Laubhan, M.K.

    2011-01-01

    Alternative energy development is increasing, potentially leading to negative impacts on wildlife populations already stressed by other factors. Resource managers require a scientifically based methodology to balance energy development and species conservation, so we investigated modeling habitat suitability using Maximum Entropy to develop maps that could be used with other information to help site energy developments. We selected one species of concern, the Lesser Prairie-Chicken (LPCH; Tympanuchus pallidicinctus) found on the southern Great Plains of North America, as our case study. LPCH populations have been declining and are potentially further impacted by energy development. We used LPCH lek locations in the state of Kansas along with several environmental and anthropogenic parameters to develop models that predict the probability of lek occurrence across the landscape. The models all performed well as indicated by the high test area under the curve (AUC) scores (all >0.9). The inclusion of anthropogenic parameters in models resulted in slightly better performance based on AUC values, indicating that anthropogenic features may impact LPCH lek habitat suitability. Given the positive model results, this methodology may provide additional guidance in designing future survey protocols, as well as siting of energy development in areas of marginal or unsuitable habitat for species of concern. This technique could help to standardize and quantify the impacts various developments have upon at-risk species. © 2011 Springer Science+Business Media, LLC (outside the USA).
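    The AUC scores reported above have a simple rank interpretation: the probability that a randomly chosen presence (lek) location receives a higher suitability score than a randomly chosen background location, with ties counting half. A minimal sketch with made-up scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via pairwise rank comparison.
    scores_pos: model scores at presence sites; scores_neg: at
    background sites. Illustrative only."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

result = auc([0.9, 0.8, 0.7], [0.6, 0.8, 0.2])
print(result)
```

    An AUC above 0.9, as reported for the LPCH models, means presence sites almost always outrank background sites.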

  5. Knowledge Translation for Research Utilization: Design of a Knowledge Translation Model at Tehran University of Medical Sciences

    Science.gov (United States)

    Majdzadeh, Reza; Sadighi, Jila; Nejat, Saharnaz; Mahani, Ali Shahidzade; Gholami, Jaleh

    2008-01-01

    Introduction: The present study aimed to generate a model that would provide a conceptual framework for linking disparate components of knowledge translation. A theoretical model of such would enable the organization and evaluation of attempts to analyze current conditions and to design interventions on the transfer and utilization of research…

  6. Utility of models of the gastrointestinal tract for assessment of the digestion and absorption of engineered nanomaterials released from food matrices.

    Science.gov (United States)

    Lefebvre, David E; Venema, Koen; Gombau, Lourdes; Valerio, Luis G; Raju, Jayadev; Bondy, Genevieve S; Bouwmeester, Hans; Singh, R Paul; Clippinger, Amy J; Collnot, Eva-Maria; Mehta, Rekha; Stone, Vicki

    2015-05-01

    Engineered metal/mineral, lipid and biochemical macromolecule nanomaterials (NMs) have potential applications in food. Methodologies for the assessment of NM digestion and bioavailability in the gastrointestinal tract are nascent and require refinement. A working group was tasked by the International Life Sciences Institute NanoRelease Food Additive project to review existing models of the gastrointestinal tract in health and disease, and the utility of these models for the assessment of the uptake of NMs intended for food. Gastrointestinal digestion and absorption could be addressed in a tiered approach using in silico computational models, in vitro non-cellular fluid systems and in vitro cell culture models, after which the necessity of ex vivo organ culture and in vivo animal studies can be considered. Examples of NM quantification in gastrointestinal tract fluids and tissues are emerging; however, few standardized analytical techniques are available. Coupling of these techniques to gastrointestinal models, along with further standardization, will further strengthen methodologies for risk assessment.

  7. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  8. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques.

  9. Resource planning for gas utilities: Using a model to analyze pivotal issues

    Energy Technology Data Exchange (ETDEWEB)

    Busch, J.F.; Comnes, G.A.

    1995-11-01

    With the advent of wellhead price decontrols that began in the late 1970s and the development of open-access pipelines in the 1980s and 90s, gas local distribution companies (LDCs) now have increased responsibility for their gas supplies and face an increasingly complex array of supply and capacity choices. Heretofore this responsibility had been shared with the interstate pipelines that provide bundled firm gas supplies. Moreover, gas supply and deliverability (capacity) options have multiplied as the pipeline network becomes increasingly interconnected and as new storage projects are developed. There is now a fully functioning financial market for commodity price hedging instruments and, on interstate pipelines, a secondary market (called capacity release) now exists. As a result of these changes in the natural gas industry, interest in resource planning and computer modeling tools for LDCs is increasing. Although in some ways the planning time horizon has become shorter for the gas LDC, the responsibility conferred on the LDC and the complexity of the planning problem have increased. We examine current gas resource planning issues in the wake of the Federal Energy Regulatory Commission's (FERC) Order 636. Our goal is twofold: (1) to illustrate the types of resource planning methods and models used in the industry and (2) to illustrate some of the key tradeoffs among types of resources, reliability, and system costs. To assist us, we utilize a commercially available dispatch and resource planning model and examine four types of resource planning problems: the evaluation of new storage resources, the evaluation of buyback contracts, the computation of avoided costs, and the optimal tradeoff between reliability and system costs. To make the illustration of methods meaningful yet tractable, we developed a prototype LDC and used it for the majority of our analysis.

  10. On the utilization of hydrological modelling for road drainage design under climate and land use change.

    Science.gov (United States)

    Kalantari, Zahra; Briel, Annemarie; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart

    2014-03-15

    Road drainage structures are often designed using methods that do not consider process-based representations of a landscape's hydrological response. This may create inadequately sized structures as coupled land cover and climate changes can lead to an amplified hydrological response. This study aims to quantify potential increases of runoff in response to future extreme rain events in a 61 km² catchment (40% forested) in southwest Sweden using a physically-based hydrological modelling approach. We simulate peak discharge and water level (stage) at two types of pipe bridges and one culvert, which are commonly used at Swedish road/stream intersections, under combined forest clear-cutting and future climate scenarios for 2050 and 2100. The frequency of changes in peak flow and water level varies with time (seasonality) and storm size. These changes indicate that the magnitude of peak flow and the runoff response are highly correlated to season rather than storm size. In all scenarios considered, the dimensions of the current culvert are insufficient to handle the increase in water level estimated using a physically-based modelling approach. It also appears that the water level at the pipe bridges changes differently depending on the size and timing of the storm events. The findings of the present study and the approach put forward should be considered when planning investigations on and maintenance for areas at risk of high water flows. In addition, the research highlights the utility of physically-based hydrological models to identify the appropriateness of road drainage structure dimensioning.

  11. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform...

  12. Utilizing observations of vegetation patterns to infer ecosystem parameters and test model predictions

    Science.gov (United States)

    Penny, G.; Daniels, K. E.; Thompson, S. E.

    2012-12-01

    Periodic vegetation patterns arise globally in arid and semi-arid environments, and are believed to indicate competing positive and negative feedbacks between resource availability and plant uptake at different length scales. The patterns have become the object of two separate research themes, one focusing on observation of ecosystem properties and vegetation morphology, and another focusing on the development of theoretical models and descriptions of pattern behavior. Given the growing body of work in both directions, there is a compelling need to unify both strands of research by bringing together observations of large-scale pattern morphology with predictions made by various models. Previous attempts have employed spectral analysis on pattern images and inverse modeling on one-dimensional transects of patterns images, yet have not made a concerted effort to rigorously confront predictions with observational data in two dimensions. This study makes the first steps towards unification, utilizing high resolution landscape-scale images of vegetation patterns over multiple years at five different locations, including Niger, Central Mexico, Baja California, Texas, and Australia. Initial analyses of the observed patterns reveal considerable departures from the idealized morphologies predicted by models. Pattern wavelengths, while clustered around a local average, vary through space and are frequently altered by pattern defects such as missing or broken bands. While often locally homogeneous, pattern orientation also varies through space, allowing the correlations between landscape features and changes in local pattern morphology to be explored. Stationarity of the pattern can then be examined by comparing temporal changes in morphology with local climatic fluctuations. Ultimately, by identifying homogeneous regions of coherent pattern, inversion approaches can be applied to infer model parameters and build links between observable pattern and landscape features and the

  13. A novel murine model of Fusarium solani keratitis utilizing fluorescent labeled fungi.

    Science.gov (United States)

    Zhang, Hongmin; Wang, Liya; Li, Zhijie; Liu, Susu; Xie, Yanting; He, Siyu; Deng, Xianming; Yang, Biao; Liu, Hui; Chen, Guoming; Zhao, Huiwen; Zhang, Junjie

    2013-05-01

    Fungal keratitis is a common disease that causes blindness. An effective animal model for fungal keratitis is essential for advancing research on this disease. Our objective is to develop a novel mouse model of Fusarium solani keratitis through the inoculation of fluorescently labeled fungi into the cornea, to facilitate the accurate and early identification and screening of fungal infections. F. solani was used as the model fungus in this study. In the in vitro experiment, the effects of Calcofluor White (CFW) staining concentration and duration on the fluorescence intensity of F. solani were determined through the mean fluorescence intensity (MFI), and the effects of CFW staining on the growth of F. solani were determined by the colony diameter. In the in vivo experiment, F. solani keratitis was induced in mice, which were divided into a CFW-unlabeled and a CFW-labeled group. The positive rate, corneal lesion score and several positive-rate determination methods were measured. The MFIs of F. solani in the 30 μg/ml CFW-30 min, 90 μg/ml CFW-10 min and 90 μg/ml CFW-30 min groups were higher than that in the 10 μg/ml CFW-10 min group (P < 0.05). No significant differences (P > 0.05) were observed for the positive rate or the corneal lesion scores between the CFW-unlabeled and the CFW-labeled group. On days 1 and 2, the positive rates of the infected corneas in the scraping group were lower than those in the fluorescence microscopy group (P < 0.05). Thus, these experiments established a novel murine model of F. solani keratitis utilizing fluorescently labeled fungi. This model facilitates the accurate identification and screening of fungal infections during the early stages of fungal keratitis and provides a novel and reliable technology for studying fungal keratitis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Modeling invasion of metastasizing cancer cells to bone marrow utilizing ecological principles

    Directory of Open Access Journals (Sweden)

    Chen Kun-Wan

    2011-10-01

    Background: The invasion of a new species into an established ecosystem can be directly compared to the steps involved in cancer metastasis. Cancer must grow in a primary site, extravasate and survive in the circulation to then intravasate into the target organ (invasive species survival in transport). Cancer cells often lie dormant at their metastatic site for a long period of time (lag period for invasive species) before proliferating (invasive spread). Proliferation in the new site has an impact on the target organ microenvironment (ecological impact) and eventually the human host (biosphere impact). Results: Tilman has described mathematical equations for the competition between invasive species in a structured habitat. These equations were adapted to study the invasion of cancer cells into the bone marrow microenvironment as a structured habitat. A large proportion of solid tumor metastases are bone metastases, known to usurp hematopoietic stem cell (HSC) homing pathways to establish footholds in the bone marrow. This required accounting for the fact that this is the natural home of hematopoietic stem cells and that they already occupy this structured space. The adapted Tilman model of invasion dynamics is especially valuable for modeling the lag period or dormancy of cancer cells. Conclusions: The Tilman equations for modeling the invasion of two species into a defined space have been modified to study the invasion of cancer cells into the bone marrow microenvironment. These modified equations allow a more flexible way to model the space competition between the two cell species. The ability to model initial density, metastatic seeding into the bone marrow and growth once the cells are present, and movement of cells out of the bone marrow niche and apoptosis of cells are all aspects of the adapted equations. These equations are currently being applied to clinical data sets for verification and further refinement of the models.
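    A minimal numerical sketch of the underlying two-species Tilman competition equations (superior resident competitor p1, invading species p2) is given below. The parameter values are hypothetical, chosen only so that a small invading seed persists at low occupancy; the paper's adapted equations add further terms for seeding, niche exit, and apoptosis:

```python
def tilman_step(p1, p2, c1, c2, m1, m2, dt):
    """One forward-Euler step of Tilman's hierarchical two-species
    competition model for occupied-site fractions p1 (resident) and
    p2 (invader); c = colonization rate, m = mortality rate."""
    dp1 = c1 * p1 * (1 - p1) - m1 * p1
    dp2 = c2 * p2 * (1 - p1 - p2) - m2 * p2 - c1 * p1 * p2
    return p1 + dt * dp1, p2 + dt * dp2

p1, p2 = 0.5, 0.01  # resident occupancy, small invading seed
for _ in range(5000):
    p1, p2 = tilman_step(p1, p2, c1=0.4, c2=2.0, m1=0.1, m2=0.1, dt=0.05)
print(round(p1, 3), round(p2, 3))
```

    With these values the resident settles at p1 = 1 - m1/c1 = 0.75 while the invader, despite being competitively inferior, persists at a small equilibrium fraction — the kind of low-density coexistence used above to model tumor-cell dormancy in the marrow niche.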

  15. Parameterization of a rainfall-runoff model based on the utility of the forecasts for a specific stakeholder

    Science.gov (United States)

    Cappelletti, Matteo; Toth, Elena

    2016-04-01

    The work presents the application of a new method for the calibration of a hydrological rainfall-runoff model, based on the use of utility functions. The utility function is defined on the basis of the specific purpose of the desired predictions, according to the needs of the stakeholders that will use them: in the present case, the purpose is the identification of the future streamflow occurrences that will surpass an assigned threshold runoff, thus helping the stakeholder in decisions concerning the issuance of flood watches and warnings in the operation of a flood forecasting system. The chosen utility function is based on both the absolute error of the model and the values of the observed streamflow. In an application to a mid-sized mountain watershed in Tuscany (Italy), the model response was also studied, as a term of comparison, using traditional mono- and multi-objective calibration approaches. The results, evaluated also using skill scores based on false and missed alarms as well as on the probability of detection and the frequency of hits of the threshold runoff (widely adopted when assessing the value of both meteorological and hydrological forecasts in real-world flood warning systems), demonstrate that the proposed approach may improve model performance, compared with traditional mono-objective and multi-objective calibration procedures, with respect to the actual utility of the forecasts for a specific stakeholder.
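    The threshold-exceedance skill scores mentioned above (probability of detection, false alarms) reduce to a contingency table over observed and simulated flows. A sketch with made-up flow series, not the study's data or its full utility function:

```python
def threshold_skill(obs, sim, threshold):
    """Contingency-table scores for threshold exceedance: probability
    of detection (POD) and false-alarm ratio (FAR). The paper's utility
    function additionally weights the model's absolute error by the
    observed flow."""
    hits = sum(1 for o, s in zip(obs, sim) if o > threshold and s > threshold)
    misses = sum(1 for o, s in zip(obs, sim) if o > threshold and s <= threshold)
    false_alarms = sum(1 for o, s in zip(obs, sim) if o <= threshold and s > threshold)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return pod, far

obs = [10, 120, 300, 80, 250, 40]   # hypothetical observed flows
sim = [20, 150, 280, 130, 90, 30]   # hypothetical simulated flows
pod, far = threshold_skill(obs, sim, 100)
print(round(pod, 3), round(far, 3))
```

    Calibrating against a utility function that rewards high POD and penalizes false alarms steers the parameters toward the stakeholder's warning-issuance needs rather than average error alone.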

  16. Theoretical modeling of a self-referenced dual mode SPR sensor utilizing indium tin oxide film

    Science.gov (United States)

    Srivastava, Sachin K.; Verma, Roli; Gupta, Banshi D.

    2016-06-01

    A prism-based dual-mode SPR sensor was theoretically modeled to work as a self-referenced sensor in the spectral interrogation scheme. Self-referenced sensing was achieved by sandwiching an indium tin oxide (ITO) thin film between the prism base and the metal layer. The proposed sensor possesses two plasmon modes similar to long- and short-range SPRs (LR- and SR-SPRs), and we refer to them as LRSPR and SRSPR by analogy. However, these modes do not possess the usual long-range character, owing to the losses introduced by the imaginary part of the ITO dielectric function. One of the two plasmon modes responds to changes in analyte refractive index while the other remains fixed. The influence of various design parameters on the performance of the sensor was evaluated. The performance of the proposed sensor was compared, via control simulations, with established dual-mode geometries utilizing silicon dioxide (SiO2), Teflon AF-1600 and Cytop. The design parameters of the established geometries were optimized to obtain self-referenced sensing operation. Trade-offs between the resonance spectral width, minimum reflectivity, shift in resonance wavelength and angle of incidence were examined for optimal design. The present study will be useful in the fabrication of self-referenced sensors for settings where ambient conditions are not stable.

  17. Biomimetic peptide-based models of [FeFe]-hydrogenases: utilization of phosphine-containing peptides.

    Science.gov (United States)

    Roy, Souvik; Nguyen, Thuy-Ai D; Gan, Lu; Jones, Anne K

    2015-09-07

    Two synthetic strategies for incorporating diiron analogues of [FeFe]-hydrogenases into short peptides via phosphine functional groups are described. First, utilizing the amine side chain of lysine as an anchor, phosphine carboxylic acids can be coupled via amide formation to resin-bound peptides. Second, artificial, phosphine-containing amino acids can be directly incorporated into peptides via solution phase peptide synthesis. The second approach is demonstrated using three amino acids each with a different phosphine substituent (diphenyl, diisopropyl, and diethyl phosphine). In total, five distinct monophosphine-substituted, diiron model complexes were prepared by reaction of the phosphine-peptides with diiron hexacarbonyl precursors, either (μ-pdt)Fe2(CO)6 or (μ-bdt)Fe2(CO)6 (pdt = propane-1,3-dithiolate, bdt = benzene-1,2-dithiolate). Formation of the complexes was confirmed by UV/Vis, FTIR and (31)P NMR spectroscopy. Electrocatalysis by these complexes is reported in the presence of acetic acid in mixed aqueous-organic solutions. Addition of water results in enhancement of the catalytic rates.

  18. A Cold Model Aerodynamical Test of Air-Staged Combustion in a Tangential Firing Utility Boiler

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui-juan; HUI Shi-en; ZHOU Qu-lan

    2007-01-01

    The purpose of this paper is to present the flow field in a 300 MW tangential firing utility boiler that uses the Low NOx Concentric Firing System (LNCFS). The cold isothermal simulation method ensures geometric and boundary-condition similarity, while the condition of self-modeling is also met. The experimental results show that when the secondary air deflection angle increases, the mixing of primary and secondary air becomes slower, the average turbulence magnitude of the main combustion zone decreases, and the relative diameter of the tangential firing circle enlarges. When the velocity pressure ratio of the secondary air to the primary air (p2/p1) increases, the mixing of the secondary and primary air becomes stronger, the average turbulence magnitude of the main combustion zone increases, and the relative diameter of the tangential firing circle becomes larger. Because the over fire air (OFA), laid out near the wall, has powerful penetration, the relative diameter of the tangential firing circle on the OFA section is very small, but the average turbulence magnitude is large. When the velocity pressure ratio of the OFA to the primary air (pOFA/p1) increases, the relative diameter of the tangential firing circle on the OFA section grows only slightly, the average turbulence magnitude becomes larger, and the penetration of the OFA becomes more powerful.

  19. Research utilization in the building industry: decision model and preliminary assessment

    Energy Technology Data Exchange (ETDEWEB)

    Watts, R.L.; Johnson, D.R.; Smith, S.A.; Westergard, E.J.

    1985-10-01

    The Research Utilization Program was conceived as a far-reaching means for managing the interactions of the private sector and the federal research sector as they deal with energy conservation in buildings. The program emphasizes a private-public partnership in planning a research agenda and in applying the results of ongoing and completed research. The results of this task support the hypothesis that the transfer of R and D results to the buildings industry can be accomplished more efficiently and quickly by a systematic approach to technology transfer. This systematic approach involves targeting decision makers, assessing research and information needs, properly formatting information, and then transmitting the information through trusted channels. The purpose of this report is to introduce elements of a market-oriented knowledge base, which would be useful to the Building Systems Division, the Office of Buildings and Community Systems and their associated laboratories in managing a private-public research partnership on a rational, systematic basis. This report presents conceptual models and data bases that can be used in formulating a technology transfer strategy and in planning technology transfer programs.

  20. Optimal urban water conservation strategies considering embedded energy: coupling end-use and utility water-energy models.

    Science.gov (United States)

    Escriva-Bou, A.; Lund, J. R.; Pulido-Velazquez, M.; Spang, E. S.; Loge, F. J.

    2014-12-01

    Although most freshwater resources are used in agriculture, a greater amount of energy is consumed per unit of water supplied to urban areas. Efforts to reduce the carbon footprint of water in cities, including the energy embedded within household uses, can therefore be an order of magnitude larger than for other water uses. This characteristic of urban water systems creates a promising opportunity to reduce global greenhouse gas emissions, particularly given rapidly growing urbanization worldwide. Based on a previous Water-Energy-CO2 emissions model for household water end uses, this research introduces a probabilistic two-stage optimization model, with technical and behavioral decision variables, to obtain the most economical strategies for minimizing household water and water-related energy bills under both water and energy price shocks. Results show that adoption rates of less energy-intensive appliances increase significantly, resulting in an overall 20% growth in indoor water conservation when household dwellers account for the energy cost of their water use. To analyze the consequences at utility scale, we developed an hourly water-energy model based on data from East Bay Municipal Utility District (EBMUD) in California, including residential consumption, finding that water end uses account for roughly 90% of total water-related energy, but that the 10% managed by the utility is worth over $12 million annually. Once the entire end-use + utility model was completed, several demand-side management conservation strategies were simulated for the city of San Ramon. In this smaller water district, roughly 5% of total EBMUD water use, we found that the optimal household strategies can reduce total GHG emissions by 4% and the utility's energy cost by over $70,000/yr. 
Especially interesting from the utility perspective could be the "smoothing" of water use peaks by avoiding daytime irrigation, which among other benefits might reduce utility energy costs by 0.5% according to our
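The household-level coupling at the heart of the model can be sketched as a bill that includes the energy embedded in water use; every price and the energy-intensity figure below are invented for illustration, not values from the study.

```python
def household_water_energy_bill(volume_m3, water_price=2.5,
                                energy_price=0.2, kwh_per_m3=3.0):
    """Monthly bill including the energy embedded in household water use
    (e.g. water heating): bill = water cost + cost of water-related energy.
    All parameter values are illustrative assumptions ($/m3, $/kWh, kWh/m3)."""
    return volume_m3 * (water_price + energy_price * kwh_per_m3)

bill = household_water_energy_bill(10.0)  # 10 m3 of monthly use
```

Because the energy term scales with volume, any conservation measure that cuts hot-water volume reduces both terms at once, which is why including embedded energy raises the modeled adoption of water-saving appliances.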

  1. Power Utility Maximization in an Exponential Lévy Model Without a Risk-free Asset

    Institute of Scientific and Technical Information of China (English)

    Qing Zhou

    2005-01-01

    We consider the problem of maximizing the expected power utility from terminal wealth in a market where logarithmic securities prices follow a Lévy process. By Girsanov's theorem, we give explicit solutions for the power utility of undiscounted terminal wealth in terms of the Lévy–Khintchine triplet.
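In standard notation (not taken verbatim from the paper), the objective and the Lévy–Khintchine characteristic exponent through which the explicit solutions are expressed are:

```latex
% Power utility with risk-aversion parameter p \in (0,1):
U(x) = \frac{x^{p}}{p}, \qquad \text{maximize } \mathbb{E}\!\left[ U(X_T) \right].
% Characteristic exponent of the log-price L\'evy process L_t,
% with L\'evy--Khintchine triplet (\gamma, \sigma^2, \nu):
\psi(u) = i\gamma u - \frac{\sigma^{2}u^{2}}{2}
  + \int_{\mathbb{R}} \left( e^{iux} - 1 - iux\,\mathbf{1}_{\{|x|\leq 1\}} \right) \nu(\mathrm{d}x)
```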

  2. Studies on Models, Patterns and Requirements of Digestible Amino Acids for Layers by Nitrogen Metabolism

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Nitrogen (N) metabolism experiments were made to estimate the amino acid requirements of 43-48-week-old layers, separately for maintenance and for protein accretion, in order to establish models for estimating digestible amino acid requirements. The regression relationship of nitrogen retention vs amino acid intake was estimated for each amino acid by feeding semi-synthetic diets, each specifically deficient in one amino acid, at N intake rates of 0.91, 0.52, 0.15 and 0.007 g.kg-1 body weight (W0.75) per day. From the regression coefficients, it was calculated that, for the accretion of 1 g protein, the dietary digestible amino acid requirements were (mg): Thr 63.1, Val 100.4, Met 39.9, Ile 88.6, Leu 114.3, Phe 63.2, Lys 87.0, His 20.5, Arg 87.9, Trp 21.4, Met+Cys 77.6, and Phe+Tyr 114.3. Daily amino acid requirements for N equilibrium were estimated to be (mg.kg-1 W0.75 per day): Thr 50.6, Val 74.7, Met 30.3, Ile 66.7, Leu 81.4, Phe 44.8, Lys 60.5, His 14.7, Arg 73.9, Trp 17.3, Met+Cys 58.6, and Phe+Tyr 83.9. The dietary digestible amino acid patterns for protein accretion and N equilibrium were also proposed, and models for estimating digestible amino acid requirements for different production levels were developed.
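The estimation procedure can be sketched as a linear regression of N retention on amino acid intake: the slope gives the amino acid needed per unit of N retained, and the x-intercept gives the intake needed for N equilibrium. The intake and retention values below are invented, and the factor 6.25 is the standard N-to-protein conversion (an assumption, since the abstract does not state it).

```python
import numpy as np

# Invented data: digestible lysine intake (mg/kg W0.75 per day) vs nitrogen
# retention (g/kg W0.75 per day) at four dietary N levels.
intake = np.array([20.0, 60.0, 100.0, 140.0])
n_retention = np.array([-0.25, 0.0, 0.25, 0.50])

# Linear regression: retention = a * intake + b
a, b = np.polyfit(intake, n_retention, 1)

# Intake needed for N equilibrium (retention = 0): the x-intercept
intake_at_equilibrium = -b / a

# mg amino acid required per g of protein accreted (protein = N * 6.25)
mg_per_g_protein = (1.0 / a) / 6.25
```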

  3. Dissecting genetic requirements of human breast tumorigenesis in a tissue transgenic model of human breast cancer in mice.

    Science.gov (United States)

    Wu, Min; Jung, Lina; Cooper, Adrian B; Fleet, Christina; Chen, Lihao; Breault, Lyne; Clark, Kimberly; Cai, Zuhua; Vincent, Sylvie; Bottega, Steve; Shen, Qiong; Richardson, Andrea; Bosenburg, Marcus; Naber, Stephen P; DePinho, Ronald A; Kuperwasser, Charlotte; Robinson, Murray O

    2009-04-28

    Breast cancer development is a complex pathobiological process involving sequential genetic alterations in normal epithelial cells that result in uncontrolled growth in a permissive microenvironment. Accordingly, physiologically relevant models of human breast cancer that recapitulate these events are needed to study cancer biology and evaluate therapeutic agents. Here, we report the generation and utilization of the human breast cancer in mouse (HIM) model, which is composed of genetically engineered primary human breast epithelial organoids and activated human breast stromal cells. By using this approach, we have defined key genetic events required to drive the development of human preneoplastic lesions as well as invasive adenocarcinomas that are histologically similar to those in patients. Tumor development in the HIM model proceeds through defined histological stages, from hyperplasia and ductal carcinoma in situ (DCIS) to invasive carcinoma. Moreover, HIM tumors display characteristic responses to targeted therapies, such as HER2 inhibitors, further validating the utility of these models in preclinical compound testing. The HIM model is an experimentally tractable human in vivo system that holds great potential for advancing our basic understanding of cancer biology and for the discovery and testing of targeted therapies.

  4. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    DEFF Research Database (Denmark)

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard;

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released during biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge of their decomposition rate...... and product distribution under high-temperature conditions. In the present work, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied in a fast-heating-rate thermogravimetric analyzer to derive a kinetic model describing the process. The yields of SO2 and SO3...... of the different sulfates indicated that ammonium sulfate has clearly the strongest sulfation power towards KCl at temperatures below 800°C, whereas the sulfation power of ferric and aluminum sulfates clearly exceeds that of ammonium sulfate between 900 and 1000°C. However, feeding gaseous SO3 was found to be most...

  5. Forecasting Model of Coal Requirement Quantity Based on Grey System Theory

    Institute of Scientific and Technical Information of China (English)

    孙继湖

    2001-01-01

    Generally used methods of forecasting coal requirement quantity include the analogy method, the extrapolation method and the cause-effect analysis method. However, the precision of the forecasts produced by these methods is low. This paper uses grey system theory and sets up a grey forecasting model, GM(1,3), for coal requirement quantity. The forecast for the Chinese coal requirement quantity coincides with the actual values, which shows that the model is reliable. Finally, the model is used to forecast the Chinese coal requirement quantity for the next ten years.
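The abstract does not show the model equations. A sketch of the single-series GM(1,1), of which GM(1,3) is the extension with two additional factor series, illustrates the core grey-modeling mechanics (accumulated generating operation plus a least-squares fit of the grey differential equation); the demand series is invented.

```python
import numpy as np

def gm11_forecast(x0, horizon):
    """GM(1,1) grey forecasting sketch. The paper uses GM(1,3), which adds two
    explanatory factor series to the same framework; the single-series form
    below shows the grey-modeling mechanics."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                      # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])           # background (mean-generated) values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # x0[k] = -a*z1[k] + b
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    return np.diff(np.concatenate([[0.0], x1_hat]))    # inverse AGO

x0 = [100.0, 110.0, 121.0, 133.1]           # invented coal-demand series
pred = gm11_forecast(x0, horizon=2)         # fitted values plus 2-step forecast
```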

  6. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  7. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases.

    Science.gov (United States)

    Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-05-01

    To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source systems; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria applied using the protocol, and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  8. Integration of symptom ratings from multiple informants in ADHD diagnosis: a psychometric model with clinical utility.

    Science.gov (United States)

    Martel, Michelle M; Schimmack, Ulrich; Nikolas, Molly; Nigg, Joel T

    2015-09-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition explicitly requires that attention-deficit/hyperactivity disorder (ADHD) symptoms be apparent across settings, taking into account reports from multiple informants. Yet it provides no guidelines on how information from different raters should be combined in ADHD diagnosis. We examined the validity of different approaches using structural equation modeling (SEM) for multiple-informant data. Participants were 725 children, 6 to 17 years old, and their primary caregivers and teachers, recruited from the community and completing a thorough research-based diagnostic assessment, including a clinician-administered diagnostic interview, parent and teacher standardized rating scales, and cognitive testing. A best-estimate ADHD diagnosis was generated by a diagnostic team. An SEM model demonstrated convergent validity among raters. We found relatively weak symptom-specific agreement among raters, suggesting that a general average scoring algorithm is preferable to symptom-specific scoring algorithms such as the "or" and "and" algorithms. Finally, to illustrate the validity of this approach, we show that averaging makes it possible to reduce the number of items from 18 to 8 without a significant decrease in validity. In conclusion, information from multiple raters increases the validity of ADHD diagnosis, and averaging appears to be the optimal way to integrate information from multiple raters. (c) 2015 APA, all rights reserved.
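A minimal sketch of the three combination rules compared in the study, applied to invented 0-3 symptom ratings (the study's actual scales and cutoffs may differ):

```python
import numpy as np

def combine_informants(parent, teacher, threshold=2):
    """Three ways to combine two informants' 0-3 symptom ratings (illustrative):
    'or'  -> symptom counts if either rater endorses it at/above threshold,
    'and' -> both raters must endorse it,
    'average' -> mean rating across raters (the rule favored by the SEM results)."""
    parent = np.asarray(parent)
    teacher = np.asarray(teacher)
    n_or = int(((parent >= threshold) | (teacher >= threshold)).sum())
    n_and = int(((parent >= threshold) & (teacher >= threshold)).sum())
    avg_score = float(((parent + teacher) / 2.0).mean())
    return n_or, n_and, avg_score

# Invented ratings on four inattention symptoms
n_or, n_and, avg = combine_informants([3, 0, 2, 1], [1, 2, 2, 0])
```

The "or" rule inflates the symptom count and the "and" rule deflates it; the averaged score retains the graded information from both raters, which is the intuition behind the paper's conclusion.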

  9. Evaluation Capacity Building in the Context of Military Psychological Health: Utilizing Preskill and Boyle's Multidisciplinary Model

    Science.gov (United States)

    Hilton, Lara; Libretto, Salvatore

    2017-01-01

    The need for evaluation capacity building (ECB) in military psychological health is apparent in light of the proliferation of newly developed, yet untested programs coupled with the lack of internal evaluation expertise. This study addresses these deficiencies by utilizing Preskill and Boyle's multidisciplinary ECB model within a post-traumatic…

  10. Fire rehabilitation decisions at landscape scales: utilizing state-and-transition models developed through disturbance response grouping of ecological sites

    Science.gov (United States)

    Recognizing the utility of ecological sites and the associated state-and-transition model (STM) for decision support, the Bureau of Land Management in Nevada partnered with Nevada NRCS and the University of Nevada, Reno (UNR) in 2009 with the goal of creating a team that could (1) expedite developme...

  11. Evaluating neurology CME in two educational methods using Patton's utilization focused model.

    Science.gov (United States)

    Vakani, Farhan; Ahmad, Amina; Sonawalla, Aziz; Sheerani, Mughis

    2013-01-01

    Generally, in continuing medical education (CME), most of the time is consumed in the planning and preparation of the event. This planning and preparation, however, needs recognition through an evaluative process. The purpose of this study was to evaluate a neurology CME delivered by two educational methods, lecture vs task-based learning (TBL), using Patton's utilization-focused model. This was an observational, cross-sectional inquiry. The questionnaire evaluated educational elements such as learning objectives met, content covered, presentations at the level of understanding, level of interaction, knowledge gained, time management, queries responded to, organisation, quality of learning material and overall grading of the educational event. General practitioners were the key participants in this evaluation and consisted of 60 self-selected physicians distributed equally between the TBL and lecture groups. Patton's utilization-focused model was used to produce findings for effective decision making. The data were analysed using the Mann-Whitney U test to identify the learning method that satisfied the most participants. A total of 58 evaluations were returned, 29 from the TBL group and 29 from the lecture group. The analysis showed higher mean ranks for the TBL method, ranging between 32.2 and 38.4, versus the lecture (20.6-26.8). Most of the elements assessed were statistically significant (p < 0.05), except time management (p = 0.22); elements such as 'objectives of the activity met' (p = 0.07), 'overall grading of the event' (p = 0.06) and 'presentations at the level of understanding' (p = 0.06) were borderline. Of the 29 respondents in the TBL group, 75% rated all the elements of the program above very good. In the lecture group, 22 (75%) respondents out of 29 rated almost half of the elements above very good. A majority of respondents in the TBL group rated all program elements as exceptional compared to the lecture group, in which only half of the
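The comparison can be sketched with a hand-rolled Mann-Whitney U statistic (ties counted as half) and its large-sample z approximation; the 1-5 satisfaction ratings below are invented for illustration.

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U (count of pairs where x beats y, ties = 0.5) plus a
    large-sample z approximation without tie correction (a simplification)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    n1, n2 = len(x), len(y)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return u, (u - mu) / sigma

# Invented 1-5 ratings for one evaluated element
tbl = [5, 4, 5, 5, 4, 5, 4, 5]
lecture = [3, 4, 3, 2, 4, 3, 3, 4]
u, z = mann_whitney_u(tbl, lecture)   # z > 1.96 implies p < 0.05 (two-sided)
```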

  12. The Utilization of Standard Deviational Ellipse (SDE) Model for the Analysis of Dengue Fever Cases in Banjar City 2013

    OpenAIRE

    Martya Rahmaniati; Tris Eryando; Dewi Susanna; Dian Pratiwi; Fajar Nugraha; Andri Ruliansah; Muhammad Umar Riandi

    2014-01-01

    Dengue fever is still regarded as an endemic disease in Banjar City. Information is still required to map the dengue fever case distribution, the mean center of the case distribution, and the direction of case dispersion, in order to support the surveillance program across the vast area covered by the dengue fever control program. The objective of the research is to obtain information regarding the area of dengue fever case distribution in Banjar City by utilizing the S...
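A sketch of the standard deviational ellipse (SDE) computation via the eigen-decomposition of the coordinate covariance matrix, which is equivalent to the classical SDE formulas; the case coordinates are invented.

```python
import numpy as np

def standard_deviational_ellipse(points):
    """Mean center plus SDE axes and orientation for point case locations.
    The eigenvectors of the coordinate covariance matrix give the ellipse axes;
    the square roots of the eigenvalues give the standard distances."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts.T, bias=True)            # population covariance of (x, y)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    axes = np.sqrt(eigvals)                   # minor, major standard distances
    major = eigvecs[:, -1]
    angle_deg = np.degrees(np.arctan2(major[1], major[0]))  # major-axis bearing
    return center, axes, angle_deg

# Invented case coordinates (projected km), elongated east-west
cases = [(-2.0, 0.0), (2.0, 0.0), (0.0, 1.0), (0.0, -1.0)]
center, axes, angle = standard_deviational_ellipse(cases)
```

The mean center locates the centroid of cases, while the major-axis bearing indicates the direction of dispersion, the two summaries the abstract says the surveillance program needs.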

  13. Glutathione Utilization by Candida albicans Requires a Functional Glutathione Degradation (DUG) Pathway and OPT7, an Unusual Member of the Oligopeptide Transporter Family

    Science.gov (United States)

    Desai, Prashant Ramesh; Thakur, Anil; Ganguli, Dwaipayan; Paul, Sanjoy; Morschhäuser, Joachim; Bachhawat, Anand K.

    2011-01-01

    Candida albicans lacks the ability to survive within its mammalian host in the absence of endogenous glutathione biosynthesis. To examine the ability of this yeast to utilize exogenous glutathione, we exploited the organic sulfur auxotrophy of C. albicans met15Δ strains. We observed that glutathione is utilized efficiently by the alternative pathway of glutathione degradation (DUG pathway). The major oligopeptide transporters OPT1–OPT5 of C. albicans that were most similar to the known yeast glutathione transporters were not found to contribute to glutathione transport to any significant extent. A genomic library approach to identify the glutathione transporter of C. albicans yielded OPT7 as the primary glutathione transporter. Biochemical studies on OPT7 using radiolabeled GSH uptake revealed a Km of 205 μM, indicating that it is a high-affinity glutathione transporter. OPT7 is unusual in several aspects. It is the member most remote from the known yeast glutathione transporters, lacks the two highly conserved cysteines in the family that are known to be crucial in trafficking, and also has the ability to take up tripeptides. The transporter was regulated by sulfur sources in the medium. OPT7 orthologues were prevalent among many pathogenic yeasts and fungi and formed a distinct cluster quite remote from the Saccharomyces cerevisiae HGT1 glutathione transporter cluster. In vivo experiments using a systemic model of candidiasis failed to detect expression of OPT7 in vivo, and strains disrupted either in the degradation (dug3Δ) or transport (opt7Δ) of glutathione failed to show a defect in virulence. PMID:21994941
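The reported Km can be read through the Michaelis-Menten rate law; Vmax is normalized to 1 below, which is an assumption for illustration (the abstract does not report a Vmax).

```python
def opt7_uptake(s_um, km_um=205.0, vmax=1.0):
    """Michaelis-Menten uptake velocity for GSH transport. Km = 205 uM is the
    value reported for OPT7; Vmax is normalized to 1 (an assumption)."""
    return vmax * s_um / (km_um + s_um)

v_half = opt7_uptake(205.0)   # at S = Km the rate is half of Vmax
v_low = opt7_uptake(20.5)     # well below Km the rate is near-linear in S
```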

  14. OAM system based on TMN for utility telecommunication network. Proposal of modeling method about managed objects; TMN ni motozuku denryoku tsushinmo no un`yo kanri system. Kanri object no sekkei shuho ni kansuru kento

    Energy Technology Data Exchange (ETDEWEB)

    Hirozawa, T.; Yusa, H.; Otani, T. [Central Research Institute of Electric Power Industry, Tokyo (Japan); Okamura, K. [Tokyo Electric Power Co. Inc., Tokyo (Japan)

    1996-03-01

    To construct an advanced operation and management system for a utility telecommunications management network (TMN), this paper proposes a modeling method for the managed objects (MOs) required by managing and managed systems, such as an asynchronous transfer mode (ATM) exchange. Flexible line setting and path switching control are required for the advanced TMN, which must cope flexibly with the extension and modification of functions. The assignment of roles between managing and managed sides was determined. Then, structures of objects such as facilities and logic data, and their interactions, were modeled. Common management functions and the objects of each function were classified. Based on the TMN standard and the existing utility-specific MO designs, new utility-specific MOs were defined according to the models. The existing MOs can be utilized effectively, and the optimum MOs can be incorporated. The utility-specific MOs are added to the common specification of the electric power industry. Since they can be reused when functions are extended or modified, cost can be reduced. MOs applicable to utility path switching control were designed as a trial. 9 refs., 16 figs., 10 tabs.

  15. Uncertainty in photochemical modeling results from using seasonal estimates vs day-specific emissions inputs for utility sources in an urban airshed in the northeast

    Energy Technology Data Exchange (ETDEWEB)

    Arunachalam, S.; Georgopoulos, P.G. [Rutgers, the State Univ. of New Jersey, Piscataway, NJ (United States)

    1996-12-31

    Design and development of robust ozone control strategies through photochemical modeling studies depend to a large extent on the quality of the emissions inputs that are used. A key issue in the quality of the emissions inventory is the choice between using day-specific information and seasonal estimates for emissions from major utilities in the modeling domain of interest. Emissions of NO{sub x} from electric utilities constitute more than a third of the total NO{sub x} emissions from all sources in a typical urban modeling domain, and hence it is important that the emissions from these sources be characterized as accurately as possible in the photochemical model. Since a considerable amount of resources is required to develop regional or urban-level emissions inventories for modeling purposes, one has to accept the level of detail that can be incorporated in a given modeling inventory and try to develop optimal control strategies based on those inputs. The sensitivity of the model to the differences in emissions inputs mentioned above is examined in the New Jersey-Philadelphia-Delaware Valley Urban Airshed Model State Implementation Plan (SIP) application for two ozone episodes that occurred in the Northeastern US: July 6-8, 1988 and July 18-20, 1991. Day-specific emissions information was collected for a major portion of the elevated point sources within the domain for these two episodes, and various metrics, besides the daily maximum one-hour averaged ozone predictions, are compared from model predictions for the two cases. Such comparative studies will bring into focus the presence of a weekend effect, if any, and differences between weekday and weekend emissions can also be tested with the model, using the same meteorology. Understanding the impact of this difference will lead to a better design of sensitivity-uncertainty simulations and can lead to the development of robust emission control strategies as well.

  16. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    Science.gov (United States)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCM) are potentially capable of integrating different aspects of decision support for enterprise management tasks. The aim of the paper is to propose a hybrid mathematical programming model for the optimization of production requirement resource planning. The preliminary model was conceived bottom-up from a real industrial case, and is oriented to maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimization brought good results for the objective function.
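The paper's formulation is not given in the abstract; a toy single-period instance (invented margins, capacity, and working-capital limit) shows the flavor of maximizing cash flow under a financial constraint, here solved with SciPy's linear-programming routine.

```python
from scipy.optimize import linprog

# Toy MRP-style decision (illustrative numbers, not from the paper):
# choose production quantities x1, x2 to
#   maximize cash flow = 40*x1 + 30*x2     (unit contribution margins)
#   s.t.  2*x1 +  1*x2 <= 100              (machine hours)
#        25*x1 + 20*x2 <= 1500             (working capital for materials)
res = linprog(c=[-40.0, -30.0],            # linprog minimizes, so negate
              A_ub=[[2.0, 1.0], [25.0, 20.0]],
              b_ub=[100.0, 1500.0],
              bounds=[(0, None), (0, None)])
x1, x2 = res.x
max_cash_flow = -res.fun                   # optimum at x1 = x2 = 100/3
```

In the paper's setting the same structure recurs over multiple periods and products, with the financial constraint linking periods through the cash balance.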

  17. Utilization of similitude relationships to obtain a compact transformer model useful for technical and economical networks studies

    Energy Technology Data Exchange (ETDEWEB)

    Pierrat, L. [EDF-CNRS, Div. Technique Generale, Grenoble Cedex (France); Resende, M.J.; Santana, J. [IST-Seccao Maquinas Electricas e Elec. Potencia, Lisboa Codex (Portugal)

    1995-12-01

    Nowadays, utilities concentrate their efforts on the rationalized utilization of resources, in particular financial ones. In the electrical domain, investments are made in expensive, long-lived equipment, which means that specific long-life characteristics must be taken into account, some of them deterministic and some carrying a high degree of uncertainty. A significant problem within this trend is the renewal of MV/LV distribution transformers: the optimal choice of their rated power and renewal moment depends upon the evolution of consumer demand and the operating conditions. This paper proposes similitude relationships to define the typical parameters present in thermal models and, consequently, in life-expectancy models of distribution transformers. 8 refs, 6 figs, 2 tabs

  18. Adolescent idiopathic scoliosis screening for school, community, and clinical health promotion practice utilizing the PRECEDE-PROCEED model

    Directory of Open Access Journals (Sweden)

    Wyatt Lawrence A

    2005-11-01

    Full Text Available Abstract Background Screening for adolescent idiopathic scoliosis (AIS) is a commonly performed procedure for school children during the high-risk years. The PRECEDE-PROCEED (PP) model is a health promotion planning model that has not been utilized for the clinical diagnosis of AIS. The purpose of this research is to study AIS in the school-age population using the PP model and to assess its relevance for community, school, and clinical health promotion. Methods MEDLINE was utilized to locate AIS data. Studies were screened for relevance and applicability under the auspices of the PP model. Where data were unavailable, expert opinion was utilized based on consensus. Results The social assessment of quality of life is limited, with few studies approaching the long-term effects of AIS. Epidemiologically, AIS is the most common form of scoliosis and the leading orthopedic problem in children. Behavioral/environmental studies focus on discovering etiologic relationships, yet these data are confounded because AIS is not a behavioral disorder. Illness and parenting health behaviors can, however, be appreciated. The educational diagnosis is likewise confounded because AIS is an orthopedic disorder, not a behavioral one. The administrative/policy diagnosis is hindered in that scoliosis screening programs are not considered cost-effective, although policies are determined in some schools because 26 states mandate school scoliosis screening. There exists potential for error with the Adam's test. The most widely used measure in the PP model, the Health Belief Model, has not been utilized in any AIS research. Conclusion The PP model is a useful tool for a comprehensive study of a particular health concern. This research showed where gaps in AIS research exist, suggesting that there may be problems in the implementation of school screening. Until these research disparities are filled, implementation of AIS screening by school, community, and clinical health promotion will be compromised. 
Lack of data and perceived importance by

  19. Utilizing ARC EMCS Seedling Cassettes as Highly Versatile Miniature Growth Chambers for Model Organism Experiments

    Science.gov (United States)

    Freeman, John L.; Steele, Marianne K.; Sun, Gwo-Shing; Heathcote, David; Reinsch, S.; DeSimone, Julia C.; Myers, Zachary A.

    2014-01-01

    The aim of our ground testing was to demonstrate the capability of safely putting specific model organisms into dehydrated stasis, and to later rehydrate and successfully grow them inside flight-proven ARC EMCS seedling cassettes. The ARC EMCS seedling cassettes were originally developed to support seedling growth during space flight. The seeds are attached to a solid substrate, launched dry, and then rehydrated in a small volume of media on orbit to initiate the experiment. We hypothesized that the same seedling cassettes should be capable of acting as culture chambers for a wide range of organisms with minimal or no modification. The ability to safely preserve live organisms in a dehydrated state allows on-orbit experiments to be conducted at the best time for crew operations and, more importantly, provides a tightly controlled, physiologically relevant growth experiment with specific environmental parameters. Thus, we performed a series of ground tests that involved growing the organisms, preparing them for dehydration on gridded polyether sulfone (PES) membranes, dry storage at ambient temperatures for varying periods of time, followed by rehydration. Inside the culture cassettes, the PES membranes were mounted above blotters containing dehydrated growth media. These were mounted on stainless steel bases and sealed with plastic covers that have permeable-membrane-covered ports for gas exchange. The results demonstrated acceptable normal growth of C. elegans (nematodes), E. coli (bacteria), S. cerevisiae (yeast), Polytrichum (moss) spores and protonemata, C. thalictroides (fern), D. discoideum (amoeba), and H. dujardini (tardigrades). All organisms showed acceptable growth and rehydration in both petri dishes and culture cassettes initially, and after various lengths of dehydration. 
At the end of on orbit ISS European Modular Cultivation System experiments the cassettes could be frozen at ultra-low temperatures, refrigerated, or chemically

  20. Physician Requirements-1990. For Cardiology.

    Science.gov (United States)

    Tracy, Octavious; Birchette-Pierce, Cheryl

    Professional requirements for physicians specializing in cardiology were estimated to assist policymakers in developing guidelines for graduate medical education. The determination of physician requirements was based on an adjusted-needs model rather than a demand or utilization model. For each illness, manpower requirements were modified by the…

  1. Deriving utility scores for co-morbid conditions: a test of the multiplicative model for combining individual condition scores

    Directory of Open Access Journals (Sweden)

    Le Petit Christel

    2006-10-01

    Full Text Available Abstract Background The co-morbidity of health conditions is becoming a significant health issue, particularly as populations age, and presents important methodological challenges for population health research. For example, the calculation of summary measures of population health (SMPH) can be compromised if co-morbidity is not taken into account. One popular co-morbidity adjustment used in SMPH computations relies on a straightforward multiplicative combination of the severity weights for the individual conditions involved. While the convenience and simplicity of the multiplicative model are attractive, its appropriateness has yet to be formally tested. The primary objective of the current study was therefore to examine the empirical evidence in support of this approach. Methods The present study drew on information on the prevalence of chronic conditions and a utility-based measure of health-related quality of life (HRQoL), namely the Health Utilities Index Mark 3 (HUI3), available from Cycle 1.1 of the Canadian Community Health Survey (CCHS; 2000–01). Average HUI3 scores were computed for both single and co-morbid conditions, and were also purified by statistically removing the loss of functional health due to health problems other than the chronic conditions reported. The co-morbidity rule was specified as a multiplicative combination of the purified average observed HUI3 utility scores for the individual conditions involved, with the addition of a synergy coefficient s for capturing any interaction between the conditions not explained by the product of their utilities. The fit of the model to the purified average observed utilities for the co-morbid conditions was optimized using ordinary least squares regression to estimate s. Replicability of the results was assessed by applying the method to triple co-morbidities from the CCHS cycle 1.1 database, as well as to double and triple co-morbidities from cycle 2.1 of the CCHS (2003–04). Results
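
    The multiplicative rule described above can be sketched in a few lines. The numbers below are invented illustrative utilities, not CCHS data; for a single coefficient fitted through the origin, the OLS estimate has the closed form s = Σxy / Σx².

```python
def fit_synergy(pairs):
    """Estimate the synergy coefficient s in the model U_ab ≈ s * U_a * U_b.

    pairs: list of (u_a, u_b, u_ab_observed) purified utility scores.
    OLS regression through the origin gives s = Σxy / Σx²,
    where x = u_a * u_b and y = the observed co-morbid utility.
    """
    sxx = sum((ua * ub) ** 2 for ua, ub, _ in pairs)
    sxy = sum((ua * ub) * uab for ua, ub, uab in pairs)
    return sxy / sxx

# Hypothetical purified HUI3-style scores (illustrative only):
data = [
    (0.90, 0.80, 0.70),
    (0.85, 0.75, 0.62),
    (0.95, 0.70, 0.65),
]
s = fit_synergy(data)
predicted = s * 0.90 * 0.80  # model's co-morbid utility for the first pair
print(round(s, 3))
```

    A fitted s close to 1 would indicate that the plain product of the individual utilities already explains the co-morbid score, i.e. no extra interaction.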

  2. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    Science.gov (United States)

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoor A; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality.
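
    A Plackett-Burman screen of 11 two-level factors needs only 12 runs. The sketch below builds the standard 12-run design by cyclically shifting the published generating row and appending an all-low run; it is a generic construction, not the specific bioreactor protocol of the study.

```python
def plackett_burman_12():
    """Construct the 12-run Plackett-Burman design for up to 11 two-level
    factors: cyclic shifts of the standard generating row, plus a final
    run with every factor at its low (-1) level."""
    gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(11)]  # 11 cyclic shifts
    rows.append([-1] * 11)                           # all-minus run
    return rows

design = plackett_burman_12()
# Each factor is tested at +1 in exactly 6 of the 12 runs, and any two
# factor columns are orthogonal, so main effects can be estimated cleanly.
col0 = [row[0] for row in design]
print(len(design), sum(1 for v in col0 if v == +1))
```

    Orthogonality is what lets 11 main effects be screened from so few cultures, at the cost of confounding interactions with main effects.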

  3. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivative of UML 2.0 high-level Sequence Diagrams. The automated requirement checking is part of a bigger tool framework in which VDM++ is applied to automatically generate initial CPN models based on Problem Diagrams. These models are manually enhanced to provide behavioral descriptions of the environment...

  4. Model requirements for decision support under uncertainty in data scarce dynamic deltas

    NARCIS (Netherlands)

    Haasnoot, Marjolijn; van Deursen, W.P.A.; Kwakkel, J. H.; Middelkoop, H.

    2016-01-01

    There is a long tradition of model-based decision support in water management. The consideration of deep uncertainty, however, changes the requirements imposed on models. In the face of deep uncertainty, models are used to explore many uncertainties and the decision space across multiple outcomes o

  5. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...

  6. A new approach for modeling the peak utility impacts from a proposed CUAC standard

    Energy Technology Data Exchange (ETDEWEB)

    LaCommare, Kristina Hamachi; Gumerman, Etan; Marnay, Chris; Chan, Peter; Coughlin, Katie

    2004-08-01

    This report describes a new Berkeley Lab approach for modeling the likely peak electricity load reductions from proposed energy efficiency programs in the National Energy Modeling System (NEMS). This method is presented in the context of the commercial unitary air conditioning (CUAC) energy efficiency standards. A previous report investigating the residential central air conditioning (RCAC) load shapes in NEMS revealed that the peak reduction results were lower than expected. This effect was believed to be due in part to the presence of the squelch, a program algorithm designed to ensure changes in the system load over time are consistent with the input historic trend. The squelch applies a system load-scaling factor that scales any differences between the end-use bottom-up and system loads to maintain consistency with historic trends. To obtain more accurate peak reduction estimates, a new approach for modeling the impact of peaky end uses in NEMS-BT has been developed. The new approach decrements the system load directly, reducing the impact of the squelch on the final results. This report also discusses a number of additional factors, in particular non-coincidence between end-use loads and system loads as represented within NEMS, and their impacts on the peak reductions calculated by NEMS. Using Berkeley Lab's new double-decrement approach reduces the conservation load factor (CLF) on an input load decrement from 25% down to 19% for a SEER 13 CUAC trial standard level, as seen in NEMS-BT output. About 4 GW more in peak capacity reduction results from this new approach as compared to Berkeley Lab's traditional end-use decrement approach, which relied solely on lowering end use energy consumption. The new method has been fully implemented and tested in the Annual Energy Outlook 2003 (AEO2003) version of NEMS and will routinely be applied to future versions. This capability is now available for use in future end-use efficiency or other policy analysis

  7. Modeling and verifying SoS performance requirements of C4ISR systems

    Institute of Scientific and Technical Information of China (English)

    Yudong Qi; Zhixue Wang; Qingchao Dong; Hongyue He

    2015-01-01

    System-of-systems (SoS) engineering involves a complex process of refining high-level SoS requirements into more detailed systems requirements and assessing the extent to which the performances of to-be systems may possibly satisfy SoS capability objectives. The key issue is how to model such requirements to automate the process of analysis and assessment. This paper suggests a meta-model that defines both functional and non-functional features of SoS requirements for command and control, communication, computer, intelligence, surveillance, reconnaissance (C4ISR) systems. A domain-specific modeling language is defined by extending the unified modeling language (UML) constructs of class and association with fuzzy theory in order to model the fuzzy concepts of performance requirements. An efficiency evaluation function is introduced, based on Bézier curves, to predict the effectiveness of systems. An algorithm is presented to transform domain models in fuzzy UML into a requirements ontology in description logic (DL) so that requirements verification can be automated with a popular DL reasoner such as Pellet.
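
    A Bézier-curve efficiency evaluation function of the kind mentioned can be sketched with de Casteljau's algorithm; the control values below are invented for illustration, since the paper's actual curve parameters are not given here.

```python
def bezier(control_points, t):
    """Evaluate a 1-D Bézier curve at parameter t in [0, 1] using de
    Casteljau's algorithm (repeated linear interpolation, numerically
    stable for any curve degree)."""
    pts = list(control_points)
    while len(pts) > 1:
        pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical efficiency curve: effectiveness rises steeply and then
# saturates as the measured performance approaches the requirement.
ctrl = [0.0, 0.8, 0.95, 1.0]  # assumed control values, not from the paper
print(bezier(ctrl, 0.0), bezier(ctrl, 0.5), bezier(ctrl, 1.0))
```

    Shaping the curve via its control points lets an analyst encode how quickly partial fulfilment of a performance requirement translates into system effectiveness.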

  8. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of VS, model-free and model based RL systems center on the VS that by integrating model-free signals (received as reward prediction error) and model-based reward related input computes value. Using the concept of reinforcement learning agent we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations we turned to the proactive brain concept, which offers an ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames enabling the formulation of analogy-based association that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferent of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS).
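
    The reward prediction error at the heart of the model-free system is the standard temporal-difference signal; the sketch below is a textbook TD update, not a model from this paper, and learning indeed stops once expectations are no longer violated.

```python
def td_update(value, reward, next_value, alpha=0.1, gamma=0.9):
    """One model-free temporal-difference step: the reward prediction error
    delta = r + gamma * V(s') - V(s) (the scalar teaching signal the VTA is
    thought to broadcast) scales the update of the current state's value."""
    delta = reward + gamma * next_value - value
    return value + alpha * delta, delta

# Learning persists only while expectations are violated (delta != 0):
v, d = 0.0, None
for _ in range(200):
    v, d = td_update(v, reward=1.0, next_value=0.0)
print(round(v, 3), round(d, 6))
```

    With a terminal next state (V(s') = 0) the value converges to the reward itself, and the prediction error decays toward zero.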

  9. Requirements Evolution Processes Modeling (需求演化过程建模)

    Institute of Scientific and Technical Information of China (English)

    张国生

    2012-01-01

    Requirements tasks, requirements activities, requirements engineering processes, and the requirements engineering process system are formally defined. Requirements tasks are measured with information entropy; requirements activities, requirements engineering processes, and the requirements engineering process system are measured with joint entropy. From the point of view of requirements engineering processes, the microscopic evolution of iteration and feedback within the requirements engineering processes is modeled with condition-event nets. From the point of view of systems engineering, the macroscopic evolution of the whole software requirements engineering process system is modeled with dissipative structure theory.
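
    The entropy measures referred to above are the standard Shannon quantities; a minimal sketch (the outcome distributions are invented, since the paper's data are not reproduced here):

```python
import math

def task_entropy(probabilities):
    """Shannon entropy H(X) = -sum p * log2(p), used here to measure a
    requirements task over the distribution of its possible outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def joint_entropy(joint):
    """Joint entropy H(X, Y) = -sum p(x, y) * log2 p(x, y) over a joint
    distribution, the measure applied to requirements activities and
    whole requirements engineering processes."""
    return -sum(p * math.log2(p) for row in joint for p in row if p > 0)

# Two equally likely task outcomes carry 1 bit; a uniform 2x2 joint
# distribution carries 2 bits.
print(task_entropy([0.5, 0.5]), joint_entropy([[0.25, 0.25], [0.25, 0.25]]))
```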

  10. Performance Requirements Modeling and Assessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    New sources of ancillary services are expected in the power system. For large and conventional generation units the dynamic response is well understood and detailed individual measurement is feasible, which factors into the straightforward performance requirements applied today. For secure power system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies.

  11. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi-Utility Technology Test Bed (X-56A) aircraft is the flight demonstration of active flutter suppression; this study therefore identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the X-56A. The ground-vibration-test-validated structural dynamic finite element model of the X-56A is created in this study, and the model is improved using a model tuning tool. Two different weight configurations of the X-56A have been improved in a single optimization run.

  12. Utilizing a train-the-trainer model for multi-site naloxone distribution programs.

    Science.gov (United States)

    Madah-Amiri, Desiree; Clausen, Thomas; Lobmaier, Philipp

    2016-06-01

    In order to have a substantial impact on overdose prevention, the expansion and scaling-up of overdose prevention with naloxone distribution (OPEND) programs are needed. However, limited literature exists on the best method to train the large number of trainers needed to implement such initiatives. As part of a national overdose prevention strategy, widespread OPEND was implemented throughout multiple low-threshold facilities in Norway. Following a two-hour 'train-the-trainer' course, staff were able to distribute naloxone in their facility. The course was open to all staff, regardless of educational background. To measure the effectiveness of the course, a questionnaire was given to participants immediately before and after the session, assessing knowledge of overdoses and naloxone, as well as attitudes towards the training session and distributing naloxone. In total, 511 staff were trained during 41 trainer sessions. During a two-month survey period, 54 staff participated in a questionnaire study. Knowledge scores significantly improved in all areas following the training (p<0.001). Attitude scores improved, and the majority of staff found the training useful and intended to distribute naloxone to their clients. Large-scale naloxone distribution programs are likely to continue growing and will require competent trainers to carry out training sessions. The train-the-trainer model appears to be effective in efficiently training a high volume of trainers, improving trainers' knowledge and intentions to distribute naloxone. Further research is needed to assess the long-term effects of the training session, staff's subsequent involvement following the trainer session, and knowledge transferred to the clients.

  13. Fraud Risk Modelling: Requirements Elicitation in the Case of Telecom Services

    DEFF Research Database (Denmark)

    Yesuf, Ahmed; Wolos, Lars Peter; Rannenberg, Kai

    2017-01-01

    In this paper, we highlight the important requirements for a usable and context-aware fraud risk modelling approach for Telecom services. To do so, we have conducted two workshops with experts from a Telecom provider and experts from multi-disciplinary areas. In order to show and document the requirements, we...

  14. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek Ray; Pippus Annalea; Hansen Lawrence A

    2012-01-01

    Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  15. Elastic Model Transitions: a Hybrid Approach Utilizing Quadratic Inequality Constrained Least Squares (LSQI) and Direct Shape Mapping (DSM)

    Science.gov (United States)

    Jurenko, Robert J.; Bush, T. Jason; Ottander, John A.

    2014-01-01

    A method for transitioning linear time invariant (LTI) models in time varying simulation is proposed that utilizes both quadratically constrained least squares (LSQI) and Direct Shape Mapping (DSM) algorithms to determine physical displacements. This approach is applicable to the simulation of the elastic behavior of launch vehicles and other structures that utilize multiple LTI finite element model (FEM) derived mode sets that are propagated throughout time. The time invariant nature of the elastic data for discrete segments of the launch vehicle trajectory presents a problem of how to properly transition between models while preserving motion across the transition. In addition, energy may vary between flex models when using a truncated mode set. The LSQI-DSM algorithm can accommodate significant changes in energy between FEM models and carries elastic motion across FEM model transitions. Compared with previous approaches, the LSQI-DSM algorithm shows improvements ranging from a significant reduction to a complete removal of transients across FEM model transitions as well as maintaining elastic motion from the prior state.

  16. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification, and many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first analyzed by a data dependency analysis technique which can find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can thus reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  17. Virtual Community Life Cycle: a Model to Develop Systems with Fluid Requirements

    OpenAIRE

    El Morr, Christo; Maret, Pierre de; Rioux, Marcia; Dinca-Panaitescu, Mihaela; Subercaze, Julien

    2011-01-01

    This paper reports the results of an investigation into the life cycle model needed to develop information systems for groups of people with fluid requirements. For this purpose, we developed a modified spiral model and applied it to the analysis, design and implementation of a virtual community for a group of researchers and organizations that collaborated in a research project and had changing system requirements. The virtual knowledge community was dedicated to supporting mobilization and dissemi...

  18. Do Stochastic Traffic Assignment Models Consider Differences in Road Users Utility Functions?

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker

    1996-01-01

    The early stochastic traffic assignment models (e.g. Dial, 1971) build on the assumption that different routes are independent (the logit-model concept). Thus, the models caused severe problems in networks with overlapping routes. Daganzo & Sheffi (1977) suggested to use probit based models...
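
    The overlap problem with the logit concept is easy to demonstrate numerically; the network below is a made-up three-route example, not from the cited papers.

```python
import math

def logit_probs(costs, theta=1.0):
    """Multinomial logit route-choice probabilities P_i proportional to
    exp(-theta * c_i). The model treats routes as independent alternatives
    and ignores any overlap between them."""
    weights = [math.exp(-theta * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Three routes of equal cost, where routes 2 and 3 share almost all of
# their links. The logit model still assigns 1/3 to each route, so the
# overlapping corridor gets ~2/3 of the flow instead of roughly 1/2 --
# the independence problem that motivated probit-based alternatives.
p = logit_probs([10.0, 10.0, 10.0])
print([round(x, 3) for x in p])
```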

  19. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  1. A Review of Equation of State Models, Chemical Equilibrium Calculations and CERV Code Requirements for SHS Detonation Modelling

    Science.gov (United States)

    2009-10-01

    Defence R&D Canada contract report CR 2010-013 (October 2009): a review of equation of state models (including the Beattie-Bridgeman virial expansion, suitable for moderate pressures and usually based on empirical constants), chemical equilibrium calculations, and CERV code requirements for SHS detonation modelling.

  2. Multistep Lattice-Voxel method utilizing lattice function for Monte-Carlo treatment planning with pixel based voxel model.

    Science.gov (United States)

    Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K

    2011-12-01

    Treatment planning for boron neutron capture therapy generally utilizes Monte-Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte-Carlo code PHITS to calculate the dose distribution. JCDS-FX makes it possible to build a precise voxel model consisting of pixel-based voxel cells at a scale of 0.4×0.4×2.0 mm³ in order to perform high-accuracy dose estimation, e.g. for the purpose of calculating the dose distribution in a human body. However, the miniaturization of the voxel size increases calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte-Carlo calculations for human geometry efficiently. Thus, we devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model that combines different voxel sizes by applying the lattice function repeatedly. To verify the performance of the calculation with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce calculation time substantially while keeping the high accuracy of dose estimation.

  3. A SLAM II simulation model for analyzing space station mission processing requirements

    Science.gov (United States)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM II simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives and an 8087 coprocessor chip. Using a time-phased mission (payload) schedule and parameters from the mission, orbiter (space shuttle) and ground facility databases, estimates of ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.
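
    The utilization estimate at the core of such a simulation can be sketched with a toy single-facility, first-come-first-served model; the schedule below is hypothetical and this is a stand-in for a SLAM II resource, not the original model.

```python
def facility_utilization(missions, horizon):
    """Estimate ground-facility utilization for a time-phased mission
    schedule: missions are (arrival_time, processing_time) pairs handled
    first-come-first-served by a single processing facility."""
    busy, clock = 0.0, 0.0
    for arrival, duration in sorted(missions):
        clock = max(clock, arrival)  # facility idles until the next arrival
        clock += duration            # then processes the mission
        busy += duration
    return busy / horizon

# Hypothetical schedule: (arrival day, processing days) per payload
schedule = [(0, 10), (5, 20), (40, 15)]
print(facility_utilization(schedule, horizon=100.0))
```

    Comparing this figure across alternative mission schedules is the kind of trade study the simulation output supports.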

  4. Patient-Derived Xenograft Models of Non-Small Cell Lung Cancer and Their Potential Utility in Personalized Medicine

    Science.gov (United States)

    Morgan, Katherine M.; Riedlinger, Gregory M.; Rosenfeld, Jeffrey; Ganesan, Shridar; Pine, Sharon R.

    2017-01-01

    Traditional preclinical studies of cancer therapeutics have relied on the use of established human cell lines that have been adapted to grow in the laboratory and, therefore, may deviate from the cancer they were meant to represent. With the emphasis of cancer drug development shifting from non-specific cytotoxic agents to rationally designed molecularly targeted therapies or immunotherapy comes the need for better models with predictive value regarding therapeutic activity and response in clinical trials. Recently, the diversity and accessibility of immunodeficient mouse strains has greatly enhanced the production and utility of patient-derived xenograft (PDX) models for many tumor types, including non-small cell lung cancer (NSCLC). Combined with next-generation sequencing, NSCLC PDX mouse models offer an exciting tool for drug development and for studying targeted therapies while utilizing patient samples with the hope of eventually aiding in clinical decision-making. Here, we describe NSCLC PDX mouse models generated by us and others, their ability to reflect the parental tumors’ histomorphological characteristics, as well as the effect of clonal selection and evolution on maintaining genomic integrity in low-passage PDXs compared to the donor tissue. We also raise vital questions regarding the practical utility of PDX and humanized PDX models in predicting patient response to therapy and make recommendations for addressing those questions. Once collaborations and standardized xenotransplantation and data management methods are established, NSCLC PDX mouse models have the potential to be universal and invaluable as a preclinical tool that guides clinical trials and standard therapeutic decisions. PMID:28154808

  5. Utilization of remote sensing data on meteorological and vegetation characteristics for modeling water and heat regimes of large agricultural region

    Science.gov (United States)

    Muzylev, Eugene; Startseva, Zoya; Uspensky, Alexander; Volkova, Elena

    2016-04-01

    region. In the frame of this approach the transition from rainfall intensity estimation to the calculation of daily rainfall sums has been fulfilled; two variants of this calculation have been realized, focusing on climate research and on operational monitoring respectively. This transition has required verifying the accuracy of the estimates obtained in both variants at each time step. The verification has included comparison of area distributions of satellite-derived precipitation estimates with analogous estimates obtained by interpolation of ground-based observation data. The probability of correct precipitation zone detection from satellite data, when compared with ground-based meteorological observations, has amounted to 75-85%. In both variants of calculating precipitation for the region of interest, in addition to the fields of daily rainfall, the fields of their monthly and annual sums have been built. All three sums are consistent with each other and with ground-based observation data, although the satellite-derived estimates are smoother than the ground-based ones. Their discrepancies are within the range of the rainfall estimation errors of the MTM and are largest at local maxima, for which satellite-derived rainfall is less than ground-measured values. This may be due to the different scales of space-averaged satellite and point-wise ground-based estimates. To utilize satellite-derived estimates of meteorological and vegetation characteristics in the SVAT model, procedures have been developed for replacing the ground-based values of precipitation, LST, LAI and B by the corresponding satellite-derived values, taking into account the spatial heterogeneity of their fields. The correctness of such replacement has been confirmed by comparing the values of soil water content W and evapotranspiration Ev modeled and measured at agricultural meteorological stations. 
In particular, when the difference of precipitation sums for the vegetation

  6. A Study of the Driving Force Model Revealing Changes in Land Utilization Level Based on 3S Technologies--The Example of Yuanmou, Yunnan, China

    Institute of Scientific and Technical Information of China (English)

    HE Jin-feng; CHEN Guo-jie; YANG Zhong

    2005-01-01

    This paper introduces the theory and approach of building driving force models that reveal changes in land utilization level by integrating RS, GPS, and GIS technologies, using Yuanmou County of Yunnan Province as an example. We first created a land utilization type database and databases of the natural and human driving forces for land utilization. We then obtained the dependent and independent variables of changes in land utilization level by exploring these data. Lastly, we screened the major factors affecting changes in land utilization level using the powerful spatial correlation analysis and principal component analysis modules of GIS, and obtained a multivariable linear regression model of the changes in land utilization level using the GIS spatial regression analysis module.
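The final step described above, fitting a multivariable linear regression of land-utilization-level change on a handful of screened driving-force factors, can be sketched as follows. The factor names and data are invented for illustration; the study's actual variables and GIS modules are not reproduced here.

```python
import numpy as np

def fit_linear_model(X, y):
    """Ordinary least squares fit; returns coefficients (intercept first)."""
    A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Hypothetical screened driving-force factors: slope, road density,
# population density (toy data, noise-free so the fit is exact).
rng = np.random.default_rng(0)
X = rng.random((50, 3))
true = np.array([0.5, -1.0, 2.0, 0.8])        # intercept + three slopes
y = np.column_stack([np.ones(50), X]) @ true

coef = fit_linear_model(X, y)
```

On noise-free data the least-squares solution recovers the generating coefficients exactly; with real raster-derived samples one would also inspect residuals per spatial unit.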

  7. Utility usage forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hosking, Jonathan R. M.; Natarajan, Ramesh

    2017-08-22

    The computer creates a utility demand forecast model over weather parameters by receiving a plurality of utility parameter values, wherein each received utility parameter value corresponds to a weather parameter value; determining that a range of weather parameter values lacks a sufficient amount of corresponding received utility parameter values; determining one or more utility parameter values that correspond to that range of weather parameter values; and creating a model that correlates the received and the determined utility parameter values with the corresponding weather parameter values.
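The claimed steps can be sketched as follows. The bin edges, the minimum-observation threshold, and the use of linear interpolation to "determine" values for under-observed ranges are illustrative assumptions, not the patent's specifics.

```python
import numpy as np

def bin_means(temps, demands, edges, min_count=3):
    """Mean demand per temperature bin; NaN where a bin lacks enough data."""
    means = np.full(len(edges) - 1, np.nan)
    idx = np.digitize(temps, edges) - 1
    for b in range(len(edges) - 1):
        vals = demands[idx == b]
        if len(vals) >= min_count:
            means[b] = vals.mean()
    return means

def fill_sparse_bins(means):
    """Determine demand for under-observed bins from well-observed ones."""
    x = np.arange(len(means))
    ok = ~np.isnan(means)
    return np.interp(x, x[ok], means[ok])

# Toy history: observations only at mild and hot temperatures.
temps = np.array([10, 11, 12, 13, 14, 30, 31, 32, 33.0])
demand = np.array([5, 5, 6, 6, 5, 20, 21, 22, 21.0])
edges = np.arange(10, 36, 5)   # bins: 10-15, 15-20, 20-25, 25-30, 30-35
filled = fill_sparse_bins(bin_means(temps, demand, edges))
```

A model correlating demand with temperature could then be fit on the filled bin means rather than only on the sparse raw history.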

  8. Structural model requirements to describe microbial inactivation during a mild heat treatment.

    Science.gov (United States)

    Geeraerd, A H; Herremans, C H; Van Impe, J F

    2000-09-10

    The classical concept of D and z values, established for sterilisation processes, is unable to deal with the typical non-loglinear behaviour of survivor curves occurring during the mild heat treatment of sous vide or cook-chill food products. Structural model requirements are formulated, eliminating immediately some candidate model types. Promising modelling approaches are thoroughly analysed and, if applicable, adapted to the specific needs: two models developed by Casolari (1988), the inactivation model of Sapru et al. (1992), the model of Whiting (1993), the Baranyi and Roberts growth model (1994), the model of Chiruta et al. (1997), the model of Daughtry et al. (1997) and the model of Xiong et al. (1999). A range of experimental data of Bacillus cereus, Yersinia enterocolitica, Escherichia coli O157:H7, Listeria monocytogenes and Lactobacillus sake are used to illustrate the different models' performances. Moreover, a novel modelling approach is developed, fulfilling all formulated structural model requirements, and based on a careful analysis of literature knowledge of the shoulder and tailing phenomenon. Although a thorough insight into the occurrence of shoulders and tails is still lacking from a biochemical point of view, this newly developed model incorporates the possibility of a straightforward interpretation within this framework.
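The novel shoulder-and-tail model this abstract announces became known in the later literature as the Geeraerd model. A sketch of that published form follows; the parameter values are toy numbers, not fitted to any of the organisms listed above.

```python
import numpy as np

def geeraerd(t, n0, n_res, k_max, shoulder):
    """Survivor count N(t): shoulder of length `shoulder`, log-linear
    inactivation at rate k_max, and a tail at the residual level n_res."""
    c = np.exp(k_max * shoulder)
    shoulder_term = c / (1.0 + (c - 1.0) * np.exp(-k_max * t))
    return (n0 - n_res) * np.exp(-k_max * t) * shoulder_term + n_res

# Toy survivor curve: 1e7 CFU/ml initial, 1e2 CFU/ml tail, 5 min shoulder.
t = np.linspace(0.0, 30.0, 301)
n = geeraerd(t, n0=1e7, n_res=1e2, k_max=0.8, shoulder=5.0)
```

The curve starts at N(0) = n0, stays nearly flat for roughly the shoulder length, declines log-linearly, and flattens onto the tail, which is exactly the non-loglinear behaviour D/z-value kinetics cannot represent.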

  9. Pareto utility

    NARCIS (Netherlands)

    Masako, I.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility
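The snippet is truncated before stating the function itself. A sketch based on the Pareto distribution function follows; this parameterization is an assumption from the name, not verified against the paper, and the checks confirm numerically two properties any such candidate should satisfy (monotonicity and concavity).

```python
import numpy as np

def pareto_utility(x, lam=1.0, k=2.0):
    """Assumed form: U(x) = 1 - (1 + x/lam)**(-k), for x > -lam, lam, k > 0."""
    return 1.0 - (1.0 + x / lam) ** (-k)

x = np.linspace(0.0, 10.0, 1001)
u = pareto_utility(x)
increasing = bool(np.all(np.diff(u) > 0))    # U' > 0
concave = bool(np.all(np.diff(u, 2) < 0))    # U'' < 0
```

Under this form utility is also bounded above by 1, which is one reason Pareto-type utilities are attractive in expected-utility settings with heavy-tailed risks.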

  10. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation

    National Research Council Canada - National Science Library

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility...

  11. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on ba

  13. UTILIZATION OF THE NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY FIRE DYNAMICS SIMULATION COMPUTER MODEL

    Energy Technology Data Exchange (ETDEWEB)

    L. BARTLEIN

    2001-05-01

    The objective of this report is to provide a methodology for utilization of the NIST FDS code to evaluate the effects of radiant and convective heating from single and multiple fire sources on heat-sensitive targets such as special nuclear materials (SNM) and high explosives (HE). The presentation will demonstrate practical applications of the FDS computer program in fire hazards analysis, and illustrate the advantages over hand calculations for radiant and convective heat transfer and fire progression. The "visualization" of radiant and convective heat effects will be demonstrated as a tool for supporting conclusions of fire hazards analysis and TSR development.

  14. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remained separated from engineering design tools. With the Europa Clipper project, efforts are being taken to change the requirements development approach from a separate activity to one intimately embedded in formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.

  16. Wastewater treatment models in teaching and training: the mismatch between education and requirements for jobs.

    Science.gov (United States)

    Hug, Thomas; Benedetti, Lorenzo; Hall, Eric R; Johnson, Bruce R; Morgenroth, Eberhard; Nopens, Ingmar; Rieger, Leiv; Shaw, Andrew; Vanrolleghem, Peter A

    2009-01-01

    As mathematical modeling of wastewater treatment plants has become more common in research and consultancy, a mismatch between education and requirements for model-related jobs has developed. There seems to be a shortage of skilled people, both in quantity and in quality. In order to address this problem, this paper provides a framework to outline different types of model-related jobs, assess the required skills for these jobs and characterize different types of education that modelers obtain "in school" as well as "on the job". It is important to consider that education of modelers does not mainly happen in university courses and that the variety of model-related jobs goes far beyond use for process design by consulting companies. To resolve the mismatch, the current connection between requirements for different jobs and the various types of education has to be assessed for different geographical regions and professional environments. This allows the evaluation and improvement of important educational paths, considering quality assurance and future developments. Moreover, conclusions from a workshop involving practitioners and academics from North America and Europe are presented. The participants stressed the importance of non-technical skills and recommended strengthening the role of realistic modeling experience in university training. However, this paper suggests that all providers of modeling education and support, not only universities, but also software suppliers, professional associations and companies performing modeling tasks, are called upon to assess and strengthen their role in the training and support of professional modelers.

  17. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Science.gov (United States)

    2012-01-01

    Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented. PMID:22769234

  18. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  19. Solar energy hot water heating and electric utilities. A model validation

    Science.gov (United States)

    1981-10-01

    TRNSYS is a residential solar simulation program designed to provide detailed simulations of individual solar systems composed of almost any presently used residential solar technology. The model is described and a validation of the model is presented using a group of domestic solar hot water systems in the metropolitan Philadelphia area. The collection and reduction of the data used is discussed, and the TRNSYS modeling of the systems is presented. The model results are given and a sensitivity analysis of the models was performed to determine the effect of input changes on the electric auxiliary backup consumption.

  20. Utilizing anisotropic Preisach-type models in the accurate simulation of magnetostriction

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A. [Cairo Univ., Giza (Egypt). Electrical Power and Machines Dept.; Mayergoyz, I.D. [Univ. of Maryland, College Park, MD (United States). Electrical Engineering Dept.; Bergqvist, A. [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Electrical Power Engineering

    1997-09-01

    Magnetostriction models are being widely used in the development of fine positioning and active vibration damping devices. This paper presents a new approach for simulating 1-D magnetostriction using 2-D anisotropic Preisach-type models. In this approach, identification of the model takes into account measured flux density versus field and strain versus field curves for different stress values. Consequently, a more accurate magnetostriction model may be obtained. Details of the identification procedure as well as experimental testing of the proposed model are given.
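The paper's 2-D anisotropic Preisach-type model is not reproduced here, but the underlying Preisach idea can be sketched in scalar form: the output is a weighted sum of elementary relay hysterons, each switching up at a threshold alpha and down at beta (alpha >= beta). The hysteron grid and weights below are toy assumptions.

```python
import numpy as np

class ScalarPreisach:
    def __init__(self, alphas, betas, weights):
        self.a, self.b, self.w = map(np.asarray, (alphas, betas, weights))
        self.state = -np.ones(len(self.w))   # all relays start "down"

    def apply(self, h):
        """Feed one field value h; return the magnetization-like output."""
        self.state = np.where(h >= self.a, 1.0,
                     np.where(h <= self.b, -1.0, self.state))
        return float(self.w @ self.state)

# Toy hysteron grid: alpha >= beta pairs, equal weights summing to 1.
pairs = [(a, b) for a in (0.2, 0.5, 0.8) for b in (-0.8, -0.5, -0.2)]
model = ScalarPreisach([p[0] for p in pairs], [p[1] for p in pairs],
                       np.full(len(pairs), 1.0 / len(pairs)))
up = [model.apply(h) for h in np.linspace(-1, 1, 21)]    # ascending sweep
down = [model.apply(h) for h in np.linspace(1, -1, 21)]  # descending sweep
```

Because relays switch at different thresholds on the way up and down, the two sweeps trace different branches at the same field value, which is the hysteresis loop the identification procedure in the paper fits to measured curves.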

  1. Implications of outer-zone radiations on operations in the geostationary region utilizing the AE4 environmental model

    Science.gov (United States)

    Wilson, J. W.; Denn, F. M.

    1977-01-01

    The radiation exposure in the region of geostationary orbits is examined in a search for means of optimizing human performance. It is found that the use of slightly inclined circular orbits is one means by which exposure and spacesuit thickness requirements can be reduced. Another effective technique is to limit the extravehicular activity to those days when the short term fluctuations result in low exposure. Space-suit shielding approaching 1/2 g/sq cm or less may be possible by utilizing work stoppages and inclined orbits. If aluminum and other low-atomic-number materials are used to construct the habitat, then excessive wall thicknesses are required. If special bremsstrahlung shielding is used, then the habitat shield may be reduced to as low as 2 g/sq cm. Numerous tables and graphs are presented for future analysis of dose in the geostationary region.

  2. Cognitive Radio Engine Model Utilizing Soft Fusion Based Genetic Algorithm for Cooperative Spectrum Optimization

    Directory of Open Access Journals (Sweden)

    Kamal Hossain

    2013-04-01

    Cognitive radio (CR) must detect the presence of primary users (PUs) reliably in order to reduce the interference to licensed communications. Genetic algorithms (GAs) are well suited for CR optimization problems to increase the efficiency of bandwidth utilization by exploiting unused portions of the apparent spectrum. In this paper, a binary genetic algorithm (BGA)-based soft fusion (SF) scheme for cooperative spectrum sensing in a cognitive radio network is proposed to improve detection performance and bandwidth utilization. The BGA-based optimization method is implemented at the fusion centre of a linear SF scheme to optimize the weighting coefficients vector so as to maximize the global probability of detection. Simulation results and analyses confirm that the proposed scheme meets the real-time requirements of cognitive radio spectrum sensing and that it outperforms conventional natural deflection coefficient (NDC)-, modified deflection coefficient (MDC)-, maximal ratio combining (MRC)- and equal gain combining (EGC)-based SDF schemes, as well as OR-rule-based hard decision fusion (HDF). The proposed BGA scheme also converges fast and achieves optimum performance, showing that the BGA-based method is efficient and quite stable.
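The weight-optimization step can be sketched with a toy binary GA. The objective and encoding here are assumptions, not the paper's: the fitness is a deflection-coefficient-style statistic d(w) = (w·s)^2 / (w·C·w), a common surrogate for detection probability in linear soft-fusion schemes.

```python
import numpy as np

rng = np.random.default_rng(1)
s = np.array([1.0, 0.2, 0.6])        # per-sensor signal gains (toy)
C = np.diag([0.5, 2.0, 1.0])         # per-sensor noise covariance (toy)
BITS, N_W = 8, len(s)

def decode(bits):
    """Map each 8-bit group to a weight in [0, 1], then normalize."""
    w = np.array([int("".join(map(str, g)), 2)
                  for g in bits.reshape(N_W, BITS)]) / (2**BITS - 1)
    return w / (np.linalg.norm(w) + 1e-12)

def fitness(bits):
    w = decode(bits)
    return (w @ s) ** 2 / (w @ C @ w + 1e-12)

pop = rng.integers(0, 2, (40, N_W * BITS))
for _ in range(60):
    fit = np.array([fitness(ind) for ind in pop])
    parents = pop[rng.choice(len(pop), len(pop), p=fit / fit.sum())]
    children = parents.copy()
    for i, c in enumerate(rng.integers(1, N_W * BITS, len(pop) // 2)):
        children[2*i, c:], children[2*i+1, c:] = \
            parents[2*i+1, c:].copy(), parents[2*i, c:].copy()  # crossover
    flip = rng.random(children.shape) < 0.01                    # mutation
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
```

With these toy gains, the GA should find weights favoring the low-noise, high-gain sensor and beat the equal-gain-combining baseline (fitness about 0.93 here).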

  3. THE STRUCTURAL DYNAMIC ECONOMIC MODEL SDEM-2: FROM SYSTEM DYNAMIC SOLUTIONS TO LINEAR AND LOGARITHMIC UTILITY MAXIMIZATION

    Directory of Open Access Journals (Sweden)

    Kovalevsky D. V.

    2014-12-01

    The Structural Dynamic Economic Model SDEM-2 is essentially a model of a closed economy growing under conditions of a conflict of interests between two powerful aggregate actors: entrepreneurs and wage-earners. We study economic growth within SDEM-2 in both system dynamic and optimization model setups. For the system dynamic setup, four alternative control strategies of entrepreneurs are considered in detail: the “altruistic” control strategy, the “moderate output growth” control strategy, the “here and now” control strategy, and the “moderate dividend growth” control strategy. In the optimization setup, Pontryagin's maximum principle is applied to SDEM-2 to solve the linear and logarithmic utility maximization problems. The degree of sub-optimality of the system dynamic solutions is evaluated.
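For background, the logarithmic-utility maximization mentioned above instantiates the standard optimal-control template below. This is illustrative only; SDEM-2's actual state equations and constraints are not reproduced here.

```latex
\max_{c(t)} \int_0^\infty e^{-\rho t} \ln c(t)\, dt
\quad \text{s.t.} \quad \dot{k} = f(k) - c, \qquad k(0) = k_0,
\]
with current Hamiltonian and Pontryagin conditions
\[
H = e^{-\rho t} \ln c + \lambda \bigl(f(k) - c\bigr), \qquad
\frac{\partial H}{\partial c} = 0 \;\Rightarrow\; \frac{e^{-\rho t}}{c} = \lambda, \qquad
\dot{\lambda} = -\lambda f'(k).
```

Eliminating the costate gives the familiar growth rule $\dot{c}/c = f'(k) - \rho$, the kind of closed-form benchmark against which the degree of sub-optimality of system dynamic solutions can be evaluated.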

  4. The experimental autoimmune encephalomyelitis (EAE) model of MS: utility for understanding disease pathophysiology and treatment.

    Science.gov (United States)

    Robinson, Andrew P; Harp, Christopher T; Noronha, Avertano; Miller, Stephen D

    2014-01-01

    While no single model can exactly recapitulate all aspects of multiple sclerosis (MS), animal models are essential in understanding the induction and pathogenesis of the disease and to develop therapeutic strategies that limit disease progression and eventually lead to effective treatments for the human disease. Several different models of MS exist, but by far the best understood and most commonly used is the rodent model of experimental autoimmune encephalomyelitis (EAE). This model is typically induced by either active immunization with myelin-derived proteins or peptides in adjuvant or by passive transfer of activated myelin-specific CD4+ T lymphocytes. Mouse models are most frequently used because of the inbred genotype of laboratory mice, their rapid breeding capacity, the ease of genetic manipulation, and availability of transgenic and knockout mice to facilitate mechanistic studies. Although not all therapeutic strategies for MS have been developed in EAE, all of the current US Food and Drug Administration (FDA)-approved immunomodulatory drugs are effective to some degree in treating EAE, a strong indicator that EAE is an extremely useful model to study potential treatments for MS. Several therapies, such as glatiramer acetate (GA: Copaxone), and natalizumab (Tysabri), were tested first in the mouse model of EAE and then went on to clinical trials. Here we discuss the usefulness of the EAE model in understanding basic disease pathophysiology and developing treatments for MS as well as the potential drawbacks of this model.

  5. Quality Requirements Put On The Inconel 625 Austenite Layer Used On The Sheet Pile Walls Of The Boiler’s Evaporator To Utilize Waste Thermally

    Directory of Open Access Journals (Sweden)

    Słania J.

    2015-06-01

    Quality requirements for, and tests performed on, the Inconel 625 surfacing layer are presented in the article. The reasons for using the Inconel 625 surfacing layer and the technologies for its application, with particular emphasis on the CMT method, are described. Quality requirements for the Inconel 625 surfacing weld are provided. Basic requirements included in Merkblatt 1166, as well as additional requirements reflected in the technical specifications of the boilers’ producers, are specified.

  6. Relevance Of Utility Maximization In Student University Choice – A Consumption-Based Model For Higher Education

    Directory of Open Access Journals (Sweden)

    Eric S. SCHWARTZ

    2011-05-01

    This paper applies a model of utility maximization to better understand the university choice process. Student decision-making for university choice is conceptualized as a purchase decision process through which students weigh the costs of the colleges or universities they choose against the perceived benefits of attending these institutions. The key issues are the impact of consumers’ preferences, income, tuition, and costs on college decision-making. From this perspective, the paper describes the relationship between utility maximization and educational demand, and the effects of tuition increases, tuition discounting, and financial aid subsidies on university choice. A decision-making scheme for educational consumption is used in order to identify the stages of the university choice process and to predict the behavior of consumers in the higher education marketplace. The analysis points to the need to better inform students about the cost of postsecondary education, a highly relevant aspect of the university choice process.

  7. Driving forces behind the increasing cardiovascular treatment intensity.A dynamic epidemiologic model of trends in Danish cardiovascular drug utilization.

    DEFF Research Database (Denmark)

    Kildemoes, Helle Wallach; Andersen, Morten

    Background: In many Western countries cardiovascular treatment intensity (DDD/1000 inhabitants/day, DDD/TID) has grown substantially during the last decades. Changed drug utilization patterns, rather than population ageing, were hypothesized to be the main driving force behind the growth. Objectives: To investigate the driving forces behind the increasing treatment prevalence of cardiovascular drugs, in particular statins, by means of a dynamic epidemiologic drug utilization model. Methods: Material: All Danish residents older than 20 years by January 1, 1996 (4.0 million inhabitants), were... ...to 619 DDD/TID from 1996 to 2005 (117%). Population ageing accounted for 22 percentage points. Treatment intensity with statins increased from 5 to 121 DDD/TID. Population ageing accounted for one eighth of this increase. Increasing incidence rates were the main driving force behind the growing statin...
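The decomposition the study performs, separating how much of a rise in DDD/TID is explained by population ageing versus changed age-specific utilization, can be sketched with a simple standardization. All numbers below are invented toy values, not the study's data.

```python
import numpy as np

def ddd_tid(pop_shares, rates):
    """Overall DDD/TID as a population-share-weighted mean of age-band rates."""
    return float(np.dot(pop_shares, rates))

shares_1996 = np.array([0.60, 0.30, 0.10])   # young / middle / old (toy)
shares_2005 = np.array([0.55, 0.30, 0.15])   # the population has aged
rates_1996 = np.array([20.0, 300.0, 800.0])  # DDD/TID per age band (toy)
rates_2005 = np.array([40.0, 650.0, 1700.0])

total_change = ddd_tid(shares_2005, rates_2005) - ddd_tid(shares_1996, rates_1996)
# Ageing contribution: change the age structure but keep 1996 rates fixed.
ageing_part = ddd_tid(shares_2005, rates_1996) - ddd_tid(shares_1996, rates_1996)
rate_part = total_change - ageing_part       # changed utilization pattern
```

In this toy example most of the growth comes from the rate term, mirroring the abstract's finding that changed utilization, not ageing, dominates.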

  8. Requirement and utilization of sex abuse consultation among unmarried youth in China

    Institute of Scientific and Technical Information of China (English)

    韩优莉; 郑晓瑛; 陈功; 张蕾

    2012-01-01

    Objective To analyze the requirement for and utilization of sexual abuse consultation among unmarried youth in China and to explore the obstacles to its utilization. Methods Data from the first country-level survey of unmarried youth sexual and reproductive health (22,288 respondents, 2009) were used. A binary logistic model was used to analyze the differences. Results The rate of consultation requirement was 2.1%, of which only 32.3% was met. Consultation requirement rates were significantly higher among youth living in the west, studying in school, with higher education levels, with lower family incomes, who were only children, or who had a girlfriend or boyfriend. The consultation usage rate was significantly higher among younger youth and those with higher education levels. The main obstacles preventing youth from seeking sexual abuse consultation were embarrassment (23.82%), not knowing whom to consult (20.78%), not considering the abuse serious (14.32%), and having no service facility nearby (12.43%). Conclusion There is a great requirement for sexual abuse consultation among unmarried youth in China, especially in the western area and among those with low incomes. Education is a protective factor for the utilization of sexual abuse consultation.

  9. Army Business Transformation: The Utility of Using Corporate Business Models within the Institutional Army

    National Research Council Canada - National Science Library

    Bailer, Jr., John J

    2007-01-01

    ... Through a survey of the literature of published corporate business plans and models, military reports, Army depot case studies, and comparative analysis of emerging computer software technology...

  10. The Effect of Utilizing Organizational Culture Improvement Model of Patient Education on Coronary Artery Bypass Graft Patients’ Anxiety and Satisfaction: Theory Testing

    Science.gov (United States)

    Farahani, Mansoureh Ashghali; Ghaffari, Fatemeh; Norouzinezhad, Faezeh; Orak, Roohangiz Jamshidi

    2016-01-01

    Introduction Due to the increasing prevalence of arteriosclerosis and the mortality caused by this disease, coronary artery bypass graft (CABG) surgery has become one of the most common surgical procedures. Patient education is recognized as an effective means of increasing patient survival and treatment outcomes. However, failure to consider the different aspects of patient education has made this goal unattainable. The objective of this research was to determine the effect of utilizing the organizational culture improvement model of patient education on CABG patients’ anxiety and satisfaction. Methods The present study is a randomized controlled trial conducted on eighty CABG patients. The patients were selected from the CCU and post-CCU wards of a hospital affiliated with Iran University of Medical Sciences in Tehran, Iran, during 2015. Spielberger’s Anxiety Inventory and a patients’ satisfaction questionnaire were used to collect the required information. Levels of anxiety and satisfaction were measured before the intervention and at the time of discharge. The intervention took place after preparing a programmed package based on the organizational culture improvement model for the following dimensions: effective communication, participatory decision-making, goal setting, planning, implementation and recording, supervision and control, and improvement of motivation. After recording the data, they were analyzed using the chi-square, independent t and Mann-Whitney U tests. The significance level of the tests was assumed to be 0.05. SPSS version 18 was utilized for the data analysis. Results The research revealed that the mean scores of situational and personality anxiety of the control and experimental groups declined following the intervention, but the decrease was greater in the experimental group (p≤0.0001). In addition, the variations of the mean scores of patients’ satisfaction with

  11. GIS-based suitability modeling and multi-criteria decision analysis for utility scale solar plants in four states in the Southeast U.S

    Science.gov (United States)

    Tisza, Kata

    Photovoltaic (PV) development shows significantly smaller growth in the Southeast U.S. than in the Southwest, mainly due to the low cost of fossil-fuel-based energy production in the region and the lack of solar incentives. However, the Southeast has appropriate insolation conditions (4.0-6.0 kWh/m2/day) for photovoltaic deployment, and in the past decade the region has experienced the highest population growth in the entire country. These factors, combined with new renewable energy portfolio policies, could create an opportunity for PV to provide some of the energy that will be required to sustain this growth. The goal of the study was to investigate the potential for PV generation in the Southeast region by identifying suitable areas for utility-scale solar power plant deployment. Four states with currently low solar penetration were studied: Georgia, North Carolina, South Carolina and Tennessee. Feasible areas were assessed with Geographic Information Systems (GIS) software using solar, land use and population growth criteria combined with proximity to transmission lines and roads. After the GIS-based assessment of the areas, technological potential was calculated for each state. A multi-criteria decision analysis (MCDA) model was used to simulate the decision-making method for a strategic PV installation. The model accounted for all criteria necessary to consider in the case of PV development and also included economic and policy criteria, which are thought to be a strong influence on the PV market. Three different scenarios were established, representing decision makers' theoretical preferences. Map layers created in the first part were used as the basis for the MCDA, and additional technical, economic and political/market criteria were added. A sensitivity analysis was conducted to test the model's robustness. Finally, weighted criteria were assigned to the GIS map layers, so that the different preference systems could be visualized.
As a result, lands suitable for
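The weighted-overlay step at the heart of a GIS-based MCDA like the one described above can be sketched as follows. The criterion layers and scenario weights are invented for illustration: each criterion raster is normalized to [0, 1] and combined into a single suitability surface.

```python
import numpy as np

def normalize(layer):
    """Rescale a criterion raster to [0, 1]."""
    lo, hi = layer.min(), layer.max()
    return (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)

def suitability(layers, weights):
    """Weighted sum of normalized criterion rasters; weights are renormalized."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    return sum(wi * normalize(l) for wi, l in zip(w, layers))

# Toy 5x5 rasters: insolation (higher is better) and distance to
# transmission lines (lower is better, so it enters negated).
rng = np.random.default_rng(42)
insolation = rng.uniform(4.0, 6.0, (5, 5))       # kWh/m2/day
dist_transmission = rng.uniform(0.0, 20.0, (5, 5))  # km
score = suitability([insolation, -dist_transmission], [0.7, 0.3])
```

Re-running with different weight vectors is exactly the scenario comparison and sensitivity analysis the abstract describes.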

  12. Expected Utility and Entropy-Based Decision-Making Model for Large Consumers in the Smart Grid

    Directory of Open Access Journals (Sweden)

    Bingtuan Gao

    2015-09-01

    In the smart grid, large consumers can procure electric energy from various power sources to meet their load demands. To maximize its profit, each large consumer needs to decide its energy procurement strategy under risks such as price fluctuations in the spot market and power quality issues. In this paper, an electric energy procurement decision-making model is studied for large consumers who can obtain their electric energy from the spot market, from generation companies under bilateral contracts, from the options market and from self-production facilities in the smart grid. Considering the effect of unqualified electric energy, the profit model of large consumers is formulated. In order to measure the risks from price fluctuations and power quality, expected utility and entropy are employed. Consequently, an expected utility and entropy decision-making model is presented, which helps large consumers to minimize the expected cost of electricity procurement while properly limiting the volatility of this cost. Finally, a case study verifies the feasibility and effectiveness of the proposed model.
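An expected-utility-and-entropy criterion of the kind described can be sketched as below. The exact functional form and the trade-off parameter (written here as lam) are assumptions, not the paper's: the score rewards expected utility of profit and penalizes the Shannon entropy of the outcome distribution, so a volatile strategy is discounted.

```python
import numpy as np

def eu_entropy_score(probs, profits, u=np.log1p, lam=0.7):
    """lam * E[u(profit)] - (1 - lam) * H(outcome distribution)."""
    p = np.asarray(probs, float)
    expected_utility = float(np.dot(p, u(np.asarray(profits, float))))
    entropy = float(-np.sum(p * np.log(p, where=p > 0,
                                       out=np.zeros_like(p))))
    return lam * expected_utility - (1.0 - lam) * entropy

# Two hypothetical procurement strategies with the same expected profit:
spot_heavy = eu_entropy_score([0.5, 0.5], [0.0, 200.0])  # volatile spot buys
contract = eu_entropy_score([1.0], [100.0])              # certain contract
```

With a concave utility and the entropy penalty, the certain bilateral contract scores above the volatile spot-heavy strategy despite equal expected profit.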

  13. The Utility of Remotely-Sensed Land Surface Temperature from Multiple Platforms For Testing Distributed Hydrologic Models over Complex Terrain

    Science.gov (United States)

    Xiang, T.; Vivoni, E. R.; Gochis, D. J.

    2011-12-01

    Land surface temperature (LST) is a key parameter in watershed energy and water budgets that is relatively unexplored as a validation metric for distributed hydrologic models. Ground-based or remotely-sensed LST datasets can provide insights into a model's ability to reproduce water and energy fluxes across a large range of terrain, vegetation, soil and meteorological conditions. As a result, spatiotemporal LST observations can serve as a strong constraint for distributed simulations and can augment other available in-situ data. LST fields are particularly useful in mountainous areas where temperature varies with terrain properties and time-variable surface conditions. In this study, we collect and process remotely-sensed fields from several satellite platforms - Landsat 5/7, MODIS and ASTER - to capture spatiotemporal LST dynamics at multiple resolutions and with frequent repeat visits. We focus our analysis of these fields on the Sierra Los Locos basin (~100 km2) in Sonora, Mexico, for a period encompassing the Soil Moisture Experiment in 2004 and the North American Monsoon Experiment (SMEX04-NAME). Satellite observations are verified using a limited set of ground data from manual sampling at 30 locations and continuous measurements at 2 sites. First, we utilize the remotely-sensed fields to understand the summer seasonal evolution of LST in the basin in response to the arrival of summer storms and the vigorous ecosystem greening organized along elevation bands. Then, we utilize the ground and remote-sensing datasets to test the distributed predictions of the TIN-based Real-time Integrated Basin Simulator (tRIBS) under conditions accounting for static and dynamic vegetation patterns. Basin-averaged and distributed comparisons are carried out for two different terrain products (INEGI aerial photogrammetry and ASTER stereo processing) used to derive the distributed model domain. 
Results from the comparisons are discussed in light of the utility of remotely-sensed LST

  14. Renewable Resources: a national catalog of model projects. Volume 4. Western Solar Utilization Network Region

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    This compilation of diverse conservation and renewable energy projects across the United States was prepared through the enthusiastic participation of solar and alternate energy groups from every state and region. Compiled and edited by the Center for Renewable Resources, these projects reflect many levels of innovation and technical expertise. In many cases, a critical analysis is presented of how the projects performed and of the institutional conditions associated with their success or failure. Some 2000 projects are included in this compilation; most have worked, some have not. Information about all is presented to aid learning from these experiences. The four volumes in this set are arranged in state sections by geographic region, coinciding with the four Regional Solar Energy Centers. The table of contents is organized by project category so that maximum cross-referencing may be obtained. This volume includes information on the Western Solar Utilization Network Region. (WHK)

  15. Phenomenological model of the clavulanic acid production process utilizing Streptomyces clavuligerus

    Directory of Open Access Journals (Sweden)

    A. Baptista-Neto

    2000-12-01

    The kinetics of the clavulanic acid production process by Streptomyces clavuligerus NRRL 3585 was studied. Experiments were carried out in a 4-liter bioreactor, utilizing two complex media containing glycerol as the carbon and energy source, and peptone or Samprosoy 90NB (soybean protein) as the nitrogen source. Temperature was kept at 28°C and the dissolved oxygen was controlled automatically at 40% of the saturation value. Samples were withdrawn for determination of cell mass (peptone medium only), glycerol and product concentrations. Gas analyzers allowed on-line determination of CO2 and O2 contents in the exit gas. With Samprosoy, cell mass was evaluated by determining glycerol consumption and assuming the cell yield, Y_X/S, to be the same in both cases. Oxygen uptake and CO2 production rates were strongly related to growth and substrate consumption, allowing determination of stoichiometric constants in relation to growth, substrate, oxygen, product and carbon dioxide.
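The abstract's cell-mass estimate from glycerol consumption can be sketched as a simple yield-coefficient calculation, X = X0 + Y_X/S * (S0 - S). This is only an illustration of that relation; all numeric values below are hypothetical, not taken from the paper:

```python
# Estimate biomass concentration from substrate (glycerol) consumption
# using a constant cell yield coefficient Y_X/S: X = X0 + Y_X/S * (S0 - S).

def biomass_from_glycerol(x0, s0, s, y_xs):
    """Return estimated cell mass concentration (g/L).

    x0   -- initial biomass (g/L)
    s0   -- initial glycerol concentration (g/L)
    s    -- current glycerol concentration (g/L)
    y_xs -- cell yield on substrate, g biomass per g glycerol (assumed constant)
    """
    if s > s0:
        raise ValueError("substrate concentration cannot exceed its initial value")
    return x0 + y_xs * (s0 - s)

# Illustrative run: 15 g/L glycerol drawn down to 5 g/L at Y_X/S = 0.4
print(biomass_from_glycerol(0.5, 15.0, 5.0, 0.4))  # 4.5
```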

  16. Models and Tabu Search Metaheuristics for Service Network Design with Asset-Balance Requirements

    DEFF Research Database (Denmark)

    Pedersen, Michael Berliner; Crainic, T.G.; Madsen, Oli B.G.

    2009-01-01

    This paper focuses on a generic model for service network design, which includes asset positioning and utilization through constraints on asset availability at terminals. We denote these relations as "design-balance constraints" and focus on the design-balanced capacitated multicommodity network design model, a generalization of the capacitated multicommodity network design model generally used in service network design applications. Both arc- and cycle-based formulations for the new model are presented. The paper also proposes a tabu search metaheuristic framework for the arc-based formulation. Results on a wide range of network design problem instances from the literature indicate the proposed method behaves very well in terms of computational efficiency and solution quality.

  17. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    DEFF Research Database (Denmark)

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge of their decomposition rate and ...

  18. An emergency dispatch model considering the urgency of the requirement for reliefs in different disaster areas

    Directory of Open Access Journals (Sweden)

    Liu Sheng

    2015-11-01

    Purpose: Frequent sudden-onset disasters, which have threatened the survival of humans and the development of society, force the public to pay increasing attention to emergency management. A challenging task in the process of emergency management is the emergency dispatch of reliefs. An emergency dispatch model considering the urgency of the requirement for reliefs in different disaster areas is proposed in this paper to dispatch reliefs reasonably and reduce the effect of sudden-onset disasters. Design/methodology/approach: First, a quantitative assessment of the urgency of the requirement for reliefs in different disaster areas is made using an evaluation method, proposed in this paper, based on Fuzzy Comprehensive Evaluation and improved Evidence Reasoning. Then, based on the quantitative results, an emergency dispatch model is proposed that aims to minimize the response time, the distribution cost and the unsatisfied rate of the requirement for reliefs, reflecting the requests of disaster areas under emergency: the urgency of requirement, the economy of distribution and the equity of allocation. Finally, the Genetic Algorithm is improved, based on adaptive crossover and mutation probability functions, to solve the emergency dispatch model. Findings and Originality/value: A case in which the Y hydraulic power enterprise carries out emergency dispatch of reliefs under continuous sudden-onset heavy rain is given to illustrate the availability of the emergency dispatch model proposed in this paper. The results show that the emergency dispatch model meets the distribution priority requirement of the disaster area with the higher urgency, so that reliefs are supplied more timely. Research limitations/implications: The emergency dispatch model faced with large-scale sudden-onset disasters is complex. The quantity of reliefs that a disaster area requires and the running time of vehicles are viewed as available information, and the problem

  19. The utility of Apc-mutant rats in modeling human colon cancer

    Directory of Open Access Journals (Sweden)

    Amy A. Irving

    2014-11-01

    Prior to the advent of genetic engineering in the mouse, the rat was the model of choice for investigating the etiology of cancer. Now, recent advances in the manipulation of the rat genome, combined with a growing recognition of the physiological differences between mice and rats, have reignited interest in the rat as a model of human cancer. Two recently developed rat models, the polyposis in the rat colon (Pirc) and Kyoto Apc Delta (KAD) strains, each carry mutations in the intestinal-cancer-associated adenomatous polyposis coli (Apc) gene. In contrast to mouse models carrying Apc mutations, in which cancers develop mainly in the small intestine rather than in the colon and there is no gender bias, these rat models exhibit colonic predisposition and gender-specific susceptibility, as seen in human colon cancer. The rat also provides other experimental resources as a model organism that are not provided by the mouse: the structure of its chromosomes facilitates the analysis of genomic events, the size of its colon permits longitudinal analysis of tumor growth, and the size of biological samples from the animal facilitates multiplexed molecular analyses of the tumor and its host. Thus, the underlying biology and experimental resources of these rat models provide important avenues for investigation. We anticipate that advances in disease modeling in the rat will synergize with resources that are being developed in the mouse to provide a deeper understanding of human colon cancer.

  20. The utility of Apc-mutant rats in modeling human colon cancer

    Science.gov (United States)

    Irving, Amy A.; Yoshimi, Kazuto; Hart, Marcia L.; Parker, Taybor; Clipson, Linda; Ford, Madeline R.; Kuramoto, Takashi; Dove, William F.; Amos-Landgraf, James M.

    2014-01-01

    Prior to the advent of genetic engineering in the mouse, the rat was the model of choice for investigating the etiology of cancer. Now, recent advances in the manipulation of the rat genome, combined with a growing recognition of the physiological differences between mice and rats, have reignited interest in the rat as a model of human cancer. Two recently developed rat models, the polyposis in the rat colon (Pirc) and Kyoto Apc Delta (KAD) strains, each carry mutations in the intestinal-cancer-associated adenomatous polyposis coli (Apc) gene. In contrast to mouse models carrying Apc mutations, in which cancers develop mainly in the small intestine rather than in the colon and there is no gender bias, these rat models exhibit colonic predisposition and gender-specific susceptibility, as seen in human colon cancer. The rat also provides other experimental resources as a model organism that are not provided by the mouse: the structure of its chromosomes facilitates the analysis of genomic events, the size of its colon permits longitudinal analysis of tumor growth, and the size of biological samples from the animal facilitates multiplexed molecular analyses of the tumor and its host. Thus, the underlying biology and experimental resources of these rat models provide important avenues for investigation. We anticipate that advances in disease modeling in the rat will synergize with resources that are being developed in the mouse to provide a deeper understanding of human colon cancer. PMID:25288683

  1. Organizing a Special Education Program Utilizing the Theory Z Model of Management.

    Science.gov (United States)

    Lindeman, David P.; And Others

    An alternative management model was implemented to increase teacher productivity in an institutional school for 185 mildly to severely mentally retarded children. The model included three components: a management structure that allowed for problem solving while still motivating staff; an inservice program for staff; and an evaluation system to…

  2. A Brand Loyalty Model Utilizing Team Identification and Customer Satisfaction in the Licensed Sports Product Industry

    Science.gov (United States)

    Lee, Soonhwan; Shin, Hongbum; Park, Jung-Jun; Kwon, Oh-Ryun

    2010-01-01

    The purpose of this study was to investigate the relationship among the attitudinal brand loyalty variables (i.e., cognitive, affective, and conative components), team identification, and customer satisfaction by developing a structural equation model, based on Oliver's (1997) attitudinal brand loyalty model. The results of this study confirmed…

  3. Data availability and model complexity, generality, and utility: a reply to Lonergan

    NARCIS (Netherlands)

    Evans, M.R.; Benton, T.G.; Grimm, V.; Lessells, C.M.; O'Malley, M.A.; Moustakas, A.; Weisberg, M.

    2014-01-01

    Comment on: Evans MR, Grimm V, Johst K, Knuuttila T, de Langhe R, Lessells CM, Merz M, O'Malley MA, Orzack SH, Weisberg M, et al. Do simple models lead to generality in ecology? Trends Ecol Evol. 2013 Oct;28(10):578-83.

  5. A Case Study of Higher Education Competency Models Utilizing an Assessment Framework

    Science.gov (United States)

    Uden, Jayme

    2012-01-01

    The overall purpose of this study is to explore the creation and implementation of competency models in higher education masters level preparation programs. The study answers five research questions. Why and how did two higher education preparation programs create a professional competency model for the graduate students in the program and what…

  6. Modeling of Car-Following Required Safe Distance Based on Molecular Dynamics

    OpenAIRE

    Dayi Qu; Xiufeng Chen; Wansan Yang; Xiaohua Bian

    2014-01-01

    In car-following procedure, some distances are reserved between the vehicles, through which drivers can avoid collisions with vehicles before and after them in the same lane and keep a reasonable clearance with lateral vehicles. This paper investigates characters of vehicle operating safety in car following state based on required safe distance. To tackle this problem, we probe into required safe distance and car-following model using molecular dynamics, covering longitudinal and lateral safe...
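The paper's molecular-dynamics formulation is not reproduced in the abstract above. As a rough illustration of the general notion of a required safe distance (not the authors' model), a standard kinematic stopping-distance calculation might look like this, with all parameter values hypothetical:

```python
# Illustrative kinematic required safe distance between a following and a
# leading vehicle: reaction-time travel plus the difference in braking
# distances, plus a standstill margin. Not the paper's molecular-dynamics model.

def required_safe_distance(v_follow, v_lead, t_react, a_brake, d_min=2.0):
    """Minimum clearance (m) so the follower can stop without collision.

    v_follow, v_lead -- speeds in m/s
    t_react          -- driver reaction time in s
    a_brake          -- braking deceleration in m/s^2 (positive)
    d_min            -- standstill safety margin in m (assumed value)
    """
    d_follow = v_follow * t_react + v_follow ** 2 / (2 * a_brake)  # follower's stopping distance
    d_lead = v_lead ** 2 / (2 * a_brake)                           # leader's stopping distance
    return max(d_follow - d_lead, 0.0) + d_min

# 20 m/s follower behind a 15 m/s leader, 1 s reaction, 6 m/s^2 braking
print(round(required_safe_distance(20.0, 15.0, 1.0, 6.0), 2))  # 36.58
```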

  7. Pseudomonas lini Strain ZBG1 Revealed Carboxylic Acid Utilization and Copper Resistance Features Required for Adaptation to Vineyard Soil Environment: A Draft Genome Analysis

    Science.gov (United States)

    Chan, Kok-Gan; Chong, Teik-Min; Adrian, Tan-Guan-Sheng; Kher, Heng Leong; Grandclément, Catherine; Faure, Denis; Yin, Wai-Fong; Dessaux, Yves; Hong, Kar-Wai

    2016-01-01

    Pseudomonas lini strain ZBG1 was isolated from the soil of a vineyard in Zellenberg, France, and its draft genome is reported in this study. Bioinformatics analyses of the genome revealed the presence of genes encoding tartaric and malic acid utilization, as well as copper resistance, that correspond to the adaptation of this strain to the vineyard soil environment. PMID:27512520

  8. IDENTIFYING OPERATIONAL REQUIREMENTS TO SELECT SUITABLE DECISION MODELS FOR A PUBLIC SECTOR EPROCUREMENT DECISION SUPPORT SYSTEM

    Directory of Open Access Journals (Sweden)

    Mohamed Adil

    2014-10-01

    Public sector procurement should be a transparent and fair process. Strict legal requirements are enforced on public sector procurement to make it a standardised process. To make fair decisions on selecting suppliers, a practical method that adheres to the legal requirements is important. The research on which this paper is based aimed at identifying a suitable Multi-Criteria Decision Analysis (MCDA) method for the specific legal and functional needs of the Maldivian public sector. To identify such operational requirements, a set of focus group interviews was conducted in the Maldives with public officials responsible for procurement decision making. Based on the operational requirements identified through the focus groups, a criteria-based evaluation of published MCDA methods was carried out to identify the methods suitable for e-procurement decision making. This paper describes the identification of the operational requirements and the results of the evaluation to select suitable decision models for the Maldivian context.

  9. A Logistic Regression Model with a Hierarchical Random Error Term for Analyzing the Utilization of Public Transport

    Directory of Open Access Journals (Sweden)

    Chong Wei

    2015-01-01

    Logistic regression models have been widely used in previous studies to analyze public transport utilization. These studies have shown travel time to be an indispensable variable for such analysis and usually consider it to be a deterministic variable. This formulation does not allow us to capture travelers’ perception error regarding travel time, and recent studies have indicated that this error can have a significant effect on modal choice behavior. In this study, we propose a logistic regression model with a hierarchical random error term. The proposed model adds a new random error term for the travel time variable. This term structure enables us to investigate travelers’ perception error regarding travel time from a given choice behavior dataset. We also propose an extended model that allows constraining the sign of this error in the model. We develop two Gibbs samplers to estimate the basic hierarchical model and the extended model. The performance of the proposed models is examined using a well-known dataset.
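The core idea of the hierarchical error term can be sketched as follows: the travel time entering the logistic utility is the measured time plus a traveler-specific perception error. This is a minimal illustration, not the authors' Gibbs-sampled model; all coefficient values are hypothetical:

```python
import math
import random

# Sketch: P(choose public transport) under a logistic model where the
# perceived travel time = measured travel time + Normal(0, perception_sd).
# beta0 and beta_t are made-up coefficients for illustration only.

def choice_probability(travel_time, beta0=0.5, beta_t=-0.1,
                       perception_sd=0.0, rng=None):
    rng = rng or random.Random(0)
    perceived = travel_time + rng.gauss(0.0, perception_sd)  # perception error
    utility = beta0 + beta_t * perceived
    return 1.0 / (1.0 + math.exp(-utility))

# Deterministic case (no perception error): a 30-minute trip
print(round(choice_probability(30.0), 3))  # 0.076
```

With `perception_sd > 0`, repeated draws spread the predicted probability, which is exactly the behavioral effect the hierarchical term is meant to capture.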

  10. Implications of Model Structure and Detail for Utility Planning. Scenario Case Studies using the Resource Planning Model

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Barrows, Clayton [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Eurek, Kelly [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-04-23

    We examine how model investment decisions change under different model configurations and assumptions related to renewable capacity credit, the inclusion or exclusion of operating reserves, dispatch period sampling, transmission power flow modeling, renewable spur line costs, and the ability of a planning region to import and export power. For all modeled scenarios, we find that under market conditions where new renewable deployment is predominantly driven by renewable portfolio standards, model representations of wind and solar capacity credit and interactions between balancing areas are most influential in avoiding model investments in excess thermal capacity. We also compare computation time between configurations to evaluate tradeoffs between computational burden and model accuracy. From this analysis, we find that certain advanced dispatch representations (e.g., DC optimal power flow) can have dramatic adverse effects on computation time but can be largely inconsequential to model investment outcomes, at least at the renewable penetration levels modeled. Finally, we find that certain underappreciated aspects of new capacity investment decisions and model representations thereof, such as spur lines for new renewable capacity, can influence model outcomes particularly in the renewable technology and location chosen by the model. Though this analysis is not comprehensive and results are specific to the model region, input assumptions, and optimization-modeling framework employed, the findings are intended to provide a guide for model improvement opportunities.

  11. Lagrangian Particle Dispersion Model Intercomparison and Evaluation Utilizing Measurements from Controlled Tracer Release Experiments

    Science.gov (United States)

    Hegarty, J. D.; Draxler, R.; Stein, A. F.; Brioude, J.; Eluszkiewicz, J.; Mountain, M.; Nehrkorn, T.; Andrews, A. E.

    2012-12-01

    The accuracy of greenhouse gas (GHG) fluxes estimated using inverse methods is highly dependent on the fidelity of the atmospheric transport model employed. Lagrangian particle dispersion models (LPDMs) driven by customized meteorological output from mesoscale models have emerged as a powerful tool in inverse GHG estimates at policy-relevant regional and urban scales, for several reasons: 1) Mesoscale meteorology can be available at higher resolution than in most global models, and therefore has the potential to be more realistic, 2) the Lagrangian approach minimizes numerical diffusion present in Eulerian models and is thus better able to represent transport in the near-field of measurement locations, and 3) the Lagrangian approach offers an efficient way to compute the grid-scale adjoint of the transport model ("footprints") by running transport backwards in time. Motivated by these considerations, we intercompare three widely used LPDMs (HYSPLIT, STILT, and FLEXPART) driven by identical meteorological input from the Weather Research and Forecasting (WRF) model against measurements from the controlled tracer release experiments (ready-testbed.arl.noaa.gov/HYSPLIT_datem.php). Our analysis includes statistical assessments of each LPDM in terms of its ability to simulate the observed tracer concentrations, reversibility, and sensitivity to the WRF configuration, particularly with regard to the simulation of the planetary boundary layer.

  12. Animal models and therapeutic molecular targets of cancer: utility and limitations.

    Science.gov (United States)

    Cekanova, Maria; Rathore, Kusum

    2014-01-01

    Cancer is the term used to describe over 100 diseases that share several common hallmarks. Despite prevention, early detection, and novel therapies, cancer is still the second leading cause of death in the USA. Successful bench-to-bedside translation of basic scientific findings about cancer into therapeutic interventions for patients depends on the selection of appropriate animal experimental models. Cancer research uses animal and human cancer cell lines in vitro to study biochemical pathways in these cancer cells. In this review, we summarize the important animal models of cancer with focus on their advantages and limitations. Mouse cancer models are well known, and are frequently used for cancer research. Rodent models have revolutionized our ability to study gene and protein functions in vivo and to better understand their molecular pathways and mechanisms. Xenograft and chemically or genetically induced mouse cancers are the most commonly used rodent cancer models. Companion animals with spontaneous neoplasms are still an underexploited tool for making rapid advances in human and veterinary cancer therapies by testing new drugs and delivery systems that have shown promise in vitro and in vivo in mouse models. Companion animals have a relatively high incidence of cancers, with biological behavior, response to therapy, and response to cytotoxic agents similar to those in humans. Shorter overall lifespan and more rapid disease progression are factors contributing to the advantages of a companion animal model. In addition, the current focus is on discovering molecular targets for new therapeutic drugs to improve survival and quality of life in cancer patients.

  13. On the utility of proxy system models for estimating climate states over the common era

    Science.gov (United States)

    Dee, Sylvia G.; Steiger, Nathan J.; Emile-Geay, Julien; Hakim, Gregory J.

    2016-09-01

    Paleoclimate data assimilation has recently emerged as a promising technique to estimate past climate states. Here we test two of the underlying assumptions of paleoclimate data assimilation as applied so far: (1) climate proxies can be modeled as linear, univariate recorders of temperature and (2) structural errors in GCMs can be neglected. To investigate these two points and related uncertainties, we perform a series of synthetic, paleoclimate data assimilation-based reconstructions where "pseudo" proxies are generated with physically based proxy system models (PSMs) for coral δ18O, tree ring width, and ice core δ18O using two isotope-enabled atmospheric general circulation models. For (1), we find that linear-univariate models efficiently capture the GCM's climate in ice cores and corals and do not lead to large losses in reconstruction skill. However, this does not hold for tree ring width, especially in regions where the trees' response is dominated by moisture supply; we quantify how the breakdown of this assumption lowers reconstruction skill for each proxy class. For (2), we find that climate model biases can introduce errors that greatly reduce reconstruction skill, with or without perfect proxy system models. We explore possible strategies for mitigating structural modeling errors in GCMs and discuss implications for paleoclimate reanalyses.
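Assumption (1) above, a proxy as a linear, univariate recorder of temperature, can be sketched by generating pseudoproxies and scoring them with a simple correlation-based skill metric. The slope, noise level, and skill metric below are illustrative choices, not taken from the paper's proxy system models:

```python
import random
import statistics

def make_pseudoproxy(temps, slope=0.2, intercept=0.0, noise_sd=0.1, seed=42):
    """Linear, univariate pseudoproxy: proxy = slope*T + intercept + white noise."""
    rng = random.Random(seed)
    return [slope * t + intercept + rng.gauss(0.0, noise_sd) for t in temps]

def correlation(x, y):
    """Pearson correlation, used here as a crude reconstruction-skill metric."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

rng = random.Random(1)
temps = [rng.gauss(0.0, 1.0) for _ in range(200)]  # synthetic "true" temperatures
proxy = make_pseudoproxy(temps)
print(round(correlation(proxy, temps), 2))         # high skill when noise is small
```

Raising `noise_sd`, or making the response depend on a second variable such as moisture (as for tree ring width), degrades the correlation, mirroring the skill losses the study quantifies.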

  14. Mars Colony in situ resource utilization: An integrated architecture and economics model

    Science.gov (United States)

    Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff

    2017-09-01

    This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
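The Monte Carlo profitability idea in the economics model can be illustrated with a toy net-present-value simulation: sample an uncertain market price, discount the resulting cash flows, and report the fraction of trials with positive NPV. Every figure below (capex, output, price range, discount rate) is hypothetical, not from the paper:

```python
import random

def npv(cash_flows, rate):
    """Net present value of yearly cash flows at a given discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def prob_profitable(capex, yearly_output, price_low, price_high,
                    opex, years=10, rate=0.08, n_trials=10_000, seed=7):
    """Fraction of Monte Carlo trials in which the operation has NPV > 0."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_trials):
        price = rng.uniform(price_low, price_high)  # uncertain market price
        flows = [-capex] + [yearly_output * price - opex] * years
        if npv(flows, rate) > 0:
            wins += 1
    return wins / n_trials

print(prob_profitable(capex=500.0, yearly_output=10.0,
                      price_low=5.0, price_high=15.0, opex=20.0))
```

In a full model each of the many market and technical parameters would be sampled from its own distribution rather than fixed.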

  15. Animal models and therapeutic molecular targets of cancer: utility and limitations

    Directory of Open Access Journals (Sweden)

    Cekanova M

    2014-10-01

    Maria Cekanova, Kusum Rathore. Department of Small Animal Clinical Sciences, College of Veterinary Medicine, The University of Tennessee, Knoxville, TN, USA. Abstract: Cancer is the term used to describe over 100 diseases that share several common hallmarks. Despite prevention, early detection, and novel therapies, cancer is still the second leading cause of death in the USA. Successful bench-to-bedside translation of basic scientific findings about cancer into therapeutic interventions for patients depends on the selection of appropriate animal experimental models. Cancer research uses animal and human cancer cell lines in vitro to study biochemical pathways in these cancer cells. In this review, we summarize the important animal models of cancer with focus on their advantages and limitations. Mouse cancer models are well known, and are frequently used for cancer research. Rodent models have revolutionized our ability to study gene and protein functions in vivo and to better understand their molecular pathways and mechanisms. Xenograft and chemically or genetically induced mouse cancers are the most commonly used rodent cancer models. Companion animals with spontaneous neoplasms are still an underexploited tool for making rapid advances in human and veterinary cancer therapies by testing new drugs and delivery systems that have shown promise in vitro and in vivo in mouse models. Companion animals have a relatively high incidence of cancers, with biological behavior, response to therapy, and response to cytotoxic agents similar to those in humans. Shorter overall lifespan and more rapid disease progression are factors contributing to the advantages of a companion animal model. In addition, the current focus is on discovering molecular targets for new therapeutic drugs to improve survival and quality of life in cancer patients. Keywords: mouse cancer model, companion animal cancer model, dogs, cats, molecular targets

  16. Fan fiction metadata creation and utilization within fan fiction archives: Three primary models

    Directory of Open Access Journals (Sweden)

    Shannon Fay Johnson

    2014-09-01

    Issues related to searchability and ease of access have plagued fan fiction since its inception. This paper discusses the predominant forms of fan-mediated indexing and descriptive metadata, commonly referred to as folksonomy or tagging, and compares the benefits and disadvantages of each model. These models fall into three broad categories: free tagging, controlled vocabulary, and hybrid folksonomy. Each model has distinct advantages and shortcomings related to findability, results filtering, and creative empowerment. Examples of each are provided. Possible ramifications for fan fiction from improved metadata and access are also discussed.

  17. Not just a theory--the utility of mathematical models in evolutionary biology.

    Directory of Open Access Journals (Sweden)

    Maria R Servedio

    2014-12-01

    Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

  18. A Stochastic Traffic Assignment Model Considering Differences in Passengers Utility Functions

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker

    1997-01-01

    The paper presents a framework for public transport assignment that builds on the probit-based model of Sheffi & Powell (1981 & 1982). Hereby, the problems with overlapping routes that occur in many public transport models can be avoided. In the paper, the probit-based model in its pure form ... to describe differences in the distribution of travel times and waiting times for different sub-modes. Parallel lines are frequency-aggregated in order to handle waiting times appropriately. Initial tests show that the methodology can be estimated to describe route choices in public transport very well ...

  19. Modeling and simulation of combined gas turbine engine and heat pipe system for waste heat recovery and utilization

    Energy Technology Data Exchange (ETDEWEB)

    Lamfon, N.J. [Saudi Aramco Jeddah Refinery, Jeddah (Saudi Arabia); Najjar, Y.S.H.; Akyurt, M. [King Abdulaziz Univ., Mechanical Engineering Dept., Jeddah (Saudi Arabia)

    1998-12-01

    The results of a modeling and simulation study are presented for a combined system consisting of a gas turbine engine, a heat pipe recovery system and an inlet-air cooling system. The presentation covers performance data related to the gas turbine engine with precooled air intake as coupled to the water-in-copper heat pipe recovery system. This is done by matching the two mathematical models. The net power output is improved by 11% when the gas turbine engine is supplied with cold air produced by the heat-pipe recovery and utilization system. It is further concluded from the results produced by the combined mathematical model that the thermal efficiency of the gas turbine engine rises to 6% at 75% part load. It is to be anticipated that this rising trend in increases of thermal efficiency of the gas turbine engine would continue for operations at other (lower) part load conditions. (author)

  20. A corpus for mining drug-related knowledge from Twitter chatter: Language models and their utilities

    Directory of Open Access Journals (Sweden)

    Abeed Sarker

    2017-02-01

    In this data article, we present to the data science, natural language processing and public health communities an unlabeled corpus and a set of language models. We collected the data from Twitter using drug names as keywords, including their common misspelled forms. Using this data, which is rich in drug-related chatter, we developed language models to aid the development of data mining tools and methods in this domain. We generated several models that capture (i) distributed word representations and (ii) probabilities of n-gram sequences. The data set we are releasing consists of 267,215 Twitter posts made during the four-month period from November 2014 to February 2015. The posts mention over 250 drug-related keywords. The language models encapsulate semantic and sequential properties of the texts.
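The kind of n-gram statistics such language models encapsulate can be sketched with maximum-likelihood bigram probabilities over a tiny, made-up sample of drug-related chatter (the sentences below are invented, not from the released corpus):

```python
from collections import Counter

# Maximum-likelihood bigram model: P(w2 | w1) = count(w1 w2) / count(w1).

def bigram_probs(sentences):
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = sent.lower().split()
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return {pair: count / unigrams[pair[0]] for pair, count in bigrams.items()}

corpus = [
    "took ibuprofen for my headache",
    "ibuprofen for back pain",
]
probs = bigram_probs(corpus)
print(probs[("ibuprofen", "for")])  # 1.0: "for" always follows "ibuprofen" here
```

A production model would add smoothing for unseen n-grams; this sketch only shows the raw counting that underlies the released probability tables.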

  1. Characterizing biogenous sediments using multibeam echosounder backscatter data - Estimating power law parameter utilizing various models

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Kodagali, V.N.

    In this paper, the Helmholtz-Kirchhoff (H-K) roughness model is employed to characterize seafloor sediment and roughness parameters from the eastern sector of the Southern Ocean. The multibeam Hydrosweep system's angular-backscatter data, which...

  2. Utilizing multiple scale models to improve predictions of extra-axial hemorrhage in the immature piglet.

    Science.gov (United States)

    Scott, Gregory G; Margulies, Susan S; Coats, Brittany

    2016-10-01

    Traumatic brain injury (TBI) is a leading cause of death and disability in the USA. To help understand and better predict TBI, researchers have developed complex finite element (FE) models of the head which incorporate many biological structures such as scalp, skull, meninges, brain (with gray/white matter differentiation), and vasculature. However, most models drastically simplify the membranes and substructures between the pia and arachnoid membranes. We hypothesize that substructures in the pia-arachnoid complex (PAC) contribute substantially to brain deformation following head rotation, and that when included in FE models accuracy of extra-axial hemorrhage prediction improves. To test these hypotheses, microscale FE models of the PAC were developed to span the variability of PAC substructure anatomy and regional density. The constitutive response of these models were then integrated into an existing macroscale FE model of the immature piglet brain to identify changes in cortical stress distribution and predictions of extra-axial hemorrhage (EAH). Incorporating regional variability of PAC substructures substantially altered the distribution of principal stress on the cortical surface of the brain compared to a uniform representation of the PAC. Simulations of 24 non-impact rapid head rotations in an immature piglet animal model resulted in improved accuracy of EAH prediction (to 94 % sensitivity, 100 % specificity), as well as a high accuracy in regional hemorrhage prediction (to 82-100 % sensitivity, 100 % specificity). We conclude that including a biofidelic PAC substructure variability in FE models of the head is essential for improved predictions of hemorrhage at the brain/skull interface.

  3. Utility of Parental Mediation Model on Youth’s Problematic Online Gaming

    OpenAIRE

    Benrazavi, R; Teimouri, M; Griffiths, MD

    2015-01-01

    The Parental Mediation Model (PMM) was initially designed to regulate children’s attitudes towards the traditional media. In the present era, because of prevalent online media there is a need for similar regulative measures. Spending long hours on social media and playing online games increase the risks of exposure to the negative outcomes of online gaming. This paper initially applied the PMM developed by European Kids Online to (i) test the reliability and validity of this model and (ii) ide...

  4. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design...... at later design stages. However, languages employed for requirements modeling today do not offer the expressiveness necessary to represent control purposes in relation to domain level interactions and therefore miss several types of interdependencies. This paper introduces the idea of control structure...

  5. Experimental and Numerical Analysis of Triaxially Braided Composites Utilizing a Modified Subcell Modeling Approach

    Science.gov (United States)

    Cater, Christopher; Xiao, Xinran; Goldberg, Robert K.; Kohlman, Lee W.

    2015-01-01

    A combined experimental and analytical approach was performed for characterizing and modeling triaxially braided composites with a modified subcell modeling strategy. Tensile coupon tests were conducted on a [0°/60°/−60°] braided composite at angles of 0°, 30°, 45°, 60° and 90° relative to the axial tow of the braid. It was found that measured coupon strength varied significantly with the angle of the applied load and each coupon direction exhibited unique final failures. The subcell modeling approach implemented into the finite element software LS-DYNA was used to simulate the various tensile coupon test angles. The modeling approach was successful in predicting both the coupon strength and reported failure mode for the 0°, 30° and 60° loading directions. The model over-predicted the strength in the 90° direction; however, the experimental results show a strong influence of free edge effects on damage initiation and failure. In the absence of these local free edge effects, the subcell modeling approach showed promise as a viable and computationally efficient analysis tool for triaxially braided composite structures. Future work will focus on validation of the approach for predicting the impact response of the braided composite against flat panel impact tests.

  6. A WYNER-ZIV VIDEO CODING METHOD UTILIZING MIXTURE CORRELATION NOISE MODEL

    Institute of Scientific and Technical Information of China (English)

    Hu Xiaofei; Zhu Xiuchang

    2012-01-01

    In Wyner-Ziv (WZ) Distributed Video Coding (DVC), a correlation noise model is often used to describe the error distribution between the WZ frame and the side information. The accuracy of the model directly influences the performance of the video coder. A mixture correlation noise model in the Discrete Cosine Transform (DCT) domain for WZ video coding is established in this paper. Different correlation noise estimation methods are used for the direct current and alternating current coefficients. A parameter estimation method based on the expectation-maximization algorithm is used to estimate the Laplace distribution center of the direct current frequency band, and a Mixture Laplace-Uniform Distribution Model (MLUDM) is established for the alternating current coefficients. Experimental results suggest that the proposed mixture correlation noise model can describe the heavy tail and sudden changes of the noise accurately at high rates, and that it makes a significant improvement in coding efficiency compared with the noise model presented by DIStributed COding for Video sERvices (DISCOVER).
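
    The Laplace-center estimation for the DC band can be illustrated with a plain maximum-likelihood Laplace fit (location = median, scale = mean absolute deviation); this is a sketch with made-up residuals, not the paper's EM procedure for the mixture model:

```python
import statistics

def fit_laplace(residuals):
    """Maximum-likelihood Laplace fit: location mu = median, scale b = mean |x - mu|."""
    mu = statistics.median(residuals)
    b = sum(abs(x - mu) for x in residuals) / len(residuals)
    return mu, b

# hypothetical DC-band residuals between a WZ frame and its side information
mu, b = fit_laplace([-4.0, -1.0, 0.0, 1.0, 4.0])
print(mu, b)  # 0.0 2.0
```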

  7. Construction and utilization of linear empirical core models for PWR in-core fuel management

    Energy Technology Data Exchange (ETDEWEB)

    Okafor, K.C.

    1988-01-01

    An empirical core-model construction procedure for pressurized water reactor (PWR) in-core fuel management is developed that allows determining the optimal BOC k{sub {infinity}} profiles in PWRs as a single linear-programming problem and thus facilitates the overall optimization process for in-core fuel management due to algorithmic simplification and reduction in computation time. The optimal profile is defined as one that maximizes cycle burnup. The model construction scheme treats the fuel-assembly power fractions, burnup, and leakage as state variables and BOC zone enrichments as control variables. The core model consists of linear correlations between the state and control variables that describe fuel-assembly behavior in time and space. These correlations are obtained through time-dependent two-dimensional core simulations. The core model incorporates the effects of composition changes in all the enrichment control zones on a given fuel assembly and is valid at all times during the cycle for a given range of control variables. No assumption is made on the geometry of the control zones. A scatter-composition distribution, as well as annular, can be considered for model construction. The application of the methodology to a typical PWR core indicates good agreement between the model and exact simulation results.

  8. Sensitivity Analysis of Corrosion Rate Prediction Models Utilized for Reinforced Concrete Affected by Chloride

    Science.gov (United States)

    Siamphukdee, Kanjana; Collins, Frank; Zou, Roger

    2013-06-01

    Chloride-induced reinforcement corrosion is one of the major causes of premature deterioration in reinforced concrete (RC) structures. Given the high maintenance and replacement costs, accurate modeling of RC deterioration is indispensable for ensuring the optimal allocation of limited economic resources. Since corrosion rate is one of the major factors influencing the rate of deterioration, many predictive models exist. However, because the existing models use very different sets of input parameters, the choice of model for RC deterioration is made difficult. Although the factors affecting corrosion rate are frequently reported in the literature, there is no published quantitative study on the sensitivity of predicted corrosion rate to the various input parameters. This paper presents the results of the sensitivity analysis of the input parameters for nine selected corrosion rate prediction models. Three different methods of analysis are used to determine and compare the sensitivity of corrosion rate to various input parameters: (i) univariate regression analysis, (ii) multivariate regression analysis, and (iii) sensitivity index. The results from the analysis have quantitatively verified that the corrosion rate of steel reinforcement bars in RC structures is highly sensitive to corrosion duration time, concrete resistivity, and concrete chloride content. These important findings establish that future empirical models for predicting corrosion rate of RC should carefully consider and incorporate these input parameters.
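
    Method (iii), the sensitivity index, can be sketched as below; the surrogate corrosion-rate function and its parameter values are assumptions for illustration, not one of the nine reviewed models:

```python
def sensitivity_index(f, params, name, low, high):
    """SI = (y_max - y_min) / y_max, varying one input over [low, high]
    while the other inputs stay at their base values."""
    ys = [f({**params, name: v}) for v in (low, high)]
    y_min, y_max = min(ys), max(ys)
    return (y_max - y_min) / y_max

# toy surrogate: corrosion rate grows with chloride content,
# falls with concrete resistivity (assumed form)
rate = lambda p: p["chloride"] / p["resistivity"]
base = {"chloride": 0.5, "resistivity": 100.0}
print(sensitivity_index(rate, base, "resistivity", 50.0, 200.0))  # ~0.75
```

    An index near 1 flags an input the prediction is highly sensitive to; near 0, an input that barely matters over its plausible range.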

  9. Mathematical Model for the Optimal Utilization Percentile in M/M/1 Systems: A Contribution about Knees in Performance Curves

    CERN Document Server

    Gonzalez-Horta, Francisco A; Ramirez-Cortes, Juan M; Martinez-Carballido, Jorge; Buenfil-Alpuche, Eldamira

    2011-01-01

    Performance curves of queueing systems can be analyzed by separating them into three regions: the flat region, the knee region, and the exponential region. Practical considerations usually locate the knee region between 70-90% of the theoretical maximum utilization. However, there is no clear agreement about where the boundaries between regions are, and where exactly the utilization knee is located. An open debate about knees in performance curves was undertaken at least 20 years ago. This historical debate is mainly divided between those who claim that a knee in the curve is not a well-defined term in mathematics, or that it is a subjective and not really meaningful concept, and those who define knees mathematically and consider their relevance and application. In this paper, we present a mathematical model and analysis for identifying the three mentioned regions on performance curves for M/M/1 systems; specifically, we found the knees, or optimal utilization percentiles, at the vertices of the hyperbolas tha...
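
    The curve under debate follows from the standard M/M/1 mean-response-time formula T = 1/(mu(1 - rho)); the sketch below merely reproduces that hyperbolic growth and is not the paper's vertex derivation:

```python
def mm1_response_time(rho, mu=1.0):
    """Mean response time T = 1 / (mu * (1 - rho)) for an M/M/1 queue
    at utilization rho, with service rate mu."""
    if not 0 <= rho < 1:
        raise ValueError("utilization must lie in [0, 1)")
    return 1.0 / (mu * (1.0 - rho))

# response time grows hyperbolically as utilization approaches 1
for rho in (0.5, 0.7, 0.9, 0.95):
    print(f"rho={rho:.2f}  T={mm1_response_time(rho):.1f} service times")
```

    Between 70% and 90% utilization the response time roughly triples, which is why practitioners place the knee in that range.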

  10. Evidence evaluation: measure Z corresponds to human utility judgments better than measure L and optimal-experimental-design models.

    Science.gov (United States)

    Rusconi, Patrice; Marelli, Marco; D'Addario, Marco; Russo, Selena; Cherubini, Paolo

    2014-05-01

    Evidence evaluation is a crucial process in many human activities, spanning from medical diagnosis to impression formation. The present experiments investigated which, if any, normative model best conforms to people's intuition about the value of the obtained evidence. Psychologists, epistemologists, and philosophers of science have proposed several models to account for people's intuition about the utility of the obtained evidence with respect either to a focal hypothesis or to a constellation of hypotheses. We pitted against each other the so-called optimal-experimental-design models (i.e., Bayesian diagnosticity, log₁₀ diagnosticity, information gain, Kullback-Leibler distance, probability gain, and impact) and measures L and Z to compare their ability to describe humans' intuition about the value of the obtained evidence. Participants received words-and-numbers scenarios concerning 2 hypotheses and binary features. They were asked to evaluate the utility of "yes" and "no" answers to questions about some features possessed in different proportions (i.e., the likelihoods) by 2 types of extraterrestrial creatures (corresponding to 2 mutually exclusive and exhaustive hypotheses). Participants evaluated either how helpful an answer was or how much an answer decreased/increased their beliefs with respect either to a single hypothesis or to both hypotheses. We fitted mixed-effects models and used the Akaike information criterion and the Bayesian information criterion values to compare the competing models of the value of the obtained evidence. Overall, the experiments showed that measure Z was the best-fitting model of participants' judgments of the value of obtained answers. We discussed the implications for the human hypothesis-evaluation process.
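
    One of the optimal-experimental-design measures listed above, information gain for a single obtained answer, can be sketched for the binary-hypothesis setup; the likelihood values are invented for illustration, and this is not measure Z itself:

```python
import math

def posterior(prior, like_h, like_not_h):
    """P(H | answer) for two mutually exclusive, exhaustive hypotheses."""
    num = prior * like_h
    return num / (num + (1.0 - prior) * like_not_h)

def entropy(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def information_gain(prior, like_h, like_not_h):
    """Uncertainty reduction produced by one obtained answer."""
    return entropy(prior) - entropy(posterior(prior, like_h, like_not_h))

# a "yes" answer to a feature present in 80% of one creature type, 20% of the other
print(posterior(0.5, 0.8, 0.2))         # ~0.8
print(information_gain(0.5, 0.8, 0.2))  # ~0.28 bits
```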

  11. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  12. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  13. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume I of III: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This report develops and demonstrates the methodology for the National Utility Regulatory (NUREG) Model developed under contract number DEAC-01-79EI-10579. It is accompanied by two supporting volumes. Volume II is a user's guide for operation of the NUREG software. This includes description of the flow of software and data, as well as the formats of all user data files. Finally, Volume III is a software description guide. It briefly describes, and gives a listing of, each program used in NUREG.

  14. Automatically multi-paradigm requirements modeling and analyzing: An ontology-based approach

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    There are several purposes for modeling and analyzing the problem domain before starting the software requirements analysis. First, it focuses on the problem domain, so that the domain users can be involved easily. Secondly, a comprehensive description of the problem domain helps in obtaining a comprehensive software requirements model. This paper proposes an ontology-based approach for modeling the problem domain. It interacts with the domain users using terminology that they can understand and guides them to provide the relevant information. A multiple-paradigm analysis approach, based on the description of the problem domain, has also been presented. Three criteria, i.e. the rationality of the organization structure, the achievability of the organization goals, and the feasibility of the organization process, have been proposed. The results of the analysis can be used as feedback for guiding the domain users to provide further information on the problem domain. The resulting models of the problem domain can serve as documentation for the pre-requirements analysis phase; they will also be the basis for further software requirements modeling.

  15. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  16. A restraint molecular dynamics and simulated annealing approach for protein homology modeling utilizing mean angles

    Directory of Open Access Journals (Sweden)

    Maurer Till

    2005-04-01

    Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, which are calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator activated receptor γ (PPARγ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of PPARγ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.
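
    Averaging dihedral angles must be done on the circle, since naive arithmetic averaging fails near the ±180° wrap-around; a minimal unweighted sketch (not PERMOL's weighted algorithm) is:

```python
import math

def circular_mean_deg(angles):
    """Mean of angles in degrees, computed via the resultant vector so that
    values near +180 and -180 average correctly (unweighted sketch)."""
    s = sum(math.sin(math.radians(a)) for a in angles)
    c = sum(math.cos(math.radians(a)) for a in angles)
    return math.degrees(math.atan2(s, c))

print(circular_mean_deg([170.0, -170.0]))  # ±180, not the naive arithmetic mean 0
```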

  17. Utilizing high throughput screening data for predictive toxicology models: protocols and application to MLSCN assays

    Science.gov (United States)

    Guha, Rajarshi; Schürer, Stephan C.

    2008-06-01

    Computational toxicology is emerging as an encouraging alternative to experimental testing. The Molecular Libraries Screening Center Network (MLSCN) as part of the NIH Molecular Libraries Roadmap has recently started generating large and diverse screening datasets, which are publicly available in PubChem. In this report, we investigate various aspects of developing computational models to predict cell toxicity based on cell proliferation screening data generated in the MLSCN. By capturing feature-based information in those datasets, such predictive models would be useful in evaluating cell-based screening results in general (for example from reporter assays) and could be used as an aid to identify and eliminate potentially undesired compounds. Specifically we present the results of random forest ensemble models developed using different cell proliferation datasets and highlight protocols to take into account their extremely imbalanced nature. Depending on the nature of the datasets and the descriptors employed we were able to achieve percentage correct classification rates between 70% and 85% on the prediction set, though the accuracy rate dropped significantly when the models were applied to in vivo data. In this context we also compare the MLSCN cell proliferation results with animal acute toxicity data to investigate to what extent animal toxicity can be correlated and potentially predicted by proliferation results. Finally, we present a visualization technique that allows one to compare a new dataset to the training set of the models to decide whether the new dataset may be reliably predicted.

  18. Lunar-Forming Giant Impact Model Utilizing Modern Graphics Processing Units

    Indian Academy of Sciences (India)

    J. C. Eiland; T. C. Salzillo; B. H. Hokr; J. L. Highland; W. D. Mayfield; B. M. Wyatt

    2014-12-01

    Recent giant impact models focus on producing a circumplanetary disk of the proper composition around the Earth and defer to earlier works for the accretion of this disk into the Moon. The discontinuity between creating the circumplanetary disk and accretion of the Moon is unnatural and lacks simplicity. In addition, current giant impact theories are being questioned due to their inability to find conditions that will produce a system with both the proper angular momentum and a resultant Moon that is isotopically similar to the Earth. Here we return to first principles and produce a continuous model that can be used to rapidly search the vast impact parameter space to identify plausible initial conditions. This is accomplished by focusing on the three major components of planetary collisions: constant gravitational attraction, short range repulsion and energy transfer. The structure of this model makes it easily parallelizable and well-suited to harness the power of modern Graphics Processing Units (GPUs). The model makes clear the physically relevant processes, and allows a physical picture to naturally develop. We conclude by demonstrating how the model readily produces stable Earth–Moon systems from a single, continuous simulation. The resultant systems possess many desired characteristics such as an iron-deficient, heterogeneously-mixed Moon and accurate axial tilt of the Earth.

  19. An alternative approach for choice models in transportation: Use of possibility theory for comparison of utilities

    Directory of Open Access Journals (Sweden)

    Dell’orco Mauro

    2004-01-01

    Full Text Available Modeling of human choice mechanism has been a topic of intense discussion in the transportation community for many years. The framework of modeling has been rooted in probability theory, in which the analyst’s uncertainty about the integrity of the model is expressed in probability. In most choice situations, the decision-maker (traveler) also experiences uncertainty because of the lack of complete information on the choices. In the traditional modeling framework, the uncertainty of the analyst and that of the decision-maker are both embedded in the same random term and not clearly separated. While the analyst's uncertainty may be represented by probability due to the statistical nature of events, that of the decision-maker is not always subject to randomness; rather, it is perceptive uncertainty. This paper proposes a modeling framework that attempts to account for the decision-maker’s uncertainty by possibility theory and the analyst's uncertainty by probability theory. The possibility-to-probability transformation is performed using the principle of uncertainty invariance. The proposed approach accounts for the quality of information on the changes in choice probability. The paper discusses the thought process, the mathematics of possibility theory and probability transformation, and examples.
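
    As a deliberately simplified illustration, one elementary possibility-to-probability transformation is plain normalization of the possibility degrees; the paper's uncertainty-invariance transformation is more involved, and the route names and values below are invented:

```python
def possibility_to_probability(poss):
    """Normalize possibility degrees into a probability distribution.

    Illustrative only: the principle of uncertainty invariance prescribes a
    scale-preserving transform; simple normalization is shown here.
    """
    total = sum(poss.values())
    return {k: v / total for k, v in poss.items()}

# hypothetical traveler's possibility degrees over three routes
routes = {"route_A": 1.0, "route_B": 0.6, "route_C": 0.4}
print(possibility_to_probability(routes))  # probabilities summing to 1
```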

  20. Information support model and its impact on utility, satisfaction and loyalty of users

    Directory of Open Access Journals (Sweden)

    Sead Šadić

    2016-11-01

    Full Text Available In today’s modern age, information systems are of vital importance for successful performance of any organization. The most important role of any information system is its information support. This paper develops an information support model and presents the results of the survey examining the effects of such model. The survey was performed among the employees of Brčko District Government and comprised three phases. The first phase assesses the influence of the quality of information support and information on information support when making decisions. The second phase examines the impact of information support when making decisions on the perceived availability and user satisfaction with information support. The third phase examines the effects of perceived usefulness as well as information support satisfaction on user loyalty. The model is presented using six hypotheses, which were tested by means of a multivariate regression analysis. The demonstrated model shows that the quality of information support and information is of vital importance in the decision-making process. The perceived usefulness and customer satisfaction are of vital importance for continuous usage of information support. The model is universal, and if slightly modified, it can be used in any sphere of life where satisfaction is measured for clients and users of some service.

  1. River Loire levees hazard studies – CARDigues’ model principles and utilization examples on Blois levees

    Directory of Open Access Journals (Sweden)

    Durand Eduard

    2016-01-01

    Full Text Available Along the river Loire, in order to have a homogeneous method for specific risk assessment studies, a new model named CARDigues (for Levee Breach Hazard Calculation) was developed in a partnership with DREAL Centre-Val de Loire (owner of the levees), Cerema and Irstea. This model makes it possible to estimate the probability of failure of every levee section and to integrate and cross different “stability” parameters such as topography and included structures, geology and material geotechnical characteristics, hydraulic loads… as well as observations from visual inspections or instrumentation results considered as disorders (seepage, burrowing animals, vegetation, pipes, etc.). This model and the integrated tool CARDigues enable checking, for each levee section, the probability of appearance and rupture of five breaching scenarios initiated by: overflowing, internal erosion, slope instability, external erosion and uplift. It has recently been updated and has been applied on several levee systems by different contractors. The article presents the CARDigues model principles and its recent developments (version V28.00) with examples on the river Loire, and how it is currently used for a relevant and global levee system diagnosis and assessment. Levee reinforcement or improvement management is also a prospective application of the CARDigues model.

  2. Utilizing Conceptual Indexing to Enhance the Effectiveness of Vector Space Model

    Directory of Open Access Journals (Sweden)

    Aya M. Al-Zoghby

    2013-10-01

    Full Text Available One of the main purposes of the semantic Web is to improve the retrieval performance of search systems. Unlike keyword-based search systems, semantic search systems aim to discover pages related to the query's concepts rather than merely collecting all pages instantiating its keywords. To that end, the concepts must be defined to be used as a semantic index instead of the traditional lexical one. In fact, the Arabic language is still far from being semantically searchable. Therefore, this paper proposes a model that exploits the Universal WordNet ontology for producing an Arabic Concepts-Space to be used as the index of a Semantic Vector Space Model. The Vector Space Model is one of the most common information retrieval models due to its capability of expressing the documents' structure. However, like all keyword-based search systems, its sensitivity to the query's keywords reduces its retrieval effectiveness. The proposed model allows the VSM to represent Arabic documents by their topic, and thus classify them semantically. This, consequently, enhances the retrieval effectiveness of the search system.
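
    The keyword sensitivity being addressed can be seen in the basic term-frequency VSM; a minimal cosine-similarity sketch (toy English tokens, not the paper's concept index) is:

```python
import math
from collections import Counter

def cosine(tokens_a, tokens_b):
    """Cosine similarity of two documents under a term-frequency vector space model."""
    va, vb = Counter(tokens_a), Counter(tokens_b)
    dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

print(cosine(["semantic", "web", "search"], ["semantic", "search", "engine"]))  # ~0.667
```

    Replacing raw tokens with concept identifiers, as the paper proposes, lets documents match on shared meaning even when their surface keywords differ.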

  3. Path Loss Prediction Over the Lunar Surface Utilizing a Modified Longley-Rice Irregular Terrain Model

    Science.gov (United States)

    Foore, Larry; Ida, Nathan

    2007-01-01

    This study introduces the use of a modified Longley-Rice irregular terrain model and digital elevation data representative of an analogue lunar site for the prediction of RF path loss over the lunar surface. The results are validated by theoretical models and past Apollo studies. The model is used to approximate the path loss deviation from theoretical attenuation over a reflecting sphere. Analysis of the simulation results provides statistics on the fade depths for frequencies of interest, and correspondingly a method for determining the maximum range of communications for various coverage confidence intervals. Communication system engineers and mission planners are provided a link margin and path loss policy for communication frequencies of interest.
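
    For orientation, a common theoretical baseline that terrain models deviate from is free-space path loss; the sketch below computes it for assumed example values and is not the modified Longley-Rice model itself:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

# assumed example: a 1 km surface link at 2.4 GHz
print(round(fspl_db(1_000.0, 2.4e9), 1))  # ~100.1 dB
```

    Terrain-model predictions add fade depth on top of such a baseline, which is what drives the link-margin policy discussed in the study.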

  4. An Examination of College and University Athletic Directors' Perception of Management Models Utilized to Operate Intercollegiate Athletic Arenas

    Science.gov (United States)

    Palmero, Mauro R.

    2010-01-01

    Demands for enhanced accountability and effectiveness in higher education have also affected athletic departments, requiring a more cost-efficient managerial approach to the administration of athletic facilities, especially arenas. The purpose of this study was to examine athletic directors' perceptions towards the arena management models they…

  5. Analytic model utilizing the complex ABCD method for range dependency of a monostatic coherent lidar

    DEFF Research Database (Denmark)

    Olesen, Anders Sig; Pedersen, Anders Tegtmeier; Hanson, Steen Grüner;

    2014-01-01

    In this work, we present an analytic model for analyzing the range and frequency dependency of a monostatic coherent lidar measuring velocities of a diffuse target. The model of the signal power spectrum includes both the contribution from the optical system as well as the contribution from...... the time dependencies of the optical field. A specific coherent Doppler wind lidar system measuring wind velocity in the atmosphere is considered, in which a Gaussian field is transmitted through a simple telescope consisting of a lens and an aperture. The effects of the aperture size, the beam waist...

  6. Using a DSGE Model to Assess the Macroeconomic Effects of Reserve Requirements in Brazil

    OpenAIRE

    Waldyr Dutra Areosa; Christiano Arrigoni Coelho

    2013-01-01

    The goal of this paper is to show how a Dynamic Stochastic General Equilibrium (DSGE) model can be used by policy makers in the qualitative and quantitative evaluation of the macroeconomic impacts of two monetary policy instruments: (i) the short-term interest rate and (ii) the reserve requirements ratio. In our model, this last instrument affects the leverage of banks, which must deal with agency problems in order to raise funds from depositors. We estimated a modified version of Gertler and Karadi (2011)...

  7. A Formal Method to Model Early Requirement of Multi-Agent System

    Institute of Scientific and Technical Information of China (English)

    MAO Xin-jun; YU Eric

    2004-01-01

    A formal specification language iFL based on i* framework is presented in this paper to formally specify and analyze the early requirement of multi-agent system. It is a branching temporal logic which defines the concepts and models in i* framework in a rigorous way. The method to transform the i* models to iFL formal specification is also put forward.

  8. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  10. A discrete event simulation to model the cost-utility of fingolimod and natalizumab in rapidly evolving severe relapsing-remitting multiple sclerosis in the UK.

    Science.gov (United States)

    Montgomery, Stephen M; Maruszczak, Maciej J; Slater, David; Kusel, Jeanette; Nicholas, Richard; Adlard, Nicholas

    2017-05-01

    Two disease-modifying therapies are licensed in the EU for use in rapidly-evolving severe (RES) relapsing-remitting multiple sclerosis (RRMS), fingolimod and natalizumab. Here a discrete event simulation (DES) model to analyze the cost-effectiveness of natalizumab and fingolimod in the RES population, from the perspective of the National Health Service (NHS) in the UK, is reported. A DES model was developed to track individual RES patients, based on Expanded Disability Status Scale scores. Individual patient characteristics were taken from the RES sub-groups of the pivotal trials for fingolimod. Utility data were in line with previous models. Published costs were inflated to NHS cost year 2015. Owing to the confidential patient access scheme (PAS) discount applied to fingolimod in the UK, a range of discount levels were applied to the fingolimod list price, to capture the likelihood of natalizumab being cost-effective in a real-world setting. At the lower National Institute of Health and Care Excellence (NICE) threshold of £20,000/quality-adjusted life year (QALY), fingolimod only required a discount greater than 0.8% of list price to be cost-effective. At the upper threshold of £30,000/QALY employed by the NICE, fingolimod was cost-effective if the confidential discount is greater than 2.5%. Sensitivity analyses conducted using fingolimod list-price showed the model to be most sensitive to changes in the cost of each drug, particularly fingolimod. The DES model shows that only a modest discount to the UK fingolimod list-price is required to make fingolimod a more cost-effective option than natalizumab in RES RRMS.
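    The cost-effectiveness test applied at the NICE thresholds reduces to an incremental cost-effectiveness ratio (ICER) comparison, which can be sketched as follows. The cost and QALY figures below are illustrative assumptions, not the trial-derived values used in the DES model.

```python
def icer(delta_cost, delta_qaly):
    # Incremental cost per QALY gained of one option over another.
    return delta_cost / delta_qaly

def cost_effective(delta_cost, delta_qaly, wtp_threshold):
    # Cost-effective when the ICER does not exceed the willingness-to-pay
    # threshold (e.g. GBP 20,000 or 30,000 per QALY for NICE).
    return icer(delta_cost, delta_qaly) <= wtp_threshold

# Illustrative: an extra 30,000 GBP of cost buys 1.2 extra QALYs.
print(cost_effective(30_000, 1.2, 20_000))  # → False (ICER = 25,000/QALY)
print(cost_effective(30_000, 1.2, 30_000))  # → True
```

    Varying the confidential PAS discount shifts delta_cost, which is why the abstract reports the discount level at which the comparison flips.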

  11. Prevention of radiation-induced salivary gland dysfunction utilizing a CDK inhibitor in a mouse model.

    Directory of Open Access Journals (Sweden)

    Katie L Martin

    Full Text Available BACKGROUND: Treatment of head and neck cancer with radiation often results in damage to surrounding normal tissues such as salivary glands. Permanent loss of function in the salivary glands often leads patients to discontinue treatment due to incapacitating side effects. It has previously been shown that IGF-1 suppresses radiation-induced apoptosis and enhances G2/M arrest leading to preservation of salivary gland function. In an effort to recapitulate the effects of IGF-1, as well as increase the likelihood of translating these findings to the clinic, the small-molecule therapeutic Roscovitine is being tested. Roscovitine is a cyclin-dependent kinase inhibitor that acts to transiently inhibit cell cycle progression and allow for DNA repair in damaged tissues. METHODOLOGY/PRINCIPAL FINDINGS: Treatment with Roscovitine prior to irradiation induced a significant increase in the percentage of cells in the G2/M phase, as demonstrated by flow cytometry. In contrast, mice treated with radiation exhibit no differences in the percentage of cells in G2/M when compared to unirradiated controls. Similar to previous studies utilizing IGF-1, pretreatment with Roscovitine leads to a significant up-regulation of p21 expression and a significant decrease in the number of PCNA positive cells. Radiation treatment leads to a significant increase in activated caspase-3 positive salivary acinar cells, which is suppressed by pretreatment with Roscovitine. Administration of Roscovitine prior to targeted head and neck irradiation preserves normal tissue function in mouse parotid salivary glands, both acutely and chronically, as measured by salivary output. CONCLUSIONS/SIGNIFICANCE: These studies suggest that induction of transient G2/M cell cycle arrest by Roscovitine allows for suppression of apoptosis, thus preserving normal salivary function following targeted head and neck irradiation. This could have an important clinical impact by preventing the negative side

  12. Prevention of Radiation-Induced Salivary Gland Dysfunction Utilizing a CDK Inhibitor in a Mouse Model

    Science.gov (United States)

    Martin, Katie L.; Hill, Grace A.; Klein, Rob R.; Arnett, Deborah G.; Burd, Randy; Limesand, Kirsten H.

    2012-01-01

    Background Treatment of head and neck cancer with radiation often results in damage to surrounding normal tissues such as salivary glands. Permanent loss of function in the salivary glands often leads patients to discontinue treatment due to incapacitating side effects. It has previously been shown that IGF-1 suppresses radiation-induced apoptosis and enhances G2/M arrest leading to preservation of salivary gland function. In an effort to recapitulate the effects of IGF-1, as well as increase the likelihood of translating these findings to the clinic, the small molecule therapeutic Roscovitine, is being tested. Roscovitine is a cyclin-dependent kinase inhibitor that acts to transiently inhibit cell cycle progression and allow for DNA repair in damaged tissues. Methodology/Principal Findings Treatment with Roscovitine prior to irradiation induced a significant increase in the percentage of cells in the G2/M phase, as demonstrated by flow cytometry. In contrast, mice treated with radiation exhibit no differences in the percentage of cells in G2/M when compared to unirradiated controls. Similar to previous studies utilizing IGF-1, pretreatment with Roscovitine leads to a significant up-regulation of p21 expression and a significant decrease in the number of PCNA positive cells. Radiation treatment leads to a significant increase in activated caspase-3 positive salivary acinar cells, which is suppressed by pretreatment with Roscovitine. Administration of Roscovitine prior to targeted head and neck irradiation preserves normal tissue function in mouse parotid salivary glands, both acutely and chronically, as measured by salivary output. Conclusions/Significance These studies suggest that induction of transient G2/M cell cycle arrest by Roscovitine allows for suppression of apoptosis, thus preserving normal salivary function following targeted head and neck irradiation. This could have an important clinical impact by preventing the negative side effects of radiation

  13. Performance Moderated Functions Server’s (PMFserv) Military Utility: A Model and Discussion

    Science.gov (United States)

    2009-05-01

    like to thank Troy Kelley, Laurel Allender, Liz Bowman, Luci Salvi, and Don Headley for their thoughtful comments on earlier versions of this report...Occupational Specialty (MOS). The selected MOS that was modeled was that of a Robotics NCO. Although the Robotics NCO (Jensen, Tasoluk, Sanders, Marshall

  14. The Fixed-Effects Zero-Inflated Poisson Model with an Application to Health Care Utilization

    NARCIS (Netherlands)

    Majo, M.C.; van Soest, A.H.O.

    2011-01-01

    Response variables that are scored as counts and that present a large number of zeros often arise in quantitative health care analysis. We define a zero-inflated Poisson model with fixed effects in both of its equations to identify respondent and health-related characteristics associated with
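    The two-part structure of the model in this record can be sketched directly: a mixing probability for structural zeros, and an ordinary Poisson count process otherwise. In the paper both parameters carry fixed effects and covariates; the plain scalar values below are illustrative assumptions.

```python
import math

def zip_pmf(k, lam, pi):
    # Zero-inflated Poisson: with probability pi the count is a structural
    # zero; otherwise it follows Poisson(lam).
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson

# Zero inflation: P(0) well exceeds the plain Poisson zero probability.
print(round(zip_pmf(0, 2.0, 0.3), 4))  # → 0.3947  (Poisson alone: 0.1353)
```

    Fitting the model amounts to estimating pi and lam (each as a function of covariates) by maximizing the product of these probabilities over the observed counts.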

  16. Utilizing the Active and Collaborative Learning Model in the Introductory Physics Course

    Science.gov (United States)

    Nam, Nguyen Hoai

    2014-01-01

    The model of active and collaborative learning (ACLM), when applied to teaching a specific subject, offers clear advantages for the knowledge and skills students need to succeed in their future careers. The author exploits the learning management system (LMS) of Hanoi National University of Education (HNUE) to establish a learning environment in the…

  17. The utility of behavioral economics in expanding the free-feed model of obesity.

    Science.gov (United States)

    Rasmussen, Erin B; Robertson, Stephen H; Rodriguez, Luis R

    2016-06-01

    Animal models of obesity are numerous and diverse in terms of identifying specific neural and peripheral mechanisms related to obesity; however, they are limited when it comes to behavior. The standard behavioral measure of food intake in most animal models occurs in a free-feeding environment. While easy and cost-effective for the researcher, the free-feeding environment omits some of the most important features of obesity-related food consumption: namely, properties of food availability, such as effort and delay to obtaining food. Behavioral economics expands behavioral measures of obesity animal models by identifying such behavioral mechanisms. First, economic demand analysis allows researchers to understand the role of effort in food procurement, and how physiological and neural mechanisms are related. Second, studies on delay discounting contribute to a growing literature showing that sensitivity to delayed food- and food-related outcomes is likely a fundamental process of obesity. Together, these data expand the animal model in a manner that better characterizes how environmental factors influence food consumption.
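    The delay-discounting process mentioned in this abstract is conventionally modeled with Mazur's hyperbolic form, sketched below. The amounts, delays, and discount rates are illustrative assumptions, not values from this review.

```python
def hyperbolic_value(amount, delay, k):
    # Mazur's hyperbolic discounting: subjective value of a reward of the
    # given amount after the given delay; larger k means steeper discounting.
    return amount / (1 + k * delay)

# A steep discounter devalues the same delayed food reward far more,
# the kind of sensitivity to delay the abstract links to obesity.
print(round(hyperbolic_value(100, 10, 0.1), 1))  # → 50.0
print(round(hyperbolic_value(100, 10, 1.0), 1))  # → 9.1
```

    In discounting experiments, k is estimated per subject from choices between smaller-sooner and larger-later rewards and then compared across groups.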

  18. Assessing the Utility of the Willingness/Prototype Model in Predicting Help-Seeking Decisions

    Science.gov (United States)

    Hammer, Joseph H.; Vogel, David L.

    2013-01-01

    Prior research on professional psychological help-seeking behavior has operated on the assumption that the decision to seek help is based on intentional and reasoned processes. However, research on the dual-process prototype/willingness model (PWM; Gerrard, Gibbons, Houlihan, Stock, & Pomery, 2008) suggests health-related decisions may also…

  19. An Optimization-Based System Model of Disturbance-Generated Forest Biomass Utilization

    Science.gov (United States)

    Curry, Guy L.; Coulson, Robert N.; Gan, Jianbang; Tchakerian, Maria D.; Smith, C. Tattersall

    2008-01-01

    Disturbance-generated biomass results from endogenous and exogenous natural and cultural disturbances that affect the health and productivity of forest ecosystems. These disturbances can create large quantities of plant biomass on predictable cycles. A systems analysis model has been developed to quantify aspects of system capacities (harvest,…

  20. Utilizing uncoded consultation notes from electronic medical records for predictive modeling of colorectal cancer

    NARCIS (Netherlands)

    Hoogendoorn, Mark; Szolovits, Peter; Moons, Leon M G; Numans, ME

    2016-01-01

    OBJECTIVE: Machine learning techniques can be used to extract predictive models for diseases from electronic medical records (EMRs). However, the nature of EMRs makes it difficult to apply off-the-shelf machine learning techniques while still exploiting the rich content of the EMRs. In this paper, w