WorldWideScience

Sample records for model requires utilization

  1. Utility requirements for fusion

    International Nuclear Information System (INIS)

    Vondrasek, R.J.

    1982-02-01

    This report describes work done and results obtained during performance of Task 1 of a study of Utility Requirements and Criteria for Fusion Options. The work consisted of developing a list of utility requirements for fusion options, containing definitions of the requirements and showing their relative importance to the utility industry. The project team members developed a preliminary list, which was refined by discussions and literature searches. The refined list was recast as a questionnaire which was sent to a substantial portion of the utility industry in this country. Forty-three questionnaire recipients responded, including thirty-two utilities. A workshop was held to develop a revised requirements list using the survey responses as a major input. The list prepared by the workshop was further refined by a panel consisting of vice presidents of the three project team firms. The results of the study indicate that in addition to considering the cost of energy for a power plant, utilities consider twenty-three other requirements. Four of the requirements were judged to be vital to plant acceptability: Plant Capital Cost, Financial Liability, Plant Safety, and Licensability.

  2. Utility requirements for HTGRs

    International Nuclear Information System (INIS)

    Nicholls, D.R.

    1997-01-01

    Eskom, the state utility of South Africa, is currently evaluating the technical and economic feasibility of the helium cooled Pebble Bed Modular Reactor with a closed cycle gas turbine power conversion system for future power generating additions to its electric system. This paper provides an overview of the Eskom system including the needs of the utility for future generation capacity and the key performance requirements necessary for incorporation of this gas cooled reactor plant. (author)

  3. Utilization of a mental health collaborative care model among patients who require interpreter services.

    Science.gov (United States)

    Njeru, Jane W; DeJesus, Ramona S; St Sauver, Jennifer; Rutten, Lila J; Jacobson, Debra J; Wilson, Patrick; Wieland, Mark L

    2016-01-01

    Immigrants and refugees to the United States have a higher prevalence of depression compared to the general population and are less likely to receive adequate mental health services and treatment. Those with limited English proficiency (LEP) are at an even higher risk of inadequate mental health care. Collaborative care management (CCM) models for depression are effective in achieving treatment goals among a wide range of patient populations, including patients with LEP. The purpose of this study was to assess the utilization of a statewide initiative that uses CCM for depression management among patients with LEP in a large primary care practice. This was a retrospective cohort study of patients with depression in a large primary care practice in Minnesota. Patients who met criteria for enrollment into CCM were included [a provider-generated diagnosis of depression or dysthymia in the electronic medical record, and a Patient Health Questionnaire-9 (PHQ-9) score ≥10]. Patient-identified need for interpreter services was used as a proxy for LEP. Rates of enrollment into the DIAMOND (Depression Improvement Across Minnesota, Offering A New Direction) program, a statewide initiative that uses CCM for depression management, were measured. These rates were compared between eligible patients who required interpreter services and those who did not. Of the 7561 patients who met criteria for enrollment into the DIAMOND program during the study interval, 3511 were enrolled. Only 18.2% of the eligible patients with LEP were enrolled into DIAMOND, compared with 47.2% of the eligible English-proficient patients. This finding persisted after adjustment for differences in age, gender and depression severity scores (adjusted OR [95% confidence interval] = 0.43 [0.23, 0.81]). Within primary care practices, tailored interventions are needed, including those that address cultural competence and language navigation, to improve the utilization of this effective model among
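    The reported enrollment rates can be related to the adjusted odds ratio with a quick crude (unadjusted) calculation; a minimal sketch using only the figures quoted in the abstract:

```python
# Crude (unadjusted) odds ratio of DIAMOND enrollment for patients with LEP
# versus English-proficient patients, from the rates reported in the abstract.
lep_rate = 0.182   # 18.2% of eligible LEP patients enrolled
ep_rate = 0.472    # 47.2% of eligible English-proficient patients enrolled

odds_lep = lep_rate / (1 - lep_rate)
odds_ep = ep_rate / (1 - ep_rate)
crude_or = odds_lep / odds_ep
print(f"crude OR = {crude_or:.2f}")  # → crude OR = 0.25
```

    The crude ratio is even lower than the adjusted OR of 0.43, consistent with the adjustment for age, gender and depression severity moving the estimate toward the null.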

  4. Utilizing inheritance in requirements engineering

    Science.gov (United States)

    Kaindl, Hermann

    1994-01-01

    The scope of this paper is the utilization of inheritance for requirements specification, i.e., the tasks of analyzing and modeling the domain, as well as forming and defining requirements. Our approach and the tool supporting it are named RETH (Requirements Engineering Through Hypertext). Actually, RETH uses a combination of various technologies, including object-oriented approaches and artificial intelligence (in particular frames). We do not attempt to exclude or replace formal representations, but try to complement and provide means for gradually developing them. Among others, RETH has been applied in the CERN (Conseil Europeen pour la Rechereche Nucleaire) Cortex project. While it would be impossible to explain this project in detail here, it should be sufficient to know that it deals with a generic distributed control system. Since this project is not finished yet, it is difficult to state its size precisely. In order to give an idea, its final goal is to substitute the many existing similar control systems at CERN by this generic approach. Currently, RETH is also tested using real-world requirements for the Pastel Mission Planning System at ESOC in Darmstadt. First, we outline how hypertext is integrated into a frame system in our approach. Moreover, the usefulness of inheritance is demonstrated as performed by the tool RETH. We then summarize our experiences of utilizing inheritance in the Cortex project. Lastly, RETH will be related to existing work.

  5. The European Utility Requirement Document

    International Nuclear Information System (INIS)

    Roche, I.I.

    1999-01-01

    The major European electricity producers have worked on a common requirement document for future LWR plants since 1992. They aim at requirements acceptable to the owners, the public and the authorities alike, so that designers can develop standard LWR designs acceptable everywhere in Europe and utilities can open their consultations to vendors on common bases. Such standardisation promotes an improvement of generation costs and of safety; public and regulatory acceptance should be improved as well, and significant savings are expected in development and construction costs. Since the early stages of the project, the EUR group has grown significantly. It now includes utilities from nine European countries, utilities from two further European countries are joining the group, and specific cooperation agreements are also in progress with a few extra-European partners

  6. Virtual Observatories: Requirements for Utility

    Science.gov (United States)

    Paxton, L. J.

    2008-12-01

    The principal act that separates science from engineering is that of discovery. Virtual Observatories (VOs) are a development with great potential for advancing our ability to do science by enabling us to do research effectively and across disciplines. Access to data is one of the factors that enables discovery. A well-designed VO should enable discovery as well as provide a uniform means by which data are accessed; thus, enabling discovery is the key challenge of a VO, and in fact it is, and should be, the principle that distinguishes a VO from a traditional archive. As the number of satellites in the Heliophysics Great Observatory starts to decline due to the slower launch cadence and the reduction in funding for extended missions, it becomes more imperative that the community have the means to fully utilize and access the available resources. With the proliferation of low-cost computing and community-based models, cross-disciplinary studies become the new frontier. Many, if not the great majority, of research papers are at this time confined to a particular discipline. Some of this "stove piping" may be due to the difficulty in accessing products from outside one's own discipline; one would hope and expect that VOs would address this. Two of the principal challenges associated with the vitality of the VOs, aside from the provision of the funds required to maintain them, are 1) the limitation on the availability of data from non-NASA sources and 2) the need for some level of continued support for expertise on the data accessed through the VOs. The first issue is one of culture: some organizations support the view that the data belong to the PI, whereas in Heliophysics "data rights" are curtailed. The second issue is to be addressed by the concept of the Resident Archive. This talk will provide an overview of the issues and challenges associated with VOs, Resident Archives, data rights, space missions, and instruments and their associated ground data

  7. The EURS (European utilities requirements)

    International Nuclear Information System (INIS)

    Berbey, P.

    2000-01-01

    The major European electricity producers have worked on a common requirement document for future LWR plants since 1992, to obtain specifications acceptable to the owners, the public and the authorities alike. Thus the designers can develop standard LWR designs that could be acceptable everywhere in Europe, and the utilities can open their consultations to vendors on common bases. Public and regulatory acceptance should be improved as well, and significant savings are expected in development and construction costs. Since the release of the last versions of the EUR texts in 1996, a lot of work has been carried out: reviews by the regulators and other external organisations, comparisons, assessments of the compliance of designs with the EUR, and clarification work on the controversial topics that deserved changes or clarification. At the beginning of 1999 enough material was available to start a complete revision of the EUR document

  8. Light duty utility arm software requirements specification

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1995-01-01

    This document defines the software requirements for the integrated control and data acquisition system of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product

  9. The utility target market model

    International Nuclear Information System (INIS)

    Leng, G.J.; Martin, J.

    1994-01-01

    A new model (the Utility Target Market Model) is used to evaluate the economic benefits of photovoltaic (PV) power systems located at the electrical utility customer site. These distributed PV demand-side generation systems can be evaluated in a similar manner to other demand-side management technologies. The energy and capacity values of an actual PV system located in the service area of the New England Electrical System (NEES) are the two utility benefits evaluated. The annual stream of energy and capacity benefits calculated for the utility are converted to the installed cost per watt that the utility should be willing to invest to receive this benefit stream. Different discount rates are used to show the sensitivity of the allowable installed cost of the PV systems to a utility's average cost of capital. Capturing both the energy and capacity benefits of these relatively environmentally friendly distributed generators, NEES should be willing to invest in this technology when the installed cost per watt declines to ca $2.40 using NEES' rated cost of capital (8.78%). If a social discount rate of 3% is used, installation should be considered when installed cost approaches $4.70/W. Since recent installations in the Sacramento Municipal Utility District have cost between $7-8/W, cost-effective utility applications of PV are close. 22 refs., 1 fig., 2 tabs
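    Converting an annual benefit stream into an allowable installed cost per watt is a standard present-value calculation; a minimal sketch, where the $0.25/W-year benefit and 30-year life are illustrative assumptions rather than figures from the study:

```python
# Allowable installed cost per watt for a distributed PV system, computed as
# the present value of an assumed level annual benefit stream (energy plus
# capacity value). The benefit figure and lifetime are hypothetical.
def allowable_cost_per_watt(annual_benefit_per_watt, discount_rate, years):
    """Present value of a level annuity: B * (1 - (1+r)^-n) / r."""
    r = discount_rate
    return annual_benefit_per_watt * (1 - (1 + r) ** -years) / r

# Hypothetical $0.25/W-year benefit over a 30-year life:
print(allowable_cost_per_watt(0.25, 0.0878, 30))  # utility cost of capital, ≈ 2.62
print(allowable_cost_per_watt(0.25, 0.03, 30))    # social discount rate, ≈ 4.90
```

    The spread between the two results illustrates the abstract's point: the same benefit stream justifies a much higher installed cost under a 3% social discount rate than under the utility's 8.78% cost of capital.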

  10. Utility survey of requirements for a HTS fault current limiter

    DEFF Research Database (Denmark)

    Nielsen, Jan Nygaard; Jørgensen, P.; Østergaard, Jacob

    2000-01-01

    The application of superconducting fault current limiters (SFCL) in the electric utility sector will clearly depend on the extent to which the needs and requirements of electric utilities can be met by the ongoing development of SFCL technology. This paper considers a questionnaire survey of the needs and expectations that the Danish electric utilities have for this new technology. A bus-tie application of an SFCL in a distribution substation with three parallel-coupled transformers is discussed

  11. Effect of long construction times on utility financial requirements

    International Nuclear Information System (INIS)

    Francis, J.M.

    1981-01-01

    It is well-known that long construction times significantly increase the cost of an individual nuclear plant. Long construction times, however, are not confined to either a single plant or a single utility. Rather, they apparently occur in almost all nuclear plants currently under construction. The total financial requirement to complete the 82 nuclear plants currently under construction was assessed. The analysis was performed assuming a construction time of ten years in one case, and six years in another. It was found that decreasing the construction time from ten to six years will reduce the financial requirements of the utility industry by $89 billion
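    The mechanism behind such savings is interest accruing on funds committed during construction; a hedged illustration with a hypothetical $2 billion overnight cost and 10% annual carrying charge (neither figure is from the study):

```python
# Illustrative effect of construction time on financing cost: with spending
# spread evenly over the schedule and interest accruing on the amount
# committed, a shorter schedule leaves less accumulated interest at
# completion. Overnight cost ($B) and interest rate are hypothetical.
def cost_at_completion(overnight_cost, annual_rate, years):
    """Compound each year's equal spending tranche to the completion date
    (tranches assumed spent at mid-year)."""
    tranche = overnight_cost / years
    return sum(tranche * (1 + annual_rate) ** (years - y - 0.5)
               for y in range(years))

base = cost_at_completion(2.0, 0.10, 6)    # $2B plant, 6-year schedule
long = cost_at_completion(2.0, 0.10, 10)   # same plant, 10-year schedule
print(long - base)  # extra financing requirement, ≈ $0.65B per plant
```

    Scaled over dozens of plants under construction, this compounding is why the study finds an $89 billion industry-wide difference between ten- and six-year schedules.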

  12. ALWR utility requirements - A technical basis for updated emergency planning

    International Nuclear Information System (INIS)

    Leaver, David E.W.; DeVine, John C. Jr.; Santucci, Joseph

    2004-01-01

    U.S. utilities, with substantial support from international utilities, are developing a comprehensive set of design requirements in the form of a Utility Requirements Document (URD) as part of an industry-wide effort to establish a technical foundation for the next generation of light water reactors. A key aspect of the URD is a set of severe accident-related design requirements which have been developed to provide a technical basis for updated emergency planning for the ALWR. The technical basis includes design criteria for containment performance and offsite dose during severe accident conditions. An ALWR emergency planning concept is being developed which reflects this severe accident capability. The main conclusion from this work is that the likelihood and consequences of a severe accident for an ALWR are fundamentally different from those assumed in the technical basis for existing emergency planning requirements, at least in the U.S. The current technical understanding of severe accident risk is greatly improved compared to that available when the existing U.S. emergency planning requirements were established nearly 15 years ago, and the emerging ALWR designs have superior core damage prevention and severe accident mitigation capability. Thus, it is reasonable and prudent to reflect this design capability in the emergency planning requirements for the ALWR. (author)

  13. Utilities for high performance dispersion model PHYSIC

    International Nuclear Information System (INIS)

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC were summarized. The model was developed in the study of developing high performance SPEEDI with the purpose of introducing meteorological forecast function into the environmental emergency response system. The procedure of PHYSIC calculation consists of three steps; preparation of relevant files, creation and submission of JCL, and graphic output of results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  14. Deriving minimal models for resource utilization

    NARCIS (Netherlands)

    te Brinke, Steven; Bockisch, Christoph; Bergmans, Lodewijk; Malakuti Khah Olun Abadi, Somayeh; Aksit, Mehmet; Katz, Shmuel

    2013-01-01

    We show how compact Resource Utilization Models (RUMs) can be extracted from concrete overly-detailed models of systems or sub-systems in order to model energy-aware software. Using the Counterexample-Guided Abstraction Refinement (CEGAR) approach, along with model-checking tools, abstract models

  15. Space Mission Utility and Requirements for a Heat Melt Compactor

    Science.gov (United States)

    Fisher, John W.; Lee, Jeffrey M.

    2016-01-01

    Management of waste on long-duration space missions is both a problem and an opportunity. Uncontained or unprocessed waste is a crew health hazard and a habitat storage problem. A Heat Melt Compactor (HMC) such as NASA has been developing is capable of processing space mission trash and converting it to useful products. The HMC is intended to process space mission trash to achieve a number of objectives including: volume reduction, biological safening and stabilization, water recovery, radiation shielding, and planetary protection. This paper explores the utility of the HMC to future space missions and how this translates into HMC system requirements.

  16. Requirements for Medical Modeling Languages

    Science.gov (United States)

    van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes

    2001-01-01

    Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383

  17. The European Utility Requirements (EUR). Status and near term activities

    International Nuclear Information System (INIS)

    Berbey, Pierre; Hedin, Francois

    2010-01-01

    In 1991, five major European utilities participating in the US ALWR program decided to develop together a common specification that would help keep the nuclear option open. The European Utility Requirements (EUR) are addressed to the designers and suppliers of LWR plants in order to allow the development of standard designs that can be built and licensed in several European countries with only minor variations. The EUR organization has kept enlarging; today 16 utilities are members. Seven compliance analyses, dedicated respectively to the BWR90, EPR, EPP, ABWR, SWR1000, AP1000 and AES92 projects, have already been published. The revised version of the EPR subset of the EUR volume 3 was finalized in mid 2009. New LWR projects of potential interest to the EUR utilities are being contemplated; for instance, a preliminary assessment of the compliance of MHI's APWR project was worked out in the first months of 2008. Recently the EUR organization has decided to launch coordinated actions with other industry groups and other stakeholders. In particular, the EUR and ENISS organizations have decided to join their efforts in their relations with the IAEA and WENRA with respect to LWR Gen3 designs. In addition, EUR and CORDEL (Cooperation in Reactor Design Evaluation and Licensing), a WNA (World Nuclear Association) working group, have decided to coordinate their efforts for the benefit of the industry, in relation to the MDEP (Multinational Design Evaluation Program) initiative of nuclear safety regulators. Contacts have also been initiated with ENEN and the WNU in order to develop new courses for young professionals. (orig.)

  18. National Maglev initiative: California line electric utility power system requirements

    Science.gov (United States)

    Save, Phil

    1994-05-01

    The electrical utility power system requirements were determined for a Maglev line from San Diego to San Francisco and Sacramento with a maximum capacity of 12,000 passengers an hour in each direction at a speed of 300 miles per hour, or one train every 30 seconds in each direction. Basically the Maglev line requires one 50-MVA substation every 12.5 miles. The need for new power lines to serve these substations and their voltage levels are based not only on equipment loading criteria but also on limitations due to voltage flicker and harmonics created by the Maglev system. The resulting power system requirements and their costs depend mostly on the geographical area, urban or suburban with 'strong' power systems, or mountains and rural areas with 'weak' power systems. A reliability evaluation indicated that emergency power sources, such as a 10-MW battery at each substation, were not justified if sufficient redundancy is provided in the design of the substations and the power lines serving them. With a cost of $5.6 M per mile, the power system requirements, including the 12-kV DC cables and the inverters along the Maglev line, were found to be the second largest cost component of the Maglev system, after the cost of the guideway system ($9.1 M per mile), out of a total cost of $23 M per mile.
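    The headline figures above admit a few simple consistency checks; all numbers are taken directly from the abstract:

```python
# Back-of-envelope checks on the reported Maglev figures.
passengers_per_hour = 12_000
trains_per_hour = 3600 // 30          # one train every 30 seconds
print(passengers_per_hour / trains_per_hour)   # → 100.0 passengers per train

power_cost = 5.6    # $M per mile, power system incl. DC cables and inverters
guideway_cost = 9.1 # $M per mile
total_cost = 23.0   # $M per mile
print(power_cost / total_cost)   # power system share of total cost, ≈ 0.24
```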

  19. Proline requirement for glucose utilization by Peptostreptococcus anaerobius ATCC 27337.

    Science.gov (United States)

    Curtis, M A; Wittenberger, C L; Thompson, J

    1987-02-01

    Resting cells of Peptostreptococcus anaerobius maintained under anaerobic conditions were unable to metabolize either glucose or alanine. The addition of proline to the appropriate suspension, however, resulted in the immediate utilization of both compounds. Fermentation of alanine by the cells required that stoichiometric concentrations of proline be present in the medium; and during the oxidation of alanine, proline was simultaneously reduced to the ring cleavage product delta-aminovaleric acid. Although proline was required to initiate glucose transport, stoichiometric amounts of the imino acid were not necessary for glucose fermentation. Proline also stimulated the uptake and concomitant phosphorylation of the nonmetabolizable glucose analog 2-deoxy-D-glucose. The proline requirement for glucose transport by P. anaerobius could be replaced by adding ferricyanide or simply by aerating the cell suspension. The initiation of sugar uptake by proline, ferricyanide, and O2 was attributed to the capacity of these compounds to function as electron acceptors, which permitted reoxidation of the (reduced) intracellular nucleotide pool and the formation (from an endogenous reserve) of the high-energy donor(s) required for the vectorial transport and phosphorylation of sugar.

  1. Continuous utility factor in segregation models.

    Science.gov (United States)

    Roy, Parna; Sen, Parongama

    2016-02-01

    We consider the constrained Schelling model of social segregation in which the utility factor of agents strictly increases and nonlocal jumps of the agents are allowed. In the present study, the utility factor u is defined in a way such that it can take continuous values and depends on the tolerance threshold as well as the fraction of unlike neighbors. Two models are proposed: in model A the jump probability is determined by the sign of u only, which makes it equivalent to the discrete model. In model B the actual values of u are considered. Model A and model B are shown to differ drastically as far as segregation behavior and phase transitions are concerned. In model A, although segregation can be achieved, the cluster sizes are rather small. Also, a frozen state is obtained in which steady states comprise many unsatisfied agents. In model B, segregated states with much larger cluster sizes are obtained. The correlation function is calculated to show quantitatively that larger clusters occur in model B. Moreover for model B, no frozen states exist even for very low dilution and small tolerance parameter. This is in contrast to the unconstrained discrete model considered earlier where agents can move even when utility remains the same. In addition, we also consider a few other dynamical aspects which have not been studied in segregation models earlier.
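    The difference between the two move rules can be sketched as follows; the linear utility function is an assumed illustrative form, not the paper's exact definition, with u > 0 when the fraction of unlike neighbors is below the tolerance threshold:

```python
# Sketch of the two move rules described above, for a hypothetical
# continuous utility factor u.
import random

def utility(frac_unlike, tolerance):
    """Continuous utility factor u (assumed linear form)."""
    return tolerance - frac_unlike

def accept_move(u_old, u_new, model, rng=random):
    """Constrained dynamics: utility must strictly increase.
    Model A looks only at the sign of the change; model B additionally
    weights acceptance by the magnitude of the new utility value."""
    if u_new <= u_old:          # constrained model: u must strictly increase
        return False
    if model == "A":            # sign only: any strict increase is accepted
        return True
    # model B: acceptance probability grows with the actual value of u_new
    return rng.random() < min(1.0, abs(u_new))
```

    Under model A all strict improvements are equivalent, which is why it reduces to the discrete model; under model B agents with only marginal gains move rarely, which is one plausible reading of how the actual values of u enter the dynamics.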

  2. Requirements for effective modelling strategies.

    NARCIS (Netherlands)

    Gaunt, J.L.; Riley, J.; Stein, A.; Penning de Vries, F.W.T.

    1997-01-01

    As a result of a recent BBSRC-funded workshop between soil scientists, modellers, statisticians and others to discuss issues relating to the derivation of complex environmental models, a set of modelling guidelines is presented and the associated required research areas are discussed.

  3. The linear utility model for optimal selection

    NARCIS (Netherlands)

    Mellenbergh, Gideon J.; van der Linden, Willem J.

    A linear utility model is introduced for optimal selection when several subpopulations of applicants are to be distinguished. Using this model, procedures are described for obtaining optimal cutting scores in subpopulations in quota-free as well as quota-restricted selection situations. The cutting

  4. Metabolic utilization of energy and maintenance requirements in lactating sows.

    Science.gov (United States)

    Noblet, J; Etienne, M

    1987-03-01

    Metabolizable energy (ME), heat production (measured by indirect calorimetry in respiration chambers), milk energy output and body energy mobilization were measured in 20 gilts (10 replicates of two littermates) during a 21-d lactation. Two energy levels were used: 14.2 and 10.4 Mcal ME/d per sow in the high energy (HE) and low energy (LE) groups, respectively. The daily supply of other nutrients in the diets was identical in both treatments. Measurements of metabolic rate and energy balance of the litters were carried out. These data were used to estimate the maintenance requirements of the sows (MEm) and the efficiencies of utilization of the energy of food (kl) and of body reserves (krl) for energy production in milk. Nitrogen balance of the sows was also determined. Energy mobilization was increased by energy restriction (-5.35 vs. -2.04 Mcal/d per sow for HE and LE gilts, respectively) and by the increase of milk production as lactation advanced. Energy restriction (LE vs. HE gilts) resulted in increased weight loss, consisting mainly of fat tissue depletion. Muscle depletion represented a rather large proportion of weight loss, even in sows fed the high energy level. Maintenance requirements amounted to 109 kcal ME per kg^0.75 per day. The estimates for kl and krl were 72 and 88%, respectively. These results show that the overall efficiency of energy storage during pregnancy and its mobilization during lactation (68.6 to 70.9%) is similar to that of direct utilization of ME during lactation.
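    The reported parameter estimates support a worked energy-partition example; the 200-kg body weight is an assumed figure, and the partition equation is a simplified illustration of the factorial approach described above, not the authors' fitted model:

```python
# Worked example using the parameter estimates reported above: maintenance
# requirement MEm = 109 kcal ME per kg metabolic weight (BW^0.75) per day,
# efficiency of dietary ME for milk kl = 0.72, efficiency of mobilized body
# reserves krl = 0.88. Body weight and the mobilization figure used here
# are illustrative choices.
body_weight = 200.0                       # kg, hypothetical lactating sow
me_intake = 14.2e3                        # kcal/d (high-energy level, 14.2 Mcal)
mobilized = 2.04e3                        # kcal/d body energy (one reported figure)

mem = 109 * body_weight ** 0.75           # maintenance, kcal ME/d
milk_energy = 0.72 * (me_intake - mem) + 0.88 * mobilized
print(round(mem), round(milk_energy))     # ≈ 5797 and ≈ 7845 kcal/d
```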

  5. The Utility of Ada for Army Modeling

    Science.gov (United States)

    1990-04-10

    The Utility of Ada for Army Modeling, by Colonel Michael L. Yocom; Individual Study Project. Distribution Statement A: approved for public release, distribution is unlimited. The language is named "Ada" for Ada Lovelace (1815-1852), a mathematician who worked with Charles Babbage on his difference and analytic engines.

  6. Modelling of biomass utilization for energy purpose

    Energy Technology Data Exchange (ETDEWEB)

    Grzybek, Anna (ed.)

    2010-07-01

    the overall farm structure, the division of each farm's land into several separate subfields, rural overpopulation, and very high employment in agriculture (about 27% of all employees in the national economy work in agriculture). Farmers have a low education level: in towns 34% of the population has secondary education, while in rural areas only 15-16% do, and less than 2% of rural inhabitants have higher education. The structure of land use is as follows: arable land 11.5%, meadows and pastures 25.4%, forests 30.1%. Poland requires the implementation of technical and technological progress to intensify agricultural production. Competition for agricultural land arises from maintaining the current level of consumption while allocating part of agricultural production to energy purposes; agricultural land is going to be a key factor in biofuel production. This publication presents research results from Project PL0073, 'Modelling of energetical biomass utilization for energy purposes', financed by the Norwegian Financial Mechanism and the European Economic Area Financial Mechanism. It aims to explain to the reader problems connected with the cultivation of energy crops and to dispel myths concerning them. Substituting biomass for fossil fuels in heat and electricity production could contribute significantly to reducing carbon dioxide emissions. Moreover, energy crops and the use of biomass for energy play an important role in diversifying agricultural production as rural areas are transformed, and broadening agricultural production enables the creation of new jobs. In the long term, sustainable development is going to be the fundamental principle of Polish agricultural evolution, and the use of biomass for energy fits well within this framework, especially at the local level. There are two facts. The first is the increase of interest in energy crops in Poland

  7. Estimation of peginesatide utilization requires patient-level data

    Directory of Open Access Journals (Sweden)

    Alex Yang

    2012-06-01

    Due to the nonlinear dose relationship between peginesatide and epoetin, facilities with similar epoetin use (<2% relative difference) had up to a 35% difference in estimated peginesatide use. For accurate estimation of peginesatide utilization, it is important to base conversions on the epoetin dose distribution rather than the mean epoetin dose.
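
    The sensitivity described above is an instance of Jensen's inequality: under a nonlinear conversion curve, converting a facility's mean epoetin dose gives a different answer than converting each patient's dose. The sketch below uses made-up doses and a made-up concave conversion function (not the real dosing table) to show the effect:

```python
def convert(epoetin_units):
    # Hypothetical concave dose-conversion curve (NOT the real dosing table)
    return epoetin_units ** 0.7

facility_a = [4000, 4000, 4000, 4000]   # uniform dosing
facility_b = [1000, 1000, 7000, 7000]   # same mean dose, wider spread

mean_a = sum(facility_a) / len(facility_a)
mean_b = sum(facility_b) / len(facility_b)
assert mean_a == mean_b                 # identical facility-level mean

# Estimate from the facility mean vs. the true patient-level total
est_from_mean = len(facility_b) * convert(mean_b)
true_total = sum(convert(d) for d in facility_b)
print(est_from_mean > true_total)  # True: the mean-based estimate is biased
```

    Two facilities with identical mean doses but different spreads thus receive different patient-level estimates, which is why the abstract calls for dose-distribution data.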

  8. The European utility requirements - purposes and requirements to be fulfilled by the next generation of LWRs

    International Nuclear Information System (INIS)

    Broecker, B.; Essmann, J.

    1995-01-01

    With the first big phase of nuclear power reactor engineering and construction having come to an end in West European countries, and with current manufacturer activity reduced to building a few plants to meet growing electricity demand or to replace retired power plants, the available market for manufacturers of nuclear power systems has become so small that the market of one country alone does not justify investment in the development of novel reactor types or design enhancements. In all countries of Western Europe, the facility operators are responsible for the safe and economically efficient operation of their nuclear power reactors, which is why they decided to jointly elaborate and present to manufacturers, supervising authorities, and the general public their requirements for the forthcoming generation of nuclear power plants. The resulting European Utility Requirements document specifies the details relating to engineered safety, reliability, operating performance, and economics of the reactors to be built by manufacturers for the European market. (orig./UA) [de

  9. Modeling utilization distributions in space and time

    Science.gov (United States)

    Keating, K.A.; Cherry, S.

    2009-01-01

    W. Van Winkle defined the utilization distribution (UD) as a probability density that gives an animal's relative frequency of occurrence in a two-dimensional (x, y) plane. We extend Van Winkle's work by redefining the UD as the relative frequency distribution of an animal's occurrence in all four dimensions of space and time. We then describe a product kernel model estimation method, devising a novel kernel from the wrapped Cauchy distribution to handle circularly distributed temporal covariates, such as day of year. Using Monte Carlo simulations of animal movements in space and time, we assess estimator performance. Although not unbiased, the product kernel method yields models highly correlated (Pearson's r = 0.975) with true probabilities of occurrence and successfully captures temporal variations in density of occurrence. In an empirical example, we estimate the expected UD in three dimensions (x, y, and t) for animals belonging to each of two distinct bighorn sheep (Ovis canadensis) social groups in Glacier National Park, Montana, USA. Results show the method can yield ecologically informative models that successfully depict temporal variations in density of occurrence for a seasonally migratory species. Some implications of this new approach to UD modeling are discussed. © 2009 by the Ecological Society of America.
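
    The circular kernel named above can be sketched directly from the wrapped Cauchy density, f(θ; μ, ρ) = (1 − ρ²) / (2π(1 + ρ² − 2ρ cos(θ − μ))). The code below is our own minimal illustration of a circular kernel density estimate, not the authors' estimator; the concentration ρ and the day-of-year data are invented:

```python
import math

def wrapped_cauchy_kde(theta, data, rho=0.9):
    """Circular kernel density at angle theta from angular observations,
    using the wrapped Cauchy density; rho in (0, 1) plays the role of
    the bandwidth (larger rho = a more concentrated kernel)."""
    def wc(t, mu):
        return (1 - rho**2) / (2 * math.pi * (1 + rho**2 - 2 * rho * math.cos(t - mu)))
    return sum(wc(theta, mu) for mu in data) / len(data)

# Day-of-year observations mapped onto the circle [0, 2*pi)
days = [80, 85, 90, 260, 265]
angles = [2 * math.pi * d / 365 for d in days]

density_spring = wrapped_cauchy_kde(2 * math.pi * 85 / 365, angles)
density_winter = wrapped_cauchy_kde(2 * math.pi * 10 / 365, angles)
print(density_spring > density_winter)  # True: mass concentrates near the data
```

    Because the kernel lives on the circle, days 1 and 365 are treated as neighbours, which an ordinary Gaussian kernel on the real line would get wrong.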

  10. Animal Models Utilized in HTLV-1 Research

    Directory of Open Access Journals (Sweden)

    Amanda R. Panfil

    2013-01-01

    Full Text Available Since the isolation and discovery of human T-cell leukemia virus type 1 (HTLV-1) over 30 years ago, researchers have utilized animal models to study HTLV-1 transmission, viral persistence, virus-elicited immune responses, and HTLV-1-associated disease development (ATL, HAM/TSP). Non-human primates, rabbits, rats, and mice have all been used to help understand HTLV-1 biology and disease progression. Non-human primates offer a model system that is phylogenetically similar to humans for examining viral persistence. Viral transmission, persistence, and immune responses have been widely studied using New Zealand White rabbits. The advent of molecular clones of HTLV-1 has offered the opportunity to assess the importance of various viral genes in rabbits, non-human primates, and mice. Additionally, over-expression of viral genes using transgenic mice has helped uncover the importance of Tax and Hbz in the induction of lymphoma and other lymphocyte-mediated diseases. HTLV-1 inoculation of certain strains of rats results in histopathological features and clinical symptoms similar to those of humans with HAM/TSP. Transplantation of certain types of ATL cell lines in immunocompromised mice results in lymphoma. Recently, “humanized” mice have been used to model ATL development for the first time. Not all HTLV-1 animal models develop disease, and those that do vary in consistency depending on the type of monkey, strain of rat, or even type of ATL cell line used. However, the progress made using animal models cannot be overstated, as it has led to insights into the mechanisms regulating viral replication, viral persistence, disease development, and, most importantly, model systems to test disease treatments.

  11. Utility models as a business inspiration

    Directory of Open Access Journals (Sweden)

    Dlask, Petr

    2016-06-01

    Full Text Available Nowadays, there are many possibilities and conditions for individual business ideas. At first glance, it may seem that restrictions come from not getting the necessary funds. The fact is that funds are only a secondary issue. The primary one is the quality of the business idea. If the idea is good and sufficiently inventive, raising the necessary investment resources is not a problem. Utility models registered through the Industrial Property Office present a broad potential for invention. The authors offer an insight into the submitted innovative practices associated with traditional building materials, such as stone. Its use in construction has a long-standing tradition, and new manufacturing and processing methods offer new opportunities for unconventional business.

  12. Modeling regulated water utility investment incentives

    Science.gov (United States)

    Padula, S.; Harou, J. J.

    2014-12-01

    This work attempts to model the infrastructure investment choices of privatized water utilities subject to rate-of-return and price-cap regulation. The goal is to understand how regulation influences water companies' investment decisions, such as their desire to engage in transfers with neighbouring companies. We formulate a profit-maximization capacity expansion model that finds the schedule of new supply, demand management and transfer schemes that maintains the annual supply-demand balance and maximizes a company's profit under the 2010-15 price control process in England. Regulatory incentives for cost savings are also represented in the model. These include the CIS scheme for capital expenditure (capex) and incentive allowance schemes for operating expenditure (opex). The profit-maximizing investment program (what to build, when, and at what size) is compared with the least-cost program (the social optimum). We apply this formulation to several water companies in South East England to model performance and sensitivity to water network particulars. Results show that if companies are able to outperform the regulatory assumption on the cost of capital, a capital bias can be generated, because capital expenditure, unlike opex, can be remunerated through the companies' regulatory capital value (RCV). The occurrence of the 'capital bias', and its magnitude, depend on the extent to which a company can finance its investments at a rate below the allowed cost of capital. The bias can be reduced by the regulatory penalties for underperformance on capital expenditure (the CIS scheme); sensitivity analysis can be applied by varying the CIS penalty to see how, and to what extent, this affects the capital bias. We show how regulatory changes could potentially be devised to partially remove the 'capital bias' effect. Solutions potentially include allowing for incentives on total expenditure rather than separately for capex and opex and allowing
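
    The capital bias mechanism described above can be illustrated with a deliberately tiny two-scheme example: the same supply-demand gap can be closed by an opex-heavy demand-management scheme or a capex-heavy supply scheme, and the ranking flips once capex earns an RCV return above its financing cost. All figures (costs, rates, the reward term) are invented for illustration and do not come from the paper's model:

```python
# (name, capacity in Ml/d, capex in £m, annual opex in £m) - all invented
schemes = [
    ("leakage_reduction", 30, 10.0, 4.0),   # opex-heavy demand management
    ("new_reservoir",     60, 80.0, 2.0),   # capex-heavy supply scheme
]

def annualised_cost(capex, opex, rate=0.05, life=30):
    # Equivalent annual cost at the allowed cost of capital, plus opex
    annuity = rate / (1 - (1 + rate) ** -life)
    return capex * annuity + opex

def rcv_adjusted_cost(capex, opex, allowed=0.05, actual=0.025, life=30):
    # If the firm finances capex below the allowed rate, the RCV return
    # rewards capex: subtract the annual outperformance margin on capex
    annuity = actual / (1 - (1 + actual) ** -life)
    return capex * annuity + opex - (allowed - actual) * capex

least_cost = min(schemes, key=lambda s: annualised_cost(s[2], s[3]))
profit_max = min(schemes, key=lambda s: rcv_adjusted_cost(s[2], s[3]))
print(least_cost[0], profit_max[0])  # the capex-heavy scheme wins under RCV
```

    Under the social (least-cost) objective the leakage scheme wins, but the RCV-adjusted objective favours the reservoir, which is the capital bias in miniature.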

  13. Utility of Small Animal Models of Developmental Programming.

    Science.gov (United States)

    Reynolds, Clare M; Vickers, Mark H

    2018-01-01

    Any effective strategy to tackle the global obesity and rising noncommunicable disease epidemic requires an in-depth understanding of the mechanisms that underlie these conditions that manifest as a consequence of complex gene-environment interactions. In this context, it is now well established that alterations in the early life environment, including suboptimal nutrition, can result in an increased risk for a range of metabolic, cardiovascular, and behavioral disorders in later life, a process preferentially termed developmental programming. To date, most of the mechanistic knowledge around the processes underpinning developmental programming has been derived from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. This review will cover the utility of small animal models in developmental programming, the limitations of such models, and potential future directions that are required to fully maximize information derived from preclinical models in order to effectively translate to clinical use.

  14. Minimizing the Naming Facilities Requiring Protection in a Computing Utility

    Science.gov (United States)

    1975-09-01

    because some security kernel mechanism happens to depend upon its correct operation. The motivation behind including a mechanism in the security ... will choose an access control list (ACL) based information protection scheme for our model. The basic motivation behind this choice is that Multics ... simpler and more efficient than Multics 24.2. More details of our design than were presented in the body of the thesis may be found in

  15. 17 CFR 210.3A-05 - Special requirements as to public utility holding companies.

    Science.gov (United States)

    2010-04-01

    ... Consolidated and Combined Financial Statements § 210.3A-05 Special requirements as to public utility holding companies. There shall be shown in the consolidated balance sheet of a public utility holding company the... SECURITIES AND EXCHANGE COMMISSION FORM AND CONTENT OF AND REQUIREMENTS FOR FINANCIAL STATEMENTS, SECURITIES...

  16. Utility operating strategy and requirements for wind power forecast

    Science.gov (United States)

    Dub, W.; Pape, H.

    1983-06-01

    The commitment of a generation system including wind energy conversion systems will be based on wind speed and wind power forecasts. Forecasts for time spans equal in length to the startup/shutdown times of conventional units will be of particular importance. The paper discusses forecast horizons of up to 3 hours and 6 hours, respectively. In addition, the problem of obtaining good wind speed forecasts is investigated by fitting time series models to wind speed data. Finally, the impact of hypothetical perfect forecasts on the commitment of intermediate load units is demonstrated by means of the wind power variations within spans of up to 3 hours.

  17. Mathematical models for estimating radio channels utilization when ...

    African Journals Online (AJOL)

    The definition of the radio channel utilization indicator is given. Mathematical models for assessing radio channel utilization during real-time flow transfer in a wireless self-organized network are presented. Experimental estimation results for average radio channel utilization productivity with and without buffering of ...

  18. General review of quality assurance system requirements. The utility or customer requirement

    International Nuclear Information System (INIS)

    Fowler, J.L.

    1976-01-01

    What are the customer's Quality Assurance requirements, and how does he convey these to his contractor or apply them to himself? Many documents have been prepared, mostly by countries with high technology availability, and it is significant to note that many of them, particularly those of the United States of America, were prepared for nuclear-safety-related plant; yet the logic of these documents applies equally to heavy engineering projects that are cost effective, and this is the current thinking and practice within the CEGB (Central Electricity Generating Board). Some documents have legislative backing, others rely on contractual disciplines, but they all appear to repeat the same basic requirements, so why does one continue to write more documents? The basic problem is that customers have to satisfy differing national legislative, economic and commercial requirements and, like all discerning customers, wish to reserve the right to satisfy their own needs, which are very often highly specialized. The CEGB is aware of this problem and is actively co-operating with most of the national and international authorities leading in this field, with a view to obtaining compatibility of requirements, but there still remains the problem of satisfying national custom and practice. (author)

  19. A Utility-Based Model for Determining Functional Requirement Target Values (Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas)

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
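
    The core computation the abstract describes, choosing an FR target that maximizes a quadratic utility, can be sketched in a few lines. The coefficients and feasible range below are invented for illustration, not taken from the paper's pencil example:

```python
def quadratic_utility(x, a=0.0, b=2.0, c=-0.1):
    # Concave (risk-averse) quadratic utility over the FR target value x
    return a + b * x + c * x * x

candidates = [i / 10 for i in range(0, 201)]   # feasible FR targets 0.0..20.0
best = max(candidates, key=quadratic_utility)
print(best)  # vertex of the parabola, -b / (2c) = 10.0
```

    In the full model, one such utility would be defined per FR and weighted with QFD-derived importance ratings before choosing the targets.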

  20. Dynamic Modeling and Simulation Tools for Utility Systems

    National Research Council Canada - National Science Library

    Hock, Vincent

    2002-01-01

    Utility systems are enablers for the force projection process. They provide the electricity, water, transportation fuel, heating, cooling, compressed air, and communications required for the various steps of force projection...

  1. Integration of photovoltaic units into electric utility grids: experiment information requirements and selected issues

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    A number of investigations, including those conducted by The Aerospace Corporation and other contractors, have led to the recognition of technical, economic, and institutional issues relating to the interface between solar electric technologies and electric utility systems. These issues derive from three attributes of solar electric power concepts, including (1) the variability and unpredictability of the solar resources, (2) the dispersed nature of those resources which suggests the feasible deployment of small dispersed power units, and (3) a high initial capital cost coupled with relatively low operating costs. It is imperative that these integration issues be pursued in parallel with the development of each technology if the nation's electric utility systems are to effectively utilize these technologies in the near to intermediate term. Analyses of three of these issues are presented: utility information requirements, generation mix and production cost impacts, and rate structures in the context of photovoltaic units integrated into the utility system. (WHK)

  2. Meta-requirements that Model Change

    OpenAIRE

    Gouri Prakash

    2010-01-01

    One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. While several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impact to project schedules, to developing an agile mindset towards requirements, the approach discussed in this paper is one of conceptualizing the delta in requirement and modeling it, in order t...

  3. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involving information technology used in nuclear power utilities. NITSL strives to maintain awareness of industry information-technology-related initiatives and events and communicates them to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility-generic functional requirements for eWP systems. This set of requirements will support each utility in identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members, of which the largest group consists of 19 commercial U.S. nuclear utilities, along with eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.

  4. Animal models of asthma: utility and limitations

    Directory of Open Access Journals (Sweden)

    Aun MV

    2017-11-01

    Full Text Available Marcelo Vivolo Aun,1,2 Rafael Bonamichi-Santos,1,2 Fernanda Magalhães Arantes-Costa,2 Jorge Kalil,1 Pedro Giavina-Bianchi1 1Clinical Immunology and Allergy Division, Department of Internal Medicine, University of São Paulo School of Medicine, São Paulo, Brazil, 2Laboratory of Experimental Therapeutics (LIM20, Department of Internal Medicine, University of Sao Paulo, Sao Paulo, Brazil Abstract: Clinical studies in asthma are not able to clear up all aspects of disease pathophysiology. Animal models have been developed to better understand these mechanisms and to evaluate both safety and efficacy of therapies before starting clinical trials. Several species of animals have been used in experimental models of asthma, such as Drosophila, rats, guinea pigs, cats, dogs, pigs, primates and equines. However, the most common species studied in the last two decades is mice, particularly BALB/c. Animal models of asthma try to mimic the pathophysiology of human disease. They classically include two phases: sensitization and challenge. Sensitization is traditionally performed by intraperitoneal and subcutaneous routes, but intranasal instillation of allergens has been increasingly used because human asthma is induced by inhalation of allergens. Challenges with allergens are performed through aerosol, intranasal or intratracheal instillation. However, few studies have compared different routes of sensitization and challenge. The causative allergen is another important issue in developing a good animal model. Despite being more traditional and leading to intense inflammation, ovalbumin has been replaced by aeroallergens, such as house dust mites, to use the allergens that cause human disease. Finally, researchers should define outcomes to be evaluated, such as serum-specific antibodies, airway hyperresponsiveness, inflammation and remodeling. The present review analyzes the animal models of asthma, assessing differences between species, allergens and routes

  5. Generic Model to Send Secure Alerts for Utility Companies

    Directory of Open Access Journals (Sweden)

    Perez–Díaz J.A.

    2010-04-01

    Full Text Available In some industries, such as logistics services, bank services, and others, automated systems that deliver critical business information anytime and anywhere play an important role in the decision-making process. This paper introduces a "Generic model to send secure alerts and notifications", which operates as a middleware between enterprise data sources and its mobile users. This model uses the Short Message Service (SMS) as its main mobile messaging technology, but is open to new types of messaging technologies. Our model is interoperable with existing information systems; it can store any kind of information about alerts or notifications at different levels of granularity, and it offers different types of notifications (as an alert when critical business problems occur, as a notification on a periodic basis, or as a two-way query). Notification rules can be customized by final users according to their preferences. The model provides a security framework for cases where information requires confidentiality, and it is extensible to existing and new messaging technologies (like e-mail, MMS, etc.). It is platform, mobile-operator and hardware independent. Currently, our solution is being used at the Comisión Federal de Electricidad (Mexico's utility company) to deliver secure alerts related to critical events registered in the main power generation plants of our country.
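
    The middleware idea above (per-user notification rules in front of pluggable messaging channels) can be sketched as follows. The class and field names are illustrative, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    user: str
    min_severity: int   # deliver only events at or above this level
    channel: str        # "sms" today; "email", "mms", ... later

def dispatch(event_severity, message, rules, senders):
    """Fan an event out to every user whose rule matches its severity."""
    delivered = []
    for rule in rules:
        if event_severity >= rule.min_severity:
            senders[rule.channel](rule.user, message)  # channel lookup
            delivered.append(rule.user)
    return delivered

# A fake SMS sender that just records what would have been transmitted
sent = []
senders = {"sms": lambda user, msg: sent.append(("sms", user, msg))}
rules = [Rule("operator1", 2, "sms"), Rule("manager1", 5, "sms")]

print(dispatch(3, "turbine trip at plant 7", rules, senders))  # ['operator1']
```

    Keeping the channel behind a dictionary of sender callables is what makes the model "open to new types of messaging technologies": adding e-mail means registering one more entry, not changing the dispatch logic.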

  6. Emergency department and inpatient health care utilization among patients who require interpreter services.

    Science.gov (United States)

    Njeru, Jane W; St Sauver, Jennifer L; Jacobson, Debra J; Ebbert, Jon O; Takahashi, Paul Y; Fan, Chun; Wieland, Mark L

    2015-05-29

    Limited English proficiency is associated with health disparities and suboptimal health outcomes. Although limited English proficiency is a barrier to effective health care, its association with inpatient health care utilization is unclear. The aim of this study was to examine the association between limited English proficiency and emergency department visits and hospital admissions. We compared emergency department visits and hospitalizations in 2012 between patients requiring interpreter services and age-matched English-proficient patients (who did not require interpreters), in a retrospective cohort study of adult patients actively empanelled to a large primary health care network in a medium-sized United States city (n = 3,784). Patients who required interpreter services had significantly more emergency department visits (841 vs 620; P ≤ .001) and hospitalizations (408 vs 343; P ≤ .001) than patients who did not require interpreter services. On regression analysis, the risk of a first emergency department visit was 60% higher for patients requiring interpreter services than for those who did not (unadjusted hazard ratio [HR], 1.6; 95% confidence interval [CI], 1.4-1.9; P ≤ .001). Patients requiring interpreter services had higher rates of inpatient health care utilization than patients who did not require an interpreter. Further research is required to understand factors associated with this utilization and to develop sociolinguistically tailored interventions to facilitate appropriate health care provision for this population.

  7. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on the functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied in a two-round survey to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified tool list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a level of agreement high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. The list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. 15 CFR 335.6 - Surrender, reallocation and license utilization requirement.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Surrender, reallocation and license utilization requirement. 335.6 Section 335.6 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE MISCELLANEOUS...

  9. Utility/user requirements for the modular high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Boyer, V.S.; Kendall, J.M.; Gotschall, H.L.

    1989-01-01

    This paper describes the approach used by Gas-Cooled Reactor Associates (GCRA) in developing Utility/User Requirements for the Modular High Temperature Gas-cooled Reactor (MHTGR). As representatives of the Utility/User industry, it is GCRA's goal that the MHTGR concept be established as an attractive nuclear option offering competitive economics and limited ownership risks. Commercially deployed MHTGR systems should then compete favorably in a mixed-fuel economy with options using fossil, other nuclear and other non-fossil sources. To achieve this goal, the design of the MHTGR plant must address the problems experienced by the U.S. industrial infrastructure during deployment of the first generation of nuclear plants. Indeed, it is GCRA's intent to utilize the characteristics of MHTGR technology for the development of a nuclear alternative that poses regulatory, financial and operational demands on the Owner/Operator that are, in aggregate, comparable to those encountered with non-nuclear options. The dominant risks faced by U.S. Utilities with current nuclear plants derive from their operational complexity and the degree of regulatory involvement in virtually all aspects of utility operations. The MHTGR approach of using ceramic fuel coatings to contain fission products provides the technical basis for simplification of the plant and stabilization of licensing requirements and thus the opportunity for reducing the risks of nuclear plant ownership. The paper describes the rationale for the selection of key requirements for public safety, plant size and performance, operations and maintenance, investment protection, economics and siting in the context of a risk management philosophy. It also describes the ongoing participation of the Utility/User in interpreting requirements, conducting program and design reviews and establishing priorities from the Owner/Operator perspective. (author). 7 refs, 1 fig

  10. Network bandwidth utilization forecast model on high bandwidth networks

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wuchert (William) [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-03-30

    With the increasing number of geographically distributed scientific collaborations and the scale of the data size growth, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area network. The forecast model can improve the efficiency of resource utilization and scheduling data movements on high-bandwidth network to accommodate ever increasing data volume for large-scale scientific data applications. Univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with traditional approach such as Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage change. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
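
    The decompose-then-forecast idea can be illustrated with a stdlib-only stand-in: remove a daily seasonal profile from synthetic hourly utilization, fit an AR(1) to the remainder, and forecast one step ahead. The paper uses STL and ARIMA; this sketch substitutes simple seasonal means and an AR(1) for brevity, and all data are synthetic rather than SNMP measurements:

```python
import math

period = 24  # hourly samples, daily seasonality
# Synthetic SNMP-like path utilization: a daily cycle plus slow growth
series = [50 + 30 * math.sin(2 * math.pi * (t % period) / period) + 0.1 * t
          for t in range(period * 14)]

# Seasonal component: mean utilization for each hour of day
seasonal = [sum(series[h::period]) / len(series[h::period]) for h in range(period)]
remainder = [x - seasonal[t % period] for t, x in enumerate(series)]

# AR(1) coefficient for the de-seasonalized remainder via lag-1 least squares
num = sum(remainder[t] * remainder[t - 1] for t in range(1, len(remainder)))
den = sum(r * r for r in remainder[:-1])
phi = num / den

t_next = len(series)
forecast = seasonal[t_next % period] + phi * remainder[-1]
print(round(forecast, 1))  # next-hour utilization estimate
```

    STL with ARIMA plays the same two roles far more robustly (trend/seasonal separation and short-range autocorrelation), which is where the reported reduction in computation time versus full Box-Jenkins model identification comes from.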

  11. Network Bandwidth Utilization Forecast Model on High Bandwidth Network

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl; Sim, Alex

    2014-07-07

    With the increasing number of geographically distributed scientific collaborations and the scale of the data size growth, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area network. The forecast model can improve the efficiency of resource utilization and scheduling data movements on high-bandwidth network to accommodate ever increasing data volume for large-scale scientific data applications. Univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with traditional approach such as Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage change. The accuracy of the forecast model is within the standard deviation of the monitored measurements.

  12. Kinetic models of cell growth, substrate utilization and bio ...

    African Journals Online (AJOL)

    Bio-decolorization kinetic studies of distillery effluent in a batch culture were conducted using Aspergillus fumigatus. A simple model was proposed using the Logistic Equation for the growth, Leudeking-Piret kinetics for bio-decolorization, and also for substrate utilization. The proposed models appeared to provide a suitable ...
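
    The two kinetic forms named above, logistic growth for biomass X and Leudeking-Piret kinetics for product formation (dP/dt = α·dX/dt + β·X), can be integrated with a simple Euler scheme. All parameter values below are illustrative, not fitted values from the study:

```python
# Parameters (illustrative): specific growth rate, carrying capacity,
# growth-associated and non-growth-associated product coefficients
mu_max, x_max = 0.2, 5.0      # 1/h, g/L
alpha, beta = 0.5, 0.01

dt, hours = 0.1, 48.0
x, p = 0.1, 0.0               # initial biomass and product (g/L)

for _ in range(int(hours / dt)):
    dx = mu_max * x * (1 - x / x_max)   # logistic growth rate
    p += (alpha * dx + beta * x) * dt   # Leudeking-Piret product formation
    x += dx * dt

print(round(x, 2), round(p, 2))  # biomass approaches x_max
```

    The same substrate-utilization term the abstract mentions would add a third state variable, S, consumed in proportion to both growth and maintenance.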

  13. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, Quality function deployment (QFD is one of the total quality management tools, where customers’ views and requirements are perceived and using various techniques improves the production requirements and operations. The QFD department, after identification and analysis of the competitors, takes customers’ feedbacks to meet the customers’ demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services for an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expression evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.

  14. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  15. Small turbines in distributed utility application: Natural gas pressure supply requirements

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, H.L.

    1996-05-01

    Implementing distributed utility (DU) concepts can strengthen the local distribution system and help avoid or delay the expense of upgrading transformers and feeders. The gas turbine-generator set is an attractive option based on its low front-end capital cost, reliable performance at unmanned stations, and environmental performance characteristics. This report assesses gas turbine utilization issues from the perspective of fuel supply pressure requirements and discusses both cost and operational factors. A primary operational consideration for siting gas turbines on the electric distribution system is whether the local gas distribution company (LDC) can supply gas at the required pressure. Currently available gas turbine engines require gas supply pressures of at least 150 pounds per square inch gauge (psig) and, more typically, 250 to 350 psig. Few LDCs maintain line pressure in excess of 125 psig. One option for meeting the gas pressure requirements is to upgrade or extend an existing pipeline and connect that pipeline to a high-pressure supply source, such as an interstate transmission line. However, constructing new pipeline is expensive, and the small volume of gas required by the turbine for the application offers little incentive for the LDC to provide this service. Another way to meet gas pressure requirements is to boost the compression of the fuel gas at the gas turbine site. Fuel gas booster compressors are readily available as stand-alone units and can satisfactorily increase the supply pressure to meet the turbine engine requirement. However, the life-cycle costs of this equipment are not inconsequential, and maintenance and reliability issues for boosters in this application are questionable and require further study. These factors may make the gas turbine option a less attractive solution in DU applications than first indicated by the $/kW capital cost alone. On the other hand, for some applications other DU technologies, such as photovoltaics, may be the more attractive option.

  16. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  17. Identification of lactose phosphotransferase systems in Lactobacillus gasseri ATCC 33323 required for lactose utilization.

    Science.gov (United States)

    Francl, Alyssa L; Hoeflinger, Jennifer L; Miller, Michael J

    2012-04-01

    Improving the annotation of sugar catabolism-related genes requires functional characterization. Our objective was to identify the genes necessary for lactose utilization by Lactobacillus gasseri ATCC 33323 (NCK334). The mechanism of lactose transport in many lactobacilli is a lactose/galactose-specific permease, yet no orthologue was found in NCK334. Characterization of an EI knockout strain [EI (enzyme I) is required for phosphotransferase system (PTS) transporter function] demonstrated that L. gasseri requires a PTS to utilize lactose. In order to determine which PTSs were necessary for lactose utilization, we compared transcript expression profiles in response to lactose for the 15 complete PTSs identified in the NCK334 genome. PTS 6CB (LGAS_343) and PTS 8C (LGAS_497) were induced in the presence of lactose 107- and 53-fold, respectively. However, the L. gasseri ATCC 33323 PTS 6CB, PTS 8C double-knockout strain had a growth rate similar to that of the wild type on semisynthetic deMan, Rogosa, Sharpe (MRS) medium with lactose. Expression profiling of L. gasseri ATCC 33323 PTS 6CB, PTS 8C in response to lactose identified PTS 9BC (LGAS_501) as 373-fold induced, whereas PTS 9BC was not induced in NCK334. Elimination of growth on lactose required the inactivation of both PTS 6CB and PTS 9BC. Among the six candidate phospho-β-galactosidase genes present in the NCK334 genome, LGAS_344 was found to be induced 156-fold in the presence of lactose. In conclusion, we have determined that: (1) NCK334 uses a PTS to import lactose; (2) PTS 6CB and PTS 8C gene expression is strongly induced by lactose; and (3) elimination of PTS 6CB and PTS 9BC is required to prevent growth on lactose.

  18. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  19. Spent fuel storage requirements for nuclear utilities and OCRWM [Office of Civilian Radioactive Waste Management

    International Nuclear Information System (INIS)

    Wood, T.W.

    1990-03-01

    Projected spent fuel generation at US power reactors exceeds estimated aggregate pool storage capacity by approximately 30,000 metric tons of uranium (MTU). Based on the current repository schedule, little of the spent fuel inventory will be disposed of prior to shutdown of existing reactors, and a large additional capacity for surface storage of spent fuel will be required, either at reactors or at a centralized DOE storage site. Allocation of this storage requirement across the utility-DOE interface, and the resulting implications for reactor sites and the performance of the federal waste management system, were studied during the DOE MRS System Study and again subsequent to the reassessment of the repository schedule. Spent fuel logistics and cost results from these analyses will be used in definition of spent fuel storage capacity requirements for the federal system. 9 refs., 8 figs., 1 tab

  20. Characterizing QALYs under a General Rank Dependent Utility Model

    NARCIS (Netherlands)

    H. Bleichrodt (Han); J. Quiggin (John)

    1997-01-01

    textabstractThis paper provides a characterization of QALYs, the most important outcome measure in medical decision making, in the context of a general rank dependent utility model. We show that both for chronic and for nonchronic health states the characterization of QALYs depends on intuitive

  1. Maximizing the model for Discounted Stream of Utility from ...

    African Journals Online (AJOL)

    Osagiede et al. (2009) considered an analytic model for maximizing a discounted stream of utility from consumption when the rate of production is linear. A solution was provided up to the point where methods of solving ordinary differential equations would be applied, but they left off there as a result of the mathematical complexity ...

  2. 78 FR 42889 - Pipeline Safety: Reminder of Requirements for Utility LP-Gas and LPG Pipeline Systems

    Science.gov (United States)

    2013-07-18

    ... Requirements for Utility LP-Gas and LPG Pipeline Systems AGENCY: Pipeline and Hazardous Materials Safety... Bulletin to remind owners and operators of liquefied petroleum gas (LPG) and utility liquefied petroleum... natural gas distribution system must meet the requirements of Part 192 and ANSI/NFPA 58 and 59 (2004) (192...

  3. Sustainable geothermal utilization - Case histories; definitions; research issues and modelling

    International Nuclear Information System (INIS)

    Axelsson, Gudni

    2010-01-01

    Sustainable development by definition meets the needs of the present without compromising the ability of future generations to meet their own needs. The Earth's enormous geothermal resources have the potential to contribute significantly to sustainable energy use worldwide as well as to help mitigate climate change. Experience from the use of numerous geothermal systems worldwide lasting several decades demonstrates that by maintaining production below a certain limit the systems reach a balance between net energy discharge and recharge that may be maintained for a long time (100-300 years). Modelling studies indicate that the effect of heavy utilization is often reversible on a time-scale comparable to the period of utilization. Thus, geothermal resources can be used in a sustainable manner either through (1) constant production below the sustainable limit, (2) step-wise increase in production, (3) intermittent excessive production with breaks, and (4) reduced production after a shorter period of heavy production. The long production histories that are available for low-temperature as well as high-temperature geothermal systems distributed throughout the world, provide the most valuable data available for studying sustainable management of geothermal resources, and reservoir modelling is the most powerful tool available for this purpose. The paper presents sustainability modelling studies for the Hamar and Nesjavellir geothermal systems in Iceland, the Beijing Urban system in China and the Olkaria system in Kenya as examples. Several relevant research issues have also been identified, such as the relevance of system boundary conditions during long-term utilization, how far reaching interference from utilization is, how effectively geothermal systems recover after heavy utilization and the reliability of long-term (more than 100 years) model predictions. (author)

  4. Energy Requirements of Hydrogen-utilizing Microbes: A Boundary Condition for Subsurface Life

    Science.gov (United States)

    Hoehler, Tori M.; Alperin, Marc J.; Albert, Daniel B.; Martens, Christopher S.

    2003-01-01

    Microbial ecosystems based on the energy supplied by water-rock chemistry carry particular significance in the context of geo- and astrobiology. With no direct dependence on solar energy, lithotrophic microbes could conceivably penetrate a planetary crust to a depth limited only by temperature or pressure constraints (several kilometers or more). The deep lithospheric habitat is thereby potentially much greater in volume than its surface counterpart, and in addition offers a stable refuge against inhospitable surface conditions related to climatic or atmospheric evolution (e.g., Mars) or even high-energy impacts (e.g., early in Earth's history). The possibilities for a deep microbial biosphere are, however, greatly constrained by life's need to obtain energy at a certain minimum rate (the maintenance energy requirement) and of a certain minimum magnitude (the energy quantum requirement). The mere existence of these requirements implies that a significant fraction of the chemical free energy available in the subsurface environment cannot be exploited by life. Similar limits may also apply to the usefulness of light energy at very low intensities or long wavelengths. Quantification of these minimum energy requirements in terrestrial microbial ecosystems will help to establish a criterion of energetic habitability that can significantly constrain the prospects for life in Earth's subsurface, or on other bodies in the solar system. Our early work has focused on quantifying the biological energy quantum requirement for methanogenic archaea, as representatives of a plausible subsurface metabolism, in anoxic sediments (where energy availability is among the most limiting factors in microbial population growth). In both field and laboratory experiments utilizing these sediments, methanogens retain a remarkably consistent free energy intake, in the face of fluctuating environmental conditions that affect energy availability. The energy yields apparently required by

  5. CONTROVERSIES REGARDING THE UTILIZATION OF ALTMAN MODEL IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Mihaela ONOFREI

    2012-06-01

    Full Text Available The Altman model was built for U.S. companies, based on the characteristics of that economy. Promising results were obtained in other countries such as Britain, Australia, Canada, Finland, Germany, Israel, Norway, India and South Korea, with predictability above 80%. However, as can be seen, these countries have Anglo-Saxon legal systems and highly developed economic environments. While there is no reason why this model cannot be applied to companies anywhere in the world, we recognize that each economic environment has its own peculiarities; therefore, local forecasting models could perform better than American models, at least in their testing phase. But is the Altman model suitable for the Romanian economy? Taking this into account, the purpose of this paper is to test the Altman model on the Romanian market.

  6. Modeling the Dynamic Interrelations between Mobility, Utility, and Land Asking Price

    Science.gov (United States)

    Hidayat, E.; Rudiarto, I.; Siegert, F.; Vries, W. D.

    2018-02-01

    Limited and insufficient information about the dynamic interrelations among mobility, utility, and land price is the main reason to conduct this research. Several studies, with several approaches and variables, have been conducted so far to model land price. However, most of these models appear to generate primarily static land prices. Thus, research is required to compare, design, and validate different models that calculate and/or compare the interrelational changes of mobility, utility, and land price. The applied method is a combination of literature review, expert interviews, and statistical analysis. The result is a newly improved mathematical model that has been validated and is suitable for the case study location. This improved model consists of 12 appropriate variables. The model can be implemented in Salatiga city, the case study location, in order to support better land use planning and mitigate uncontrolled urban growth.

  7. Modeling utility-scale wind power plants, part 1: Economics

    Energy Technology Data Exchange (ETDEWEB)

    Milligan, M.

    2000-06-29

    As the worldwide use of wind turbine generators continues to increase in utility-scale applications, it will become increasingly important to assess the economic and reliability impact of these intermittent resources. Although the utility industry in the United States appears to be moving towards a restructured environment, basic economic and reliability issues will continue to be relevant to companies involved with electricity generation. This paper is the first of two that address modeling approaches and results obtained in several case studies and research projects at the National Renewable Energy Laboratory (NREL). This first paper addresses the basic economic issues associated with electricity production from several generators that include large-scale wind power plants. An important part of this discussion is the role of unit commitment and economic dispatch in production-cost models. This paper includes overviews and comparisons of the prevalent production-cost modeling methods, including several case studies applied to a variety of electric utilities. The second paper discusses various methods of assessing capacity credit and results from several reliability-based studies performed at NREL.
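
    The economic-dispatch step at the heart of the production-cost models this record discusses can be sketched as a merit-order stack: near-zero-marginal-cost wind is taken first, then thermal units in order of marginal cost until load is met. The unit data below are invented for illustration; real models also handle unit commitment, minimum loads, and outages.

```python
# Illustrative merit-order economic dispatch with a wind plant.
# thermal_units: list of (name, capacity_mw, marginal_cost_per_mwh).

def dispatch(load_mw, wind_mw, thermal_units):
    served = min(wind_mw, load_mw)
    schedule, cost = [("wind", served, 0.0)], 0.0
    remaining = load_mw - served
    for name, cap, mc in sorted(thermal_units, key=lambda u: u[2]):
        take = min(cap, remaining)
        if take > 0:
            schedule.append((name, take, mc))
            cost += take * mc
            remaining -= take
    return schedule, cost

units = [("coal", 300, 25.0), ("gas_ct", 150, 60.0), ("gas_cc", 200, 40.0)]
sched, cost = dispatch(load_mw=500, wind_mw=120, thermal_units=units)
print(sched, cost)
```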

  8. Creating a Linear Model to Optimize Satellite Communication Bandwidth Utilization

    National Research Council Canada - National Science Library

    Stone, David A

    2006-01-01

    .... The paper then presents an example of a linear model that could be expanded for implementation and used for actual problem analysis. The final section of the paper describes areas that require further study and additional steps that must be taken to convert the concept presented in this paper to an actual model suitable for use.
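
    The simplest instance of such a linear model, maximizing priority-weighted throughput of competing circuits under one shared bandwidth cap, reduces to a fractional knapsack whose LP optimum is the greedy allocation below. This is a sketch under that single-constraint assumption, with invented circuit data; the paper's full model has more constraints and would need an LP solver.

```python
# Greedy allocation = LP optimum for one capacity constraint:
# maximize sum(priority_i * x_i) s.t. sum(x_i) <= capacity, 0 <= x_i <= demand_i.

def allocate(capacity_mbps, circuits):
    # circuits: list of (name, demand_mbps, priority_weight)
    plan, left = {}, capacity_mbps
    for name, demand, _ in sorted(circuits, key=lambda c: -c[2]):
        plan[name] = min(demand, left)  # fill highest-priority demand first
        left -= plan[name]
    return plan

circuits = [("voice", 10, 5.0), ("video", 40, 3.0), ("bulk", 80, 1.0)]
print(allocate(capacity_mbps=60, circuits=circuits))
```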

  9. PPE Surface Proteins Are Required for Heme Utilization by Mycobacterium tuberculosis

    Directory of Open Access Journals (Sweden)

    Avishek Mitra

    2017-01-01

    Full Text Available Iron is essential for replication of Mycobacterium tuberculosis, but iron is efficiently sequestered in the human host during infection. Heme constitutes the largest iron reservoir in the human body and is utilized by many bacterial pathogens as an iron source. While heme acquisition is well studied in other bacterial pathogens, little is known in M. tuberculosis. To identify proteins involved in heme utilization by M. tuberculosis, a transposon mutant library was screened for resistance to the toxic heme analog gallium(III)-porphyrin (Ga-PIX). Inactivation of the ppe36, ppe62, and rv0265c genes resulted in resistance to Ga-PIX. Growth experiments using isogenic M. tuberculosis deletion mutants showed that PPE36 is essential for heme utilization by M. tuberculosis, while the functions of PPE62 and Rv0265c are partially redundant. None of the genes restored growth of the heterologous M. tuberculosis mutants, indicating that the proteins encoded by the genes have separate functions. PPE36, PPE62, and Rv0265c bind heme as shown by surface plasmon resonance spectroscopy and are associated with membranes. Both PPE36 and PPE62 proteins are cell surface accessible, while the Rv0265c protein is probably located in the periplasm. PPE36 and PPE62 are, to our knowledge, the first proline-proline-glutamate (PPE) proteins of M. tuberculosis that bind small molecules and are involved in nutrient acquisition. The absence of a virulence defect of the ppe36 deletion mutant indicates that the different iron acquisition pathways of M. tuberculosis may substitute for each other during growth and persistence in mice. The emerging model of heme utilization by M. tuberculosis as derived from this study is substantially different from those of other bacteria.

  10. Risk Decision Making Model for Reservoir Floodwater resources Utilization

    Science.gov (United States)

    Huang, X.

    2017-12-01

    Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. In order to utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safety discharge are estimated. Based on the principle of minimum risk and maximum benefit of FRU, a multi-objective risk decision-making model for FRU is constructed. Probability theory and mathematical statistics are used to calculate the risk rate; the C-D production function method and the emergy analysis method are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision-making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution for FRU of the Shilianghe reservoir is found by using the risk decision-making model, and the validity and applicability of the model are verified.
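
    A risk rate like "probability of exceeding the design flood water level" can be estimated by Monte Carlo simulation, as sketched below. The lognormal flood-rise model and every number here are assumptions for illustration only, not the paper's hydrology.

```python
# Monte Carlo sketch of an exceedance risk rate: simulate flood-induced
# rises on top of the stored floodwater level and count exceedances of
# the design level. Distribution and parameters are invented.

import random

def exceedance_risk(design_level, stored_depth, n=100_000, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        flood_rise = rng.lognormvariate(0.0, 0.5)   # random rise in meters
        if stored_depth + flood_rise > design_level:
            hits += 1
    return hits / n

# Storing more floodwater raises the starting level and hence the risk rate.
print(exceedance_risk(design_level=4.0, stored_depth=1.0))
print(exceedance_risk(design_level=4.0, stored_depth=2.0))
```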

  11. Hedonic travel cost and random utility models of recreation

    Energy Technology Data Exchange (ETDEWEB)

    Pendleton, L. [Univ. of Southern California, Los Angeles, CA (United States); Mendelsohn, R.; Davis, E.W. [Yale Univ., New Haven, CT (United States). School of Forestry and Environmental Studies

    1998-07-09

    Micro-economic theory began as an attempt to describe, predict and value the demand and supply of consumption goods. Quality was largely ignored at first, but economists have started to address quality within the theory of demand and specifically the question of site quality, which is an important component of land management. This paper demonstrates that hedonic and random utility models emanate from the same utility theoretical foundation, although they make different estimation assumptions. Using a theoretically consistent comparison, both approaches are applied to examine the quality of wilderness areas in the Southeastern US. Data were collected on 4778 visits to 46 trails in 20 different forest areas near the Smoky Mountains. Visitor data came from permits and an independent survey. The authors limited the data set to visitors from within 300 miles of the North Carolina and Tennessee border in order to focus the analysis on single purpose trips. When consistently applied, both models lead to results with similar signs but different magnitudes. Because the two models are equally valid, recreation studies should continue to use both models to value site quality. Further, practitioners should be careful not to make simplifying a priori assumptions which limit the effectiveness of both techniques.
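
    The random utility side of the comparison is typically estimated as a multinomial logit over sites, as sketched below. The coefficients, trail names, and quality index are invented for illustration; the study's actual specification and estimates are not reproduced here.

```python
# Sketch of a random utility (multinomial logit) site-choice model:
# choice probability depends on travel cost and a site quality index.

import math

def logit_probabilities(sites, beta_cost=-0.05, beta_quality=0.8):
    # sites: list of (name, travel_cost, quality_index)
    utilities = [beta_cost * c + beta_quality * q for _, c, q in sites]
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return {name: e / total for (name, _, _), e in zip(sites, exps)}

trails = [("ridge", 30.0, 4.0), ("falls", 60.0, 5.0), ("meadow", 20.0, 2.0)]
print(logit_probabilities(trails))
```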

  12. Animal models of myasthenia gravis: utility and limitations

    Science.gov (United States)

    Mantegazza, Renato; Cordiglieri, Chiara; Consonni, Alessandra; Baggi, Fulvio

    2016-01-01

    Myasthenia gravis (MG) is a chronic autoimmune disease caused by the immune attack of the neuromuscular junction. Antibodies directed against the acetylcholine receptor (AChR) induce receptor degradation, complement cascade activation, and postsynaptic membrane destruction, resulting in functional reduction in AChR availability. Besides anti-AChR antibodies, other autoantibodies are known to play pathogenic roles in MG. The experimental autoimmune MG (EAMG) models have been of great help over the years in understanding the pathophysiological role of specific autoantibodies and T helper lymphocytes and in suggesting new therapies for prevention and modulation of the ongoing disease. EAMG can be induced in mice and rats of susceptible strains that show clinical symptoms mimicking the human disease. EAMG models are helpful for studying both the muscle and the immune compartments to evaluate new treatment perspectives. In this review, we concentrate on recent findings on EAMG models, focusing on their utility and limitations. PMID:27019601

  13. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been performed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models are examined which can comply with the stated capabilities. The data sources being used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  14. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self-evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes, and standards such as the Open Modelling Interface (the OpenMI) allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  15. Cost of energy from utility-owned solar electric systems. A required revenue method for ERDA/EPRI evaluations

    Energy Technology Data Exchange (ETDEWEB)

    1976-06-01

    This methodology calculates the electric energy busbar cost from a utility-owned solar electric system. This approach is applicable to both publicly- and privately-owned utilities. Busbar cost represents the minimum price per unit of energy consistent with producing system-resultant revenues equal to the sum of system-resultant costs. This equality is expressed in present value terms, where the discount rate used reflects the rate of return required on invested capital. Major input variables describe the output capabilities and capital cost of the energy system, the cash flows required for system operation and maintenance, and the financial structure and tax environment of the utility.
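
    The required-revenue idea in this record reduces to finding the constant price per kWh whose discounted revenues equal discounted costs, as in the minimal sketch below. The input values are illustrative, not from the report, and real evaluations would also include taxes and financing structure.

```python
# Minimal required-revenue busbar cost: the $/kWh price such that the
# present value of revenues equals the present value of capital + O&M.

def busbar_cost(capital, annual_om, annual_kwh, discount_rate, years):
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, years + 1))
    pv_costs = capital + annual_om * pv_factor
    pv_energy = annual_kwh * pv_factor
    return pv_costs / pv_energy          # PV(revenue) = PV(cost) at this price

# e.g. a $50M plant, $1M/yr O&M, 40 GWh/yr, 10% discount rate, 30-year life
print(round(busbar_cost(50e6, 1e6, 40e6, 0.10, 30), 4))
```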

  16. European Utility Requirements: leveling the European electricity producers' playing ground for new NPPs

    International Nuclear Information System (INIS)

    Bernard Roche

    2006-01-01

    Full text of publication follows: Since 1992, the European Utility Requirements (EUR) document has been developed by the major European electricity producers. The main driver of this work has been the construction of a unified European market. The electricity producers have set out design requirements adapted to this new European environment, while keeping in mind experience feedback from operating NPPs worldwide. The EUR document is now fully operational, and its set of generic requirements has recently been used as a bid specification in Finland and in China. The EUR document keeps developing in two directions: 1- completing the assessment of the projects that could be proposed by the vendors for the European market. Five projects were assessed between 1999 and 2002: BWR90, EPR, EP1000, ABWR and SWR1000. Two new projects are being assessed, the Westinghouse AP1000 and the Russian VVER AES92. It is currently planned to publish these two new assessments in the first half of 2006. Others may be undertaken meanwhile. 2- revision of the generic requirements. A revision C of volume 4, dedicated to the power generation plant, is being completed. It includes responses to vendors' comments and feedback from the TVO call for bids for Finland 5. A revision D of volumes 1 and 2, dedicated to the nuclear island, is foreseen. The main contributions to this revision are the ongoing European harmonization actions on nuclear safety (the WENRA study on reactor safety harmonization, EC work, evolution of the IAEA guides and requirements), on the conditions of connection to the European HV grid, and on other matters such as codes and standards. This has given a unified frame in which future nuclear plants can be designed and built. In this frame, the development of standard designs usable throughout Europe without major design changes becomes possible, thus helping to increase competition and, ultimately, to reduce investment and operating costs.

  17. European utility requirements: common rules to design next LWR plants in an open electricity market

    International Nuclear Information System (INIS)

    Berbey, Pierre; Ingemarsson, Karl-Fredrik

    2004-01-01

    The major European electricity producers want to remain able to build new nuclear power plants, and they believe third-generation LWRs would be the response best adapted to their needs in the first decades of this century. Producing a common European Utility Requirements (EUR) document has been one of the basic tasks towards this objective. In this common frame, standardized and competitive LWR NPPs could be developed and offered to investors. This idea is now well supported by all the other actors on the European electricity market: vendors, regulators, grid managers and administrations, even though, in the emerging competitive and unified European electricity market, the electricity producers' stakes differ more and more from those of the other electricity business actors. The near-term objectives of the electricity producers involved in EUR are focused on negotiating common rules of the game together with the regulators. This covers the nuclear safety approaches, the conditions required to connect a plant to a HV grid, as well as the design standards. Discussions are going on between the EUR organization and the corresponding bodies to develop stable and predictable design rules that would meet the constraints of nuclear electricity generation in this new environment. Finally, there can be no competition without competitors. The EUR organization has proven to be the right place to establish trusting relationships between the vendors and their potential customers, through fair assessment of the proposed designs' performance against the utility needs. This will be continued and developed with the main vendors present in Europe, so as to keep alive a list of 4 to 6 'qualified' designs, i.e. designs showing an acceptably low level of non-compliance with EUR. (authors)

  18. DIAMOND: A model of incremental decision making for resource acquisition by electric utilities

    Energy Technology Data Exchange (ETDEWEB)

    Gettings, M.; Hirst, E.; Yourstone, E.

    1991-02-01

    Uncertainty is a major issue facing electric utilities in planning and decision making. Substantial uncertainties exist concerning future load growth; the lifetimes and performances of existing power plants; the construction times, costs, and performances of new resources being brought online; and the regulatory and economic environment in which utilities operate. This report describes a utility planning model that focuses on frequent and incremental decisions. The key features of this model are its explicit treatment of uncertainty, frequent user interaction with the model, and the ability to change prior decisions. The primary strength of this model is its representation of the planning and decision-making environment that utility planners and executives face. Users interact with the model after every year or two of simulation, which provides an opportunity to modify past decisions as well as to make new decisions. For example, construction of a power plant can be started one year, and if circumstances change, the plant can be accelerated, mothballed, canceled, or continued as originally planned. Similarly, the marketing and financial incentives for demand-side management programs can be changed from year to year, reflecting the short lead time and small unit size of these resources. This frequent user interaction with the model, an operational game, should build greater understanding and insights among utility planners about the risks associated with different types of resources. The model is called DIAMOND, Decision Impact Assessment Model. It consists of four submodels: FUTURES, FORECAST, SIMULATION, and DECISION. It runs on any IBM-compatible PC and requires no special software or hardware. 19 refs., 13 figs., 15 tabs.
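    DIAMOND itself is a PC program, but the decide-simulate-revise loop it embodies can be sketched in a few lines. The toy model below (all parameters hypothetical, not taken from the report) shows how a yearly decision point lets a planner start, continue, or cancel a plant as uncertain load growth unfolds:

```python
import random

def simulate_incremental_planning(years=10, seed=1):
    """Toy sketch of a DIAMOND-style loop: each year, observe uncertain
    load growth, then revise a prior build decision (start, continue,
    cancel, or bring online a plant under construction)."""
    random.seed(seed)
    load = 100.0            # MW, initial demand (hypothetical)
    capacity = 110.0        # MW, existing capacity (hypothetical)
    plant_progress = None   # years of construction completed, or None
    log = []
    for year in range(years):
        load *= 1.0 + random.uniform(-0.01, 0.05)  # uncertain growth
        margin = capacity - load
        if plant_progress is None and margin < 10.0:
            plant_progress = 0            # reserve margin thin: start build
            decision = "start plant"
        elif plant_progress is not None:
            plant_progress += 1
            if plant_progress >= 5:       # assumed 5-year lead time
                capacity += 50.0          # plant comes online
                plant_progress = None
                decision = "plant online"
            elif margin > 25.0:
                plant_progress = None     # circumstances changed: cancel
                decision = "cancel plant"
            else:
                decision = "continue build"
        else:
            decision = "no action"
        log.append((year, round(load, 1), round(capacity, 1), decision))
    return log

history = simulate_incremental_planning()
for row in history:
    print(row)
```

    Fixing the random seed makes the run repeatable; varying it plays out different "futures", which is the spirit of the FUTURES/DECISION interplay described above.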

  19. Orlistat for the treatment of obesity: cost utility model.

    Science.gov (United States)

    Foxcroft, D R

    2005-11-01

    This study aimed to assess the cost utility of orlistat treatment based on (i) criteria from recent guidance from the National Institute for Clinical Excellence (NICE) for England and Wales (treatment discontinued if weight loss < 5% at 3 months; and < 10% at 6 months); and (ii) alternative criteria from the European Agency for the Evaluation of Medicinal Products (EMEA) licence for orlistat prescription in the European Community (treatment discontinued if weight loss < 5% at 3 months). Subjects were 1398 obese individuals who participated in three large European Phase III trials of orlistat treatment for adults (BMI 28-47 kg/m²). Measures were: response to treatment in orlistat and placebo treatment groups; health benefit expressed as quality adjusted life years (QALYs) gained associated with weight loss; costs associated with orlistat treatment. In the cost utility model with multiway sensitivity analysis, the cost/QALY gained using the NICE criteria was estimated to be £24,431 (sensitivity analysis range: £10,856 to £77,197). The cost/QALY gained using the alternative EMEA criteria was estimated to be £19,005 (range: £8,840 to £57,798). In conclusion, NICE guidance for the continued use of orlistat was supported in this updated cost utility model, comparing favourably with a previously published estimate of £45,881 per QALY gained. Moreover, the value for money of orlistat treatment is improved further if EMEA treatment criteria for continued orlistat treatment are applied. The EMEA criteria should be considered in any future changes to the NICE guidance or in guidance issued by similar agencies.
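    The cost/QALY figures above are incremental cost-effectiveness ratios (ICERs): the extra cost of treatment divided by the extra QALYs gained versus the comparator. A minimal sketch of the arithmetic, with purely hypothetical inputs rather than the trial's actual data:

```python
def cost_per_qaly(treatment_cost, comparator_cost,
                  treatment_qalys, comparator_qalys):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    delta_cost = treatment_cost - comparator_cost
    delta_qaly = treatment_qalys - comparator_qalys
    if delta_qaly <= 0:
        raise ValueError("treatment must gain QALYs for a cost/QALY figure")
    return delta_cost / delta_qaly

# Hypothetical inputs for illustration only (not the trial's values):
# the treatment arm costs 700 more per patient and gains 0.03 QALYs.
icer = cost_per_qaly(1200.0, 500.0, 0.85, 0.82)
print(f"£{icer:,.0f} per QALY gained")
```

    Sensitivity ranges like those quoted come from re-running this division over plausible spreads of the cost and QALY inputs.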

  20. Animal models of GM2 gangliosidosis: utility and limitations

    Directory of Open Access Journals (Sweden)

    Lawson CA

    2016-07-01

    Cheryl A Lawson (Department of Pathobiology; Scott-Ritchey Research Center) and Douglas R Martin (Scott-Ritchey Research Center; Department of Anatomy, Physiology and Pharmacology), Auburn University College of Veterinary Medicine, Auburn, AL, USA. Abstract: GM2 gangliosidosis, a subset of lysosomal storage disorders, is caused by a deficiency of the glycohydrolase, β-N-acetylhexosaminidase, and includes the closely related Tay–Sachs and Sandhoff diseases. The enzyme deficiency prevents the normal, stepwise degradation of ganglioside, which accumulates unchecked within the cellular lysosome, particularly in neurons. As a result, individuals with GM2 gangliosidosis experience progressive neurological diseases including motor deficits, progressive weakness and hypotonia, decreased responsiveness, vision deterioration, and seizures. Mice and cats are well-established animal models for Sandhoff disease, whereas Jacob sheep are the only known laboratory animal model of Tay–Sachs disease to exhibit clinical symptoms. Since the human diseases are relatively rare, animal models are indispensable tools for further study of pathogenesis and for development of potential treatments. Though no effective treatments for gangliosidoses currently exist, animal models have been used to test promising experimental therapies. Herein, the utility and limitations of gangliosidosis animal models and how they have contributed to the development of potential new treatments are described. Keywords: GM2 gangliosidosis, Tay–Sachs disease, Sandhoff disease, lysosomal storage disorder, sphingolipidosis, brain disease

  1. Animal models of GM2 gangliosidosis: utility and limitations.

    Science.gov (United States)

    Lawson, Cheryl A; Martin, Douglas R

    2016-01-01

    GM2 gangliosidosis, a subset of lysosomal storage disorders, is caused by a deficiency of the glycohydrolase, β-N-acetylhexosaminidase, and includes the closely related Tay-Sachs and Sandhoff diseases. The enzyme deficiency prevents the normal, stepwise degradation of ganglioside, which accumulates unchecked within the cellular lysosome, particularly in neurons. As a result, individuals with GM2 gangliosidosis experience progressive neurological diseases including motor deficits, progressive weakness and hypotonia, decreased responsiveness, vision deterioration, and seizures. Mice and cats are well-established animal models for Sandhoff disease, whereas Jacob sheep are the only known laboratory animal model of Tay-Sachs disease to exhibit clinical symptoms. Since the human diseases are relatively rare, animal models are indispensable tools for further study of pathogenesis and for development of potential treatments. Though no effective treatments for gangliosidoses currently exist, animal models have been used to test promising experimental therapies. Herein, the utility and limitations of gangliosidosis animal models and how they have contributed to the development of potential new treatments are described.

  2. ON THE UTILITY OF SORNETTE’S CRASH PREDICTION MODEL

    Directory of Open Access Journals (Sweden)

    IOAN ROXANA

    2015-10-01

    Stock market crashes have been a constant subject of interest among capital market researchers. Crash behavior has been studied extensively, but the problem that remained unsolved until recently was that of a prediction algorithm. Stock market crashes are complex and global events, rarely confined to a single national capital market. They usually occur simultaneously on several, if not most, capital markets, inflicting heavy losses on investors. Investments made within various stock markets play an extremely important role in the global economy, influencing people's lives in many ways. Stock market crashes are presently studied with great interest, not only because a deep understanding of the phenomenon is necessary, but also because these crashes belong to the so-called category of "extreme phenomena". These are the main reasons that led scientists to try to build mathematical models for crash prediction. Such a model was built by Professor Didier Sornette, inspired by and adapted from an earthquake prediction model. Still, the model keeps many characteristics of its predecessor and is not fully adapted to economic realities and demands, or to the stock market's characteristics. This paper attempts to test the utility of the model in predicting price falls on the Bucharest Stock Exchange, as well as the possibility of it being successfully used by investors.

  3. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and its utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application.
To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  4. Energy Requirements of Hydrogen-Utilizing Microbes: Boundary Condition for Subsurface Life

    Science.gov (United States)

    Hoehler, Tori M.; Alperin, Marc J.; Albert, Daniel B.; Martens, Christopher S.; DeVincenzi, Donald (Technical Monitor)

    2002-01-01

    For planetary bodies with surface conditions that are too harsh to permit continuous occupation by life, the deep subsurface offers a potentially stable and habitable niche. For organisms occupying this niche, the spectrum of possible metabolisms must be limited to those which do not include sunlight as an energy source or oxygen as a chemical reagent - generally, low-energy anaerobic oxidation-reduction processes. The quantity of energy released in such processes is critical, because currently understood mechanisms of biological energy conservation indicate that energy is only 'useful' to an organism when it is available at a certain minimum level - the 'biological energy quantum' (BEQ). The mere existence of a BEQ implies that a significant fraction of the chemical energy present in the environment cannot be exploited by life; similarly, the absolute magnitude of the BEQ must be a key variable in determining the potential viability and distribution of subsurface microbial communities. Laboratory culture studies suggest that organisms require an energy of about -20 kJ/mol to grow. However, we find that hydrogen-utilizing microorganisms in an energy-limited natural ecosystem are active with energy yields as low as -10 kJ/mol. A lower BEQ would mean a significantly expanded range of energetically viable subsurface habitat for life.
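    Energy yields like the -20 and -10 kJ/mol quoted above are in situ Gibbs energies, ΔG = ΔG° + RT ln Q. A sketch for hydrogenotrophic methanogenesis (CO2 + 4 H2 → CH4 + 2 H2O), using an approximate textbook standard-state value and entirely hypothetical partial pressures (none of these numbers are from the paper):

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol.K)

def gibbs_energy(dG0, T, p_ch4, p_co2, p_h2):
    """In situ Gibbs energy for CO2 + 4 H2 -> CH4 + 2 H2O,
    taking water activity ~ 1 and using partial pressures (atm)
    as activities. dG0 is the standard-state value (kJ/mol) at T (K)."""
    Q = p_ch4 / (p_co2 * p_h2**4)   # reaction quotient
    return dG0 + R * T * math.log(Q)

# Hypothetical conditions for illustration: dG0 of roughly -131 kJ/mol
# at 25 C, with the trace H2 levels typical of energy-limited sediments.
dG = gibbs_energy(-131.0, 298.15, p_ch4=0.5, p_co2=0.1, p_h2=1e-4)
print(f"in situ energy yield: {dG:.1f} kJ/mol CH4")
```

    The dominant lever is the fourth-power dependence on H2: a small drop in H2 partial pressure pushes the in situ yield sharply toward zero, which is why the BEQ sets a hard floor on viable H2 concentrations.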

  5. NRC review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program summary, Project No. 669

    International Nuclear Information System (INIS)

    1992-08-01

    The staff of the US Nuclear Regulatory Commission has prepared Volume 1 of a safety evaluation report (SER), ''NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document -- Program Summary,'' to document the results of its review of the Electric Power Research Institute's ''Advanced Light Water Reactor Utility Requirements Document.'' This SER provides a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review

  6. Economic analysis of open space box model utilization in spacecraft

    Science.gov (United States)

    Mohammad, Atif F.; Straub, Jeremy

    2015-05-01

    The amount of stored data about space grows larger every day, and the utilization of Big Data and related tools to perform ETL (Extract, Transform and Load) applications will soon be pervasive in the space sciences. We have entered a crucial time where using Big Data can be the difference (for terrestrial applications) between organizations underperforming and outperforming their peers. The same is true for NASA and other space agencies, as well as for individual missions and the highly competitive process of mission data analysis and publication. In most industries, established competitors and new entrants alike will use data-driven approaches to revolutionize and capture the value of Big Data archives. The Open Space Box Model is poised to take the proverbial "giant leap", as it provides autonomic data processing and communications for spacecraft. Economic value generated from such data processing can be found in terrestrial organizations in every sector, such as healthcare and retail. Retailers, for example, perform research on Big Data by utilizing sensor-driven data embedded in products within their stores and warehouses to determine how these products are actually used in the real world.

  7. Modeling a Packed Bed Reactor Utilizing the Sabatier Process

    Science.gov (United States)

    Shah, Malay G.; Meier, Anne J.; Hintze, Paul E.

    2017-01-01

    A numerical model is being developed using Python which characterizes the conversion and temperature profiles of a packed bed reactor (PBR) that utilizes the Sabatier process; the reaction produces methane and water from carbon dioxide and hydrogen. While the specific kinetics of the Sabatier reaction on the Ru/Al2O3 catalyst pellets are unknown, an empirical reaction rate equation [1] is used for the overall reaction. As this reaction is highly exothermic, proper thermal control is of the utmost importance to ensure maximum conversion and to avoid reactor runaway. It is therefore necessary to determine what wall temperature profile will ensure safe and efficient operation of the reactor. This wall temperature will be maintained by active thermal controls on the outer surface of the reactor. Two cylindrical PBRs are currently being tested experimentally and will be used for validation of the Python model. They are similar in design except one of them is larger and incorporates a preheat loop by feeding the reactant gas through a pipe along the center of the catalyst bed. The further complexity of adding a preheat pipe to the model to mimic the larger reactor is yet to be implemented and validated; preliminary validation is done using the smaller PBR with no reactant preheating. When mapping experimental values of the wall temperature from the smaller PBR into the Python model, a good approximation of the total conversion and temperature profile has been achieved. A separate CFD model incorporates more complex three-dimensional effects by including the solid catalyst pellets within the domain. The goal is to improve the Python model to the point where the results of other reactor geometry can be reasonably predicted relatively quickly when compared to the much more computationally expensive CFD approach. Once a reactor size is narrowed down using the Python approach, CFD will be used to generate a more thorough prediction of the reactor's performance.
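    The shape of such a 1-D packed-bed model can be sketched as a plug-flow integration along the bed. The rate law below is a made-up first-order Arrhenius expression, not the empirical Sabatier rate equation the abstract cites, and every parameter is illustrative:

```python
import math

def pbr_conversion_profile(T_wall, n_steps=100, L=0.5):
    """1-D plug-flow sketch for CO2 + 4 H2 -> CH4 + 2 H2O, isothermal
    at the wall temperature. First-order in CO2 with a made-up
    Arrhenius rate constant; returns conversion X along the bed."""
    k0, Ea = 1.0e4, 6.0e4   # hypothetical pre-exponential (1/s), J/mol
    R = 8.314               # J/(mol.K)
    u = 0.1                 # superficial velocity, m/s (hypothetical)
    k = k0 * math.exp(-Ea / (R * T_wall))
    dx = L / n_steps
    X = 0.0
    profile = [X]
    for _ in range(n_steps):
        # dX/dx = k (1 - X) / u for first-order consumption of CO2
        X += k * (1.0 - X) / u * dx
        profile.append(min(X, 1.0))
    return profile

profile = pbr_conversion_profile(T_wall=600.0)  # wall temperature in K
print(f"outlet conversion: {profile[-1]:.3f}")
```

    A real model of this reactor would replace the isothermal assumption with an energy balance, since the exotherm is exactly what drives the runaway risk discussed above.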

  8. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  9. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  10. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

    Often the methodologies for assessing proliferation risk are focused on the inherent vulnerability of nuclear energy systems and associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into non-traditional use of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessments, safeguards assessments and related studies typically create technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering how culture, groups, and/or individuals affect factors that bear on the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  11. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  12. A mangrove creek restoration plan utilizing hydraulic modeling.

    Science.gov (United States)

    Marois, Darryl E; Mitsch, William J

    2017-11-01

    Despite the valuable ecosystem services provided by mangrove ecosystems, they remain threatened around the globe. Urban development has been a primary cause of mangrove destruction and deterioration in south Florida, USA, for the last several decades. As a result, the restoration of mangrove forests has become an important topic of research. Using field sampling and remote sensing, we assessed the past and present hydrologic conditions of a mangrove creek and its connected mangrove forest and brackish marsh systems located on the coast of Naples Bay in southwest Florida. We concluded that the hydrology of these connected systems had been significantly altered from its natural state due to urban development. We propose here a mangrove creek restoration plan that would extend the existing creek channel 1.1 km inland through the adjacent mangrove forest and up to an adjacent brackish marsh. We then tested the hydrologic implications using a hydraulic model of the mangrove creek calibrated with tidal data from Naples Bay and water levels measured within the creek. The calibrated model was then used to simulate the resulting hydrology of our proposed restoration plan. Simulation results showed that the proposed creek extension would restore a twice-daily flooding regime to a majority of the adjacent mangrove forest and that there would still be minimal tidal influence on the brackish marsh area, keeping its salinity at an acceptable level. This study demonstrates the utility of combining field data and hydraulic modeling to aid in the design of mangrove restoration plans.
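    The core behaviour such creek models capture, tidal forcing attenuated by channel storage, can be illustrated with a toy linear-storage response of creek stage to a sinusoidal M2 bay tide. This is not the paper's calibrated model; every parameter here is invented for illustration:

```python
import math

def creek_water_levels(hours=48, dt=0.1, k=0.8):
    """Level-pool sketch: creek stage h relaxes toward the bay tide
    with rate constant k (1/h); the bay tide is a 12.42 h (M2)
    sinusoid. Returns (tide, stage) series. Parameters illustrative."""
    n = int(hours / dt)
    h = 0.0
    tide_series, stage_series = [], []
    for i in range(n):
        t = i * dt
        tide = 0.5 * math.sin(2 * math.pi * t / 12.42)  # m about MSL
        h += k * (tide - h) * dt                        # storage response
        tide_series.append(tide)
        stage_series.append(h)
    return tide_series, stage_series

tide, stage = creek_water_levels()
print(f"tidal range in bay: {max(tide) - min(tide):.2f} m, "
      f"in creek: {max(stage) - min(stage):.2f} m")
```

    The attenuated creek range is the quantity a restoration design cares about: extending the channel effectively raises k, bringing more of the forest into the twice-daily flooding regime.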

  13. Modeling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...
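    The sizing question behind a Storage Handler-style buffer can be illustrated with a toy occupancy model: data arrives in bursts during the machine's live fraction and is drained continuously by event filtering. Apart from the 5 TB/s input quoted in the abstract, the numbers below are illustrative guesses, not ATLAS design figures:

```python
def buffer_occupancy(input_rate, drain_rate, duty_cycle, seconds):
    """Toy occupancy model for a Storage Handler-style buffer.
    Data arrives at input_rate (TB/s) during the live fraction
    (duty_cycle, modelled as a coarse 10 s on/off pattern) and is
    drained continuously at drain_rate. Returns peak occupancy (TB)."""
    occupancy = peak = 0.0
    for s in range(seconds):
        live = (s % 10) < 10 * duty_cycle
        occupancy += (input_rate if live else 0.0) - drain_rate
        occupancy = max(occupancy, 0.0)   # buffer cannot go negative
        peak = max(peak, occupancy)
    return peak

# Hypothetical figures: 5 TB/s in, 60% live time, 3.5 TB/s drained out.
peak_tb = buffer_occupancy(5.0, 3.5, 0.6, seconds=3600)
print(f"peak buffer occupancy: {peak_tb:.0f} TB")
```

    The interesting regime is when the drain rate exceeds the duty-cycle-averaged input rate: the buffer then empties between bursts and its peak occupancy, not throughput, sets the required capacity.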

  14. Modelling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...

  15. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  16. Proline utilization system is required for infection by the pathogenic α-proteobacterium Brucella abortus.

    Science.gov (United States)

    Caudill, Mitchell T; Budnick, James A; Sheehan, Lauren M; Lehman, Christian R; Purwantini, Endang; Mukhopadhyay, Biswarup; Caswell, Clayton C

    2017-07-01

    Proline utilization (Put) systems have been described in a number of bacteria; however, the importance and functionality of the Put system in the intracellular pathogen Brucella abortus has not been explored. Generally, bacterial Put systems are composed of the bifunctional enzyme proline dehydrogenase PutA and its transcriptional activator PutR. Here, we demonstrate that the genes putA (bab2_0518) and putR (bab2_0517) are critical for the chronic infection of mice by B. abortus, but putA and putR are not required for the survival and replication of the bacteria in naive macrophages. Additionally, in vitro experiments revealed that putR is necessary for the ability of the bacteria to withstand oxidative stress, as a ΔputR deletion strain is hypersensitive to hydrogen peroxide exposure. Quantitative reverse transcription-PCR and putA-lacZ transcriptional reporter studies revealed that PutR acts as a transcriptional activator of putA in Brucella, and electrophoretic mobility shift assays confirmed that PutR binds directly to the putA promoter region. Biochemical analyses demonstrated that a purified recombinant B. abortus PutA protein possesses quintessential proline dehydrogenase activity, as PutA is capable of catalysing the conversion of proline to glutamate. Altogether, these data are the first to reveal that the Put system plays a significant role in the ability of B. abortus to replicate and survive within its host, as well as to describe the genetic regulation and biochemical activity of the Put system in Brucella.

  17. Understanding Emerging Impacts and Requirements Related to Utility-Scale Solar Development

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Heidi M. [Argonne National Lab. (ANL), Argonne, IL (United States); Grippo, Mark A. [Argonne National Lab. (ANL), Argonne, IL (United States); Heath, Garvin A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Smith, Karen P. [Argonne National Lab. (ANL), Argonne, IL (United States); Sullivan, Robert G. [Argonne National Lab. (ANL), Argonne, IL (United States); Walston, Leroy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wescott, Konstance L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-01

    Utility-scale solar energy plays an important role in the nation’s strategy to address climate change threats through increased deployment of renewable energy technologies, and both the federal government and individual states have established specific goals for increased solar energy development. In order to achieve these goals, much attention is paid to making utility-scale solar energy cost-competitive with other conventional energy sources, while concurrently conducting solar development in an environmentally sound manner.

  18. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool for constructing a visual model of the software functional requirements specification during the software analysis phase, and the corresponding formal specification is generated systematically, without requiring experience in writing formal specifications. A number of algorithms are presented to perform these for...

  19. Building traceable Event-B models from requirements

    OpenAIRE

    Alkhammash, Eman; Butler, Michael; Fathabadi, Asieh Salehi; Cîrstea, Corina

    2015-01-01

    Bridging the gap between informal requirements and formal specifications is a key challenge in systems engineering. Constructing appropriate abstractions in formal models requires skill, and managing the complexity of the relationships between requirements and formal models can be difficult. In this paper we present an approach that aims to address the twin challenges of finding appropriate abstractions and managing traceability between requirements and models. Our approach is based o...

  20. A public utility model for managing public land recreation enterprises.

    Science.gov (United States)

    Tom. Quinn

    2002-01-01

    Through review of relevant economic principles and judicial precedent, a case is made that public-land recreation enterprises are analogous to traditionally recognized public utilities. Given the historical concern over the societal value of recreation and associated pricing issues, public-land management policies failing to acknowledge these utility-like...

  1. mathematical models for estimating radio channels utilization when ...

    African Journals Online (AJOL)

    Programmer at the Telecommunications Office of the IT Department, Belgorod State University. Published online: 08 August 2017. ABSTRACT. The functioning efficiency of a wireless self-organized network is considered from the point of view of its radio channel utilization. In order to increase the radio channel utilization it is...

  2. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit dealing efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we...... describe an incremental approach to requirements validation and systems modelling. Formal modelling facilitates a high degree of automation: it serves for validation and traceability. The foundation for our approach is requirements that are structured according to the WRSPM reference model. We provide...... a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements...

  3. Development of U.S. Government General Technical Requirements for UAS Flight Safety Systems Utilizing the Iridium Satellite Constellation

    Science.gov (United States)

    Murray, Jennifer; Birr, Richard

    2010-01-01

    This slide presentation reviews the development of technical requirements for Unmanned Aircraft Systems (UAS) utilization of the Iridium Satellite Constellation to provide flight safety. The Federal Aviation Administration (FAA) required an over-the-horizon communication standard to guarantee flight safety before permitting widespread UAS flights in the National Airspace System (NAS). This is important to ensure reliable control of UASs during loss-link and over-the-horizon scenarios. The core requirement was to utilize a satellite system to send GPS tracking data and other telemetry from a flight vehicle down to the ground. Iridium was chosen because it is one of the only true satellite systems with worldwide coverage, and the service has a highly reliable link margin. The Iridium system, the flight modems, and the test flight are described.

  4. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...

  5. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  6. Functions and requirements for the INEL light duty utility arm sampler end effector

    International Nuclear Information System (INIS)

    Pace, D.P.; Barnes, G.E.

    1995-02-01

    This sampler end effector system functions and requirements document defines the system functions that the end effector must perform as well as the requirements the design must meet. Safety, quality assurance, operations, environmental conditions, and regulatory requirements have been considered. The main purpose of this document is to provide a basis for the end effector engineering, design, and fabrication activities. The document shall be the living reference document to initiate the development activities and will be updated as system technologies are finalized

  7. Functions and requirements for the INEL light duty utility arm gripper end effector

    International Nuclear Information System (INIS)

    Pace, D.P.; Barnes, G.E.

    1995-02-01

    This gripper end effector system functions and requirements document defines the system functions that the end effector must perform as well as the requirements the design must meet. Safety, quality assurance, operations, environmental conditions, and regulatory requirements have been considered. The main purpose of this document is to provide a basis for the end effector engineering, design, and fabrication activities. The document shall be the living reference document to initiate the development activities and will be updated as system technologies are finalized

  8. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language that have been prototypically implemented in the Dymola and OpenModelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  9. The Joint Venture Model of Knowledge Utilization: a guide for change in nursing.

    Science.gov (United States)

    Edgar, Linda; Herbert, Rosemary; Lambert, Sylvie; MacDonald, Jo-Ann; Dubois, Sylvie; Latimer, Margot

    2006-05-01

    Knowledge utilization (KU) is an essential component of today's nursing practice and healthcare system. Despite advances in knowledge generation, the gap in knowledge transfer from research to practice continues. KU models have moved beyond factors affecting the individual nurse to a broader perspective that includes the practice environment and the socio-political context. This paper proposes one such theoretical model, the Joint Venture Model of Knowledge Utilization (JVMKU). Key components of the JVMKU that emerged from an extensive multidisciplinary review of the literature include leadership, emotional intelligence, person, message, empowered workplace and the socio-political environment. The model has a broad and practical application and is not specific to one type of KU or one population. This paper provides a description of the JVMKU, its development and suggested uses at both local and organizational levels. Nurses in both leadership and point-of-care positions will recognize the concepts identified and will be able to apply this model for KU in their own workplace for assessment of areas requiring strengthening and support.

  10. Correlation between IQIs- criteria of utilization from the level of required sensitivity

    International Nuclear Information System (INIS)

    Hallai Junior, C.; Rossi Junior, O.

    1983-01-01

    The correct selection and utilization of image quality indicators (IQIs) is indispensable to assure detection and allow the dimensioning of flaws in radiographic testing. Their use by industry often leads to an inadequate application of IQIs, distorting the specifications adopted. From these observations the authors present the essential parameters of IQIs so as to facilitate, without loss of quality or distortion of specifications, the correlation among the several types. (E.G.) [pt]

  11. Functions and requirements for the light duty utility arm integrated system

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1996-01-01

    The Light Duty Utility Arm (LDUA) Integrated System is a mobile robotic system designed to remotely deploy and operate a variety of tools in uninhabitable underground radiological and hazardous waste storage tanks. The system primarily provides a means to inspect, survey, monitor, map and/or obtain specific waste and waste tank data in support of the Tank Waste Remediation System (TWRS) mission at Hanford and remediation programs at other U.S. Department of Energy (DOE) sites

  12. Functions and requirements for the Light-Duty Utility Arm Integrated System. Revision 1

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1996-01-01

    The Light Duty Utility Arm (LDUA) Integrated System is a mobile robotic system designed to remotely deploy and operate a variety of tools in uninhabitable underground radiological and hazardous waste storage tanks. The system primarily provides a means to inspect, survey, monitor, map and/or obtain specific waste and waste tank data in support of the Tank Waste Remediation System (TWRS) mission at Hanford and remediation programs at other U.S. Department of Energy (DOE) sites

  13. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  14. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  15. Process and utility water requirements for cellulosic ethanol production processes via fermentation pathway

    Science.gov (United States)

    The increasing need of additional water resources for energy production is a growing concern for future economic development. In technology development for ethanol production from cellulosic feedstocks, a detailed assessment of the quantity and quality of water required, and the ...

  16. User Utility Oriented Queuing Model for Resource Allocation in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2015-01-01

    Full Text Available Resource allocation is one of the most important research topics in servers. In the cloud environment, there are massive hardware resources of different kinds, and many kinds of services are usually run on virtual machines of the cloud server. In addition, cloud environment is commercialized, and economical factor should also be considered. In order to deal with commercialization and virtualization of cloud environment, we proposed a user utility oriented queuing model for task scheduling. Firstly, we modeled task scheduling in cloud environment as an M/M/1 queuing system. Secondly, we classified the utility into time utility and cost utility and built a linear programming model to maximize total utility for both of them. Finally, we proposed a utility oriented algorithm to maximize the total utility. Massive experiments validate the effectiveness of our proposed model.
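The abstract above outlines an M/M/1 queue whose mean response time feeds a utility function combining time and cost. A minimal sketch of that idea (not the paper's actual formulation; the utility weights, candidate rates, and price are hypothetical):

```python
# M/M/1 mean time in system is T = 1/(mu - lam) for service rate mu > arrival
# rate lam. Time utility falls with T; cost utility falls with the price of the
# provisioned service rate. We pick the rate maximizing their weighted sum.

def mean_response_time(lam: float, mu: float) -> float:
    """M/M/1 mean time in system; requires mu > lam for stability."""
    assert mu > lam, "queue is unstable"
    return 1.0 / (mu - lam)

def total_utility(lam: float, mu: float, price_per_rate: float,
                  w_time: float = 1.0, w_cost: float = 1.0) -> float:
    """Illustrative linear combination of time utility and cost utility."""
    t = mean_response_time(lam, mu)
    return -w_time * t - w_cost * price_per_rate * mu

def best_service_rate(lam: float, candidates, price_per_rate: float) -> float:
    """Pick the stable candidate service rate with the highest total utility."""
    feasible = [mu for mu in candidates if mu > lam]
    return max(feasible, key=lambda mu: total_utility(lam, mu, price_per_rate))

# Example: arrivals at 4 tasks/s, candidate VM service rates, unit price 0.1.
rate = best_service_rate(4.0, [4.5, 5.0, 6.0, 8.0, 12.0], 0.1)
```

Slow rates are penalized by queueing delay and fast rates by cost, so an interior candidate wins.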

  17. Modeling Domain Variability in Requirements Engineering with Contexts

    Science.gov (United States)

    Lapouchnian, Alexei; Mylopoulos, John

    Various characteristics of the problem domain define the context in which the system is to operate and thus impact heavily on its requirements. However, most requirements specifications do not consider contextual properties and few modeling notations explicitly specify how domain variability affects the requirements. In this paper, we propose an approach for using contexts to model domain variability in goal models. We discuss the modeling of contexts, the specification of their effects on system goals, and the analysis of goal models with contextual variability. The approach is illustrated with a case study.

  18. Job Satisfaction and Personality: The Utility of the Five-Factor Model of Personality

    Science.gov (United States)

    1999-03-01

    Social and Enterprising interests. Costa and McCrae (1998) utilized Wiggins's (1979) circumplex model as a basis for developing their Style of... Dissertation: "Job Satisfaction and Personality: The Utility of the Five-Factor Model of Personality," by Gregg F. Tanoff; Distribution Statement A, approved for...

  19. Improving Patient Flow Utilizing a Collaborative Learning Model.

    Science.gov (United States)

    Tibor, Laura C; Schultz, Stacy R; Cravath, Julie L; Rein, Russell R; Krecke, Karl N

    2016-01-01

    This initiative utilized a collaborative learning approach to increase knowledge and experience in process improvement and systems thinking while targeting improved patient flow in seven radiology modalities. Teams showed improvements in their project metrics and collectively streamlined the flow for 530 patients per day by improving patient lead time, wait time, and first case on-time start rates. In a post-project survey of 50 project team members, 82% stated they had more effective solutions as a result of the process improvement methodology, 84% stated they will be able to utilize the process improvement tools again in the future, and 98% would recommend participating in another project to a colleague.

  20. Flavobacterium johnsoniae Chitinase ChiA Is Required for Chitin Utilization and Is Secreted by the Type IX Secretion System

    OpenAIRE

    Kharade, Sampada S.; McBride, Mark J.

    2014-01-01

    Flavobacterium johnsoniae, a member of the phylum Bacteroidetes, is a gliding bacterium that digests insoluble chitin and many other polysaccharides. A novel protein secretion system, the type IX secretion system (T9SS), is required for gliding motility and for chitin utilization. Five potential chitinases were identified by genome analysis. Fjoh_4555 (ChiA), a 168.9-kDa protein with two glycoside hydrolase family 18 (GH18) domains, was targeted for analysis. Disruption of chiA by insertional mut...

  1. Expected Utility and Catastrophic Risk in a Stochastic Economy-Climate Model

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2010-01-01

    In the context of extreme climate change, we ask how to conduct expected utility analysis in the presence of catastrophic risks. Economists typically model decision making under risk and uncertainty by expected utility with constant relative risk aversion (power utility); statisticians typically
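The constant relative risk aversion ("power utility") specification mentioned above is standard: u(c) = c^(1-gamma)/(1-gamma), with log utility at gamma = 1. A small sketch with a Monte Carlo expected-utility estimate under a hypothetical lognormal consumption distribution (the distribution and gamma values are illustrative, not from the paper):

```python
import math
import random

def crra_utility(c: float, gamma: float) -> float:
    """CRRA (power) utility; log utility in the limiting case gamma = 1."""
    if gamma == 1.0:
        return math.log(c)
    return c ** (1.0 - gamma) / (1.0 - gamma)

def expected_utility(draws, gamma: float) -> float:
    """Monte Carlo estimate of E[u(c)] over sampled consumption draws."""
    return sum(crra_utility(c, gamma) for c in draws) / len(draws)

random.seed(0)
# Hypothetical lognormal consumption draws; low-consumption tail events are
# what make expected-utility analysis delicate under catastrophic risk.
draws = [random.lognormvariate(0.0, 0.5) for _ in range(10_000)]
eu = expected_utility(draws, gamma=2.0)
```

For gamma >= 1, u(c) → -∞ as c → 0, which is exactly why heavy lower tails (catastrophes) can make expected utility unbounded below.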

  2. A laboratory-calibrated model of coho salmon growth with utility for ecological analyses

    Science.gov (United States)

    Manhard, Christopher V.; Som, Nicholas A.; Perry, Russell W.; Plumb, John M.

    2018-01-01

    We conducted a meta-analysis of laboratory- and hatchery-based growth data to estimate broadly applicable parameters of mass- and temperature-dependent growth of juvenile coho salmon (Oncorhynchus kisutch). Following studies of other salmonid species, we incorporated the Ratkowsky growth model into an allometric model and fit this model to growth observations from eight studies spanning ten different populations. To account for changes in growth patterns with food availability, we reparameterized the Ratkowsky model to scale several of its parameters relative to ration. The resulting model was robust across a wide range of ration allocations and experimental conditions, accounting for 99% of the variation in final body mass. We fit this model to growth data from coho salmon inhabiting tributaries and constructed ponds in the Klamath Basin by estimating habitat-specific indices of food availability. The model produced evidence that constructed ponds provided higher food availability than natural tributaries. Because of their simplicity (only mass and temperature are required as inputs) and robustness, ration-varying Ratkowsky models have utility as an ecological tool for capturing growth in freshwater fish populations.
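The Ratkowsky temperature response embedded in an allometric mass term, as described above, can be sketched as follows (all parameter values are hypothetical placeholders, not the paper's fitted estimates):

```python
# Ratkowsky-type temperature response: sqrt(rate) = b*(T - T_min)*(1 - exp(c*(T - T_max)))
# for T_min < T < T_max, combined with allometric mass growth dM = rate * M**theta.
import math

def ratkowsky_rate(temp_c: float, b: float = 0.02, c: float = 0.15,
                   t_min: float = 1.0, t_max: float = 25.0) -> float:
    """Specific growth rate; zero outside the viable temperature window."""
    if not (t_min < temp_c < t_max):
        return 0.0
    root = b * (temp_c - t_min) * (1.0 - math.exp(c * (temp_c - t_max)))
    return max(root, 0.0) ** 2

def grow(mass_g: float, temps_c, theta: float = 0.31) -> float:
    """Integrate daily mass growth over a daily temperature time series."""
    for t in temps_c:
        mass_g += ratkowsky_rate(t) * mass_g ** theta
    return mass_g

final = grow(1.0, [12.0] * 30)  # 30 days at 12 degrees C
```

Scaling parameters by ration, as the study does, would make b (and hence the rate) a function of food availability.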

  3. The utility of Earth system Models of Intermediate Complexity

    NARCIS (Netherlands)

    Weber, S.L.

    2010-01-01

    Intermediate-complexity models are models which describe the dynamics of the atmosphere and/or ocean in less detail than conventional General Circulation Models (GCMs). At the same time, they go beyond the approach taken by atmospheric Energy Balance Models (EBMs) or ocean box models by

  4. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies

  5. An Additive-Utility Model of Delay Discounting

    Science.gov (United States)

    Killeen, Peter R.

    2009-01-01

    Goods remote in temporal, spatial, or social distance, or in likelihood, exert less control over our behavior than those more proximate. The decay of influence with distance, of perennial interest to behavioral economists, has had a renaissance in the study of delay discounting. By developing discount functions from marginal utilities, this…
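The truncated abstract does not give Killeen's specific additive-utility functions, so as a hedged illustration here is the hyperbolic discount function v = A/(1 + kD), the common baseline form in the delay-discounting literature the abstract situates itself in:

```python
# Hyperbolic discounting: subjective value decays with delay D as A/(1 + k*D).
# The discount rate k is a hypothetical per-individual parameter.

def hyperbolic_value(amount: float, delay: float, k: float = 0.05) -> float:
    """Subjective value of `amount` received after `delay` time units."""
    return amount / (1.0 + k * delay)

def prefers_smaller_sooner(small: float, d_small: float,
                           large: float, d_large: float,
                           k: float = 0.05) -> bool:
    """Does a discounter with rate k prefer the smaller-sooner reward?"""
    return hyperbolic_value(small, d_small, k) > hyperbolic_value(large, d_large, k)

# With k = 0.05, $100 in 30 days is discounted to $40, so $50 now is preferred.
choice = prefers_smaller_sooner(50.0, 0.0, 100.0, 30.0, k=0.05)
```

Deriving the discount function from marginal utilities, as the paper does, replaces the fixed hyperbolic form with one induced by the utility of the amount and of the delay.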

  6. Mathematical model of a utility firm. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

    The project was aimed at developing an understanding of the economic and behavioral processes that take place within a utility firm, and without it. This executive summary, one of five documents, gives the project goals and objectives, outlines the subject areas of investigation, discusses the findings and results, and finally considers applications within the electric power industry and future research directions. (DLC)

  7. Knowledge Management Models And Their Utility To The Effective ...

    African Journals Online (AJOL)

    Although indigenous knowledge is key to the development of sub Saharan Africa and the preservation of its societal memory, it is fast disappearing due to a variety of reasons. One of the strategies that may assist in the management and preservation of indigenous knowledge is the utilization of knowledge management ...

  8. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  9. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    Energy Technology Data Exchange (ETDEWEB)

    Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cappers, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fadrhonc, Emily Martin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    In this report, we will present a descriptive and organizational framework for incremental and fundamental changes to regulatory and utility business models in the context of clean energy public policy goals. We will also discuss the regulated utility's role in providing value-added services that relate to distributed energy resources, identify the "openness" of customer information and utility networks necessary to facilitate change, and discuss the relative risks, and the shifting of risks, for utilities and customers.

  10. Utilization of respiratory energy in higher plants : requirements for 'maintenance' and transport processes

    NARCIS (Netherlands)

    Bouma, T.J.

    1995-01-01

    Quantitative knowledge of both photosynthesis and respiration is required to understand plant growth and the resulting crop yield. However, the nature of the energy-demanding processes that depend on dark respiration in full-grown tissues, in particular, is largely unknown. The main objective

  11. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    Science.gov (United States)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding complex 3D geological models to be considered, alongside available monitoring and experimental data, in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for an integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions can be integrated in the coupled simulations, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.

  12. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  13. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the use of tools with no clear meta-model and semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. Finally, the profile is used to create an example model of an embedded system requirements specification.
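The core idea above, requirements as first-class model elements linked to design artifacts so that traceability reports can be generated automatically, can be sketched minimally (this is an illustrative data structure, not the paper's actual meta-model or UML profile):

```python
# Requirements carry explicit links to the design elements that satisfy them;
# a report generator then derives coverage from the model rather than by hand.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    satisfied_by: list = field(default_factory=list)  # names of design elements

def coverage_report(requirements):
    """List the IDs of requirements with no satisfying design element."""
    return [r.req_id for r in requirements if not r.satisfied_by]

reqs = [
    Requirement("R1", "The system shall log all sensor faults", ["FaultLogger"]),
    Requirement("R2", "Startup shall complete within 2 s"),
]
untraced = coverage_report(reqs)
```

Here `untraced` comes back as `["R2"]`, flagging the requirement no design element yet satisfies.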

  14. An Analysis/Synthesis System of Audio Signal with Utilization of an SN Model

    OpenAIRE

    Turi Nagy, M.; Rozinaj, G.

    2004-01-01

    An SN (sinusoids plus noise) model is a spectral model, in which the periodic components of the sound are represented by sinusoids with time-varying frequencies, amplitudes and phases. The remaining non-periodic components are represented by a filtered noise. The sinusoidal model utilizes physical properties of musical instruments and the noise model utilizes the human inability to perceive the exact spectral shape or the phase of stochastic signals. SN modeling can be applied in a compressio...

  15. Modeling energy flexibility of low energy buildings utilizing thermal mass

    DEFF Research Database (Denmark)

    Foteinaki, Kyriaki; Heller, Alfred; Rode, Carsten

    2016-01-01

    In the future energy system a considerable increase in the penetration of renewable energy is expected, challenging the stability of the system, as both production and consumption will have fluctuating patterns. Hence, the concept of energy flexibility will be necessary in order for the consumption to match the production patterns, shifting demand from on-peak hours to off-peak hours. Buildings could act as flexibility suppliers to the energy system, through their load shifting potential, provided that the large thermal mass of the building stock could be utilized for energy storage. In the present study the load shifting potential of an apartment of a low energy building in Copenhagen is assessed, utilizing the heat storage capacity of the thermal mass when the heating system is switched off to relieve the energy system. It is shown that when using a 4-hour preheating period before switching off...

  16. SEMPATH Ontology: modeling multidisciplinary treatment schemes utilizing semantics.

    Science.gov (United States)

    Alexandrou, Dimitrios Al; Pardalis, Konstantinos V; Bouras, Thanassis D; Karakitsos, Petros; Mentzas, Gregoris N

    2012-03-01

    A dramatic increase in demand for treatment quality has occurred during recent decades. The main challenge to be confronted, so as to increase treatment quality, is the personalization of treatment, since each patient constitutes a unique case. Healthcare provision constitutes a complex environment, since healthcare provision organizations are highly multidisciplinary. In this paper, we present the conceptualization of the domain of clinical pathways (CP). The SEMPATH (SEMantic PATHways) Ontology comprises three main parts: 1) the CP part; 2) the business and finance part; and 3) the quality assurance part. Our implementation achieves the conceptualization of the multidisciplinary domain of healthcare provision, to be further utilized for the implementation of a Semantic Web Rule Language (SWRL) rules repository. Finally, the SEMPATH Ontology is utilized for the definition of a set of SWRL rules for the human papillomavirus (HPV) disease and its treatment scheme. © 2012 IEEE

  17. Exploration Requirements Development Utilizing the Strategy-to-Task-to-Technology Development Approach

    Science.gov (United States)

    Drake, Bret G.; Josten, B. Kent; Monell, Donald W.

    2004-01-01

    The Vision for Space Exploration provides direction for the National Aeronautics and Space Administration to embark on a robust space exploration program that will advance the Nation's scientific, security, and economic interests. This plan calls for a progressive expansion of human capabilities beyond low earth orbit, seeking to answer profound scientific and philosophical questions while responding to discoveries along the way. In addition, the Vision articulates the strategy for developing the revolutionary new technologies and capabilities required for the future exploration of the solar system. The National Aeronautics and Space Administration faces new challenges in successfully implementing the Vision. In order to implement a sustained and affordable exploration endeavor it is vital for NASA to do business differently. This paper provides an overview of the strategy-to-task-to-technology process being used by NASA's Exploration Systems Mission Directorate to develop the requirements and system acquisition details necessary for implementing a sustainable exploration vision.

  18. Utility-Optimal Dynamic Rate Allocation under Average End-to-End Delay Requirements

    OpenAIRE

    Hajiesmaili, Mohammad H.; Talebi, Mohammad Sadegh; Khonsari, Ahmad

    2015-01-01

    QoS-aware networking applications such as real-time streaming and video surveillance systems require a nearly fixed average end-to-end delay over long periods to communicate efficiently, although they may tolerate some delay variations in short periods. This variability exhibits complex dynamics that makes rate control of such applications a formidable task. This paper addresses rate allocation for heterogeneous QoS-aware applications that preserves the long-term end-to-end delay constraint while, s...

  19. Renewable energy burden sharing. REBUS. Requirements and expectations of utilities and consumer organisations in the European energy sector

    International Nuclear Information System (INIS)

    Voogt, M.H.; Uyterlinde, M.A.; Skytte, K.; Leonardi, M.; Whiteley, M.H.

    2001-05-01

    Creation of an internal market for renewable electricity will involve a political negotiation process, similar to previous EU greenhouse gas negotiations. The Energy Ministers in the EU have agreed on an overall target of 21.7% of electricity supply from Renewable Energy Sources (RES-E) and a distribution of targets over the individual Member States. The REBUS project aimed at providing insights into the effects of implementing targets for renewable electricity generation at EU Member State level and the impact of introducing burden sharing systems within the EU, such as a Tradable Green Certificate (TGC) system. Member States can participate in such burden sharing systems to reduce the costs of achieving targets for electricity from renewable sources (RES-E), compared to strictly national implementation measures. The project concentrated on the development of the REBUS model, which quantifies the impact of trade (in green certificates, quotas or targets) and the implementation of different rules for setting targets at individual Member State level. In addition, the project has paid special attention to the participation of stakeholders such as utilities, traders, and consumers of electricity. What is their opinion on the target setting, on the design of a trading system and its practical implementation and monitoring aspects? Utilities and consumer organisations in Denmark, Italy, The Netherlands and the United Kingdom have been asked to comment on these issues. This report is the result of a series of interviews with these stakeholders on their reactions to different burden sharing proposals, and on the socio-economic and financial impacts they foresee. The utilities take a critical view of their position in the renewable energy market and a possible future international trading scheme. The main conclusions from the interviews are: Generally, target setting and burden sharing are regarded as political questions, on which governments should decide; Stakeholders emphasise

  20. Requirements Traceability and Transformation Conformance in Model-Driven Development

    NARCIS (Netherlands)

    Andrade Almeida, João; van Eck, Pascal; Iacob, Maria Eugenia

    2006-01-01

    The variety of design artefacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship. This framework is a basis for tracing

  1. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  2. Modeling Interprovincial Cooperative Energy Saving in China: An Electricity Utilization Perspective

    Directory of Open Access Journals (Sweden)

    Lijun Zeng

    2018-01-01

    As the world faces great challenges from climate change and environmental pollution, China urgently requires energy saving, emission reduction, and carbon reduction programmes. However, the non-cooperative energy saving model (NCESM), the simple regulation mode that is China’s main model for energy saving, is not beneficial for optimization of energy and resource distribution, and cannot effectively motivate energy saving at the provincial level. Therefore, we propose an interprovincial cooperative energy saving model (CESM) from the perspective of electricity utilization, with the objective of maximizing the benefits from electricity utilization of the cooperation union while achieving the energy saving goals of the union as a whole. The CESM consists of two parts: (1) an optimization model that calculates the optimal quantities of electricity consumption for each participating province to meet the joint energy saving goal; and (2) a model that distributes the economic benefits of the cooperation among the participating provinces based on the Shapley value method. We applied the CESM to the case of an interprovincial union of Shanghai, Sichuan, Shanxi, and Gansu in China. The results, based on data from 2001–2014, show that cooperation can significantly increase the benefits of electricity utilization for each province in the union. The total benefits of the union from utilization of electricity increased 38.38%, or 353.98 billion CNY, while the benefits to Shanghai, Sichuan, Shanxi, and Gansu were 200.28, 58.37, 57.11, and 38.22 billion CNY respectively greater under the CESM than the NCESM. The implementation of the CESM provides the provincial governments not only a flexible and incentive-compatible way to achieve short-term goals, but also a feasible and effective path to realize long-term energy saving strategies. To test the impact of different parameter values on the results of the CESM, a sensitivity analysis was conducted. Some policy
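    The Shapley-value step of the CESM can be sketched as follows. This is a minimal illustration: the province names come from the abstract, but the characteristic function (standalone benefits and the cooperation synergy term) is invented for the sketch, not the paper's data.

```python
from itertools import permutations

# Hypothetical characteristic function: benefit (billion CNY) a coalition of
# provinces earns from electricity utilization. Numbers are illustrative only.
provinces = ("Shanghai", "Sichuan", "Shanxi", "Gansu")
STANDALONE = {"Shanghai": 150.0, "Sichuan": 40.0, "Shanxi": 40.0, "Gansu": 25.0}

def v(coalition):
    solo = sum(STANDALONE[p] for p in coalition)
    synergy = 30.0 * max(0, len(coalition) - 1)  # assumed cooperation bonus
    return solo + synergy

def shapley_values(players, v):
    """Each player's average marginal contribution over all join orders."""
    contrib = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            before = v(coalition)
            coalition.append(p)
            contrib[p] += v(coalition) - before
    return {p: c / len(orders) for p, c in contrib.items()}

phi = shapley_values(provinces, v)
# Efficiency property: the shares exactly exhaust the grand-coalition benefit.
```

    The efficiency property is what makes the Shapley split usable as a benefit-distribution rule: nothing is left over and nothing is overdrawn.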

  3. Use of Th and U in CANDU-6 and ACR-700 on the once-through cycle: Burnup analyses, natural U requirement/saving and nuclear resource utilization

    Science.gov (United States)

    Türkmen, Mehmet; Zabunoğlu, Okan H.

    2012-10-01

    Use of U and U-Th fuels in CANDU type of reactors (CANDU-6 and ACR-700) on the once-through nuclear fuel cycle is investigated. Based on the unit-cell approximation with the homogeneous-bundle/core model, utilizing the MONTEBURNS code, burnup computations are performed; discharge burnups are determined and expressed as functions of 235U and Th fractions, when applicable. Natural Uranium Requirement (and Saving) and Nuclear Resource Utilization are calculated for varying fuel compositions. Results are analyzed to observe the effects of 235U and Th fractions, thus to reach conclusions about use of Th in CANDU-6 and ACR-700 on the once-through cycle.

  4. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  5. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  6. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  7. Hybriding CMMI and requirement engineering maturity and capability models

    OpenAIRE

    Buglione, Luigi; Hauck, Jean Carlo R.; Gresse von Wangenheim, Christiane; Mc Caffery, Fergal

    2012-01-01

    Estimation represents one of the most critical processes for any project, and it is highly dependent on the quality of requirements elicitation and management. Therefore, the management of requirements should be prioritised in any process improvement program, because the less precise the requirements gathering, analysis and sizing, the greater the error in terms of time and cost estimation. Maturity and Capability Models (MCM) represent a good tool for assessing the status of ...

  8. Expected Utility and Sequential Elimination Models of Career Decision Making.

    Science.gov (United States)

    Shaffer, Michal; Lichtenberg, James W.

    Decision-making strategies have traditionally been classified as either prescriptive/normative or descriptive/behavioral in nature. Proponents of prescriptive/normative decision-making models attempt to develop procedures for making optimal decisions while proponents of the descriptive/behavioral models look for a choice that meets a minimal set…

  9. On the Utility of Island Models in Dynamic Optimization

    DEFF Research Database (Denmark)

    Lissovoi, Andrei; Witt, Carsten

    2015-01-01

    A simple island model with λ islands and migration occurring after every τ iterations is studied on the dynamic fitness function Maze. This model is equivalent to a (1+λ) EA if τ=1, i.e., migration occurs during every iteration. It is proved that even for an increased offspring population size up...
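    A minimal sketch of such an island model: λ islands each run a (1+1) EA with standard bit mutation, and the current best individual migrates to every island after each τ iterations (with τ=1 this behaves like a (1+λ) EA, as the abstract notes). OneMax is used here as a stand-in fitness function; the paper studies the dynamic Maze function, which is not reproduced here.

```python
import random

def onemax(x):
    """Toy static fitness; the paper's analysis uses the dynamic Maze function."""
    return sum(x)

def island_model(n=20, lam=4, tau=5, generations=200, seed=1):
    """lam islands, each a (1+1) EA; best-individual migration every tau iterations."""
    rng = random.Random(seed)
    islands = [[rng.randint(0, 1) for _ in range(n)] for _ in range(lam)]
    for gen in range(1, generations + 1):
        for i, x in enumerate(islands):
            # Standard bit mutation: flip each bit independently with prob. 1/n.
            y = [bit ^ (rng.random() < 1.0 / n) for bit in x]
            if onemax(y) >= onemax(x):
                islands[i] = y
        if gen % tau == 0:  # migration step
            best = max(islands, key=onemax)
            islands = [best[:] for _ in range(lam)]
    return max(onemax(x) for x in islands)
```

    On a static function this migration scheme destroys diversity; the point of the paper is precisely how the τ/λ trade-off plays out on a dynamic fitness landscape.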

  10. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    Generic Skills is a basic requirement that engineers need to master in all areas of Engineering. This study was conducted throughout the peninsular Malaysia involving small, medium and heavy industries using the KSA Model. The objectives of this study are studying the level of requirement of Generic Skills that need to be ...

  11. Electric power bidding model for practical utility system

    Directory of Open Access Journals (Sweden)

    M. Prabavathi

    2018-03-01

    A competitive open market environment has been created due to the restructuring of the electricity market. In the new competitive market, a centrally operated pool with a power exchange has mostly been introduced to match the offers of competing suppliers with the bids of customers. In such an open access environment, the formulation of a bidding strategy is one of the most challenging and important tasks for electricity participants seeking to maximize their profit. To build bidding strategies for power suppliers and consumers in the restructured electricity market, a new mathematical framework is proposed in this paper. It is assumed that each participant submits several blocks of real power quantities along with their bidding prices. The effectiveness of the proposed method is tested on the Indian Utility 62-bus system and the IEEE 118-bus system. Keywords: Bidding strategy, Day ahead electricity market, Market clearing price, Market clearing volume, Block bid, Intermediate value theorem
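    The block-bid matching idea can be illustrated with a toy clearing routine: sort supply blocks by ascending price, demand blocks by descending price, and match while willingness-to-pay covers the offer. The bid figures and the midpoint price rule below are hypothetical; the paper's actual method locates the market clearing price via the intermediate value theorem on the aggregated curves.

```python
# Hypothetical block bids as (price in $/MWh, quantity in MW).
supply = [(20.0, 100), (25.0, 80), (30.0, 120)]
demand = [(35.0, 90), (28.0, 110), (22.0, 60)]

def clearing(supply, demand):
    """Match block bids; return (last matched midpoint price, cleared volume)."""
    s = sorted(supply)                  # cheapest offers first
    d = sorted(demand, reverse=True)    # highest willingness-to-pay first
    volume, price = 0.0, None
    while s and d and d[0][0] >= s[0][0]:
        q = min(s[0][1], d[0][1])
        volume += q
        price = (s[0][0] + d[0][0]) / 2  # simple midpoint rule (illustrative)
        s[0] = (s[0][0], s[0][1] - q)
        d[0] = (d[0][0], d[0][1] - q)
        if s[0][1] == 0:
            s.pop(0)
        if d[0][1] == 0:
            d.pop(0)
    return price, volume
```

    With these numbers, 180 MW clears: the 30 $/MWh supply block never trades because the remaining demand bids below it.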

  12. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on ''Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials'' was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussion elicited a set of requirements, which were then summarized in a form on which the attendees voted for their highest-priority items. These votes were used to produce the prioritized requirements reported in this paper, which can be used to direct future developments.

  13. Surplus thermal energy model of greenhouses and coefficient analysis for effective utilization

    Directory of Open Access Journals (Sweden)

    Seung-Hwan Yang

    2016-03-01

    If a greenhouse in the temperate and subtropical regions is maintained in a closed condition, the indoor temperature commonly exceeds that required for optimal plant growth, even in the cold season. This study considered this excess energy as surplus thermal energy (STE), which can be recovered, stored and used when heating is necessary. To use the STE economically and effectively, the amount of STE must be estimated before designing a utilization system. Therefore, this study proposed an STE model using energy balance equations for the three steps of the STE generation process. The coefficients in the model were determined from the results of previous research and from experiments using the test greenhouse. The proposed STE model produced monthly errors of 17.9%, 10.4% and 7.4% for December, January and February, respectively. Furthermore, the effects of the coefficients on the model accuracy were revealed by the estimation error assessment and linear regression analysis with fixed dynamic coefficients. A sensitivity analysis of the model coefficients indicated that the coefficients have to be determined carefully. This study also provides effective ways to increase the amount of STE.
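    At its core, an STE estimate for one time interval is an energy balance clipped at zero: whatever solar input exceeds the losses needed to hold the setpoint is recoverable surplus. The sketch below is a deliberately simplified single-step version with aggregate inputs; the paper's model uses three balance steps with experimentally fitted coefficients, which are not reproduced here.

```python
def surplus_thermal_energy(solar_gain, transmission_loss, ventilation_loss):
    """STE (MJ) for one interval: solar input in excess of the losses at the
    temperature setpoint; clipped at zero because a deficit means heating
    demand, not surplus. Inputs are hypothetical aggregate terms."""
    return max(0.0, solar_gain - transmission_loss - ventilation_loss)
```

    Summing such interval values over a month gives the quantity the paper compares against measurements to obtain its monthly error figures.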

  14. Dispersion modeling of accidental releases of toxic gases - Comparison of the models and their utility for the fire brigades.

    Science.gov (United States)

    Stenzel, S.; Baumann-Stanzer, K.

    2009-04-01

    In the case of an accidental release of hazardous gases into the atmosphere, emergency responders need a reliable and fast tool to assess the possible consequences and apply the optimal countermeasures. For hazard prediction and simulation of the hazard zones, a number of air dispersion models are available. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for displaying the results; they are easy to use and can operate fast and effectively during stress situations. The models are designed especially for analyzing different accidental toxic release scenarios (“worst-case scenarios”), preparing emergency response plans and optimal countermeasures, as well as for real-time risk assessment and management. There are also possibilities for direct coupling of the models to automatic meteorological stations, in order to avoid uncertainties in the model output due to insufficient or incorrect meteorological data. Another key problem in coping with an accidental toxic release is the relatively wide spectrum of regulations and threshold values, like IDLH, ERPG, AEGL, MAK etc., and the different criteria for their application. Since the particular emergency responders and organizations require different regulations and values for their purposes, it is quite difficult to predict the individual hazard areas. A number of research studies and investigations address the problem; in any case, the final decision is up to the authorities.
The research project RETOMOD (reference scenarios calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Vienna fire brigade, OMV Refining & Marketing GmbH and

  15. Predictive Modeling of Defibrillation utilizing Hexahedral and Tetrahedral Finite Element Models: Recent Advances

    Science.gov (United States)

    Triedman, John K.; Jolley, Matthew; Stinstra, Jeroen; Brooks, Dana H.; MacLeod, Rob

    2008-01-01

    ICD implants may be complicated by body size and anatomy. One approach to this problem has been the adoption of creative, extracardiac implant strategies using standard ICD components. Because data on the safety or efficacy of such ad hoc implant strategies are lacking, we have developed image-based finite element models (FEMs) to compare electric fields and expected defibrillation thresholds (DFTs) using standard and novel electrode locations. In this paper, we review recently published studies by our group using such models, and progress in meshing strategies to improve efficiency and visualization. Our preliminary observations predict that there may be large changes in DFTs with clinically relevant variations of electrode placement. Extracardiac ICDs of various lead configurations are predicted to be effective in both children and adults. This approach may aid both ICD development and patient-specific optimization of electrode placement, but the simplified nature of current models dictates further development and validation prior to clinical or industrial utilization. PMID:18817926

  16. Tritium Specific Adsorption Simulation Utilizing the OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica Rutledge; Lawrence Tavlarides; Ronghong Lin; Austin Ladshaw

    2013-09-01

    During the processing of used nuclear fuel, volatile radionuclides will be discharged to the atmosphere if no recovery processes are in place to limit their release. The volatile radionuclides of concern are 3H, 14C, 85Kr, and 129I. Methods are being developed, via adsorption and absorption unit operations, to capture these radionuclides. It is necessary to model these unit operations to aid in the evaluation of technologies and in the future development of an advanced used nuclear fuel processing plant. A collaboration between Fuel Cycle Research and Development Offgas Sigma Team member INL and a NEUP grant including ORNL, Syracuse University, and Georgia Institute of Technology has been formed to develop off-gas models and support off-gas research. This report discusses the development of a tritium-specific adsorption model. The model was built on the OSPREY model, integrating it with a fundamental-level isotherm model developed under the NEUP grant and with experimental data provided by that grant.

  17. NRC review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Evolutionary plant designs, Chapter 1, Project No. 669

    International Nuclear Information System (INIS)

    1992-08-01

    The staff of the US Nuclear Regulatory Commission has prepared Volume 2 (Parts 1 and 2) of a safety evaluation report (SER), ''NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document -- Evolutionary Plant Designs,'' to document the results of its review of the Electric Power Research Institute's ''Advanced Light Water Reactor Utility Requirements Document.'' This SER gives the results of the staff's review of Volume II of the Requirements Document for evolutionary plant designs, which consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant (approximately 1300 megawatts-electric)

  18. NRC review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Evolutionary plant designs, Chapters 2--13, Project No. 669

    International Nuclear Information System (INIS)

    1992-08-01

    The staff of the US Nuclear Regulatory Commission has prepared Volume 2 (Parts 1 and 2) of a safety evaluation report (SER), ''NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document -- Evolutionary Plant Designs,'' to document the results of its review of the Electric Power Research Institute's ''Advanced Light Water Reactor Utility Requirements Document.'' This SER gives the results of the staff's review of Volume II of the Requirements Document for evolutionary plant designs, which consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant (approximately 1300 megawatts-electric)

  19. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a recurrence interval of 24.6 hours. In contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated for spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
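    The per-occurrence figures reported above can be turned into monthly depths with a simple rate calculation. The function name and the July hour count are ours; the 1.3 mm/24.6 h (spray) and 0.6 mm/45.6 h (drip) figures come from the abstract, and the 46% comparison there is per event, not per month.

```python
HOURS_IN_JULY = 31 * 24  # 744 h

def monthly_irrigation(mm_per_event, hours_between_events, hours=HOURS_IN_JULY):
    """Total modeled depth (mm) over the month, from the per-event amount
    and the recurrence interval reported in the abstract."""
    return mm_per_event * hours / hours_between_events

spray = monthly_irrigation(1.3, 24.6)  # spray irrigation, ~39.3 mm
drip = monthly_irrigation(0.6, 45.6)   # drip irrigation, ~9.8 mm
```

    Because drip events are both smaller and rarer, the monthly drip total is about a quarter of the spray total, even though each drip event is 46% of a spray event.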

  20. Flavobacterium johnsoniae chitinase ChiA is required for chitin utilization and is secreted by the type IX secretion system.

    Science.gov (United States)

    Kharade, Sampada S; McBride, Mark J

    2014-03-01

    Flavobacterium johnsoniae, a member of phylum Bacteriodetes, is a gliding bacterium that digests insoluble chitin and many other polysaccharides. A novel protein secretion system, the type IX secretion system (T9SS), is required for gliding motility and for chitin utilization. Five potential chitinases were identified by genome analysis. Fjoh_4555 (ChiA), a 168.9-kDa protein with two glycoside hydrolase family 18 (GH18) domains, was targeted for analysis. Disruption of chiA by insertional mutagenesis resulted in cells that failed to digest chitin, and complementation with wild-type chiA on a plasmid restored chitin utilization. Antiserum raised against recombinant ChiA was used to detect the protein and to characterize its secretion by F. johnsoniae. ChiA was secreted in soluble form by wild-type cells but remained cell associated in strains carrying mutations in any of the T9SS genes, gldK, gldL, gldM, gldNO, sprA, sprE, and sprT. Western blot and liquid chromatography-tandem mass spectrometry (LC-MS/MS) analyses suggested that ChiA was proteolytically processed into two GH18 domain-containing proteins. Proteins secreted by T9SSs typically have conserved carboxy-terminal domains (CTDs) belonging to the TIGRFAM families TIGR04131 and TIGR04183. ChiA does not exhibit strong similarity to these sequences and instead has a novel CTD. Deletion of this CTD resulted in accumulation of ChiA inside cells. Fusion of the ChiA CTD to recombinant mCherry resulted in secretion of mCherry into the medium. The results indicate that ChiA is a soluble extracellular chitinase required for chitin utilization and that it relies on a novel CTD for secretion by the F. johnsoniae T9SS.

  1. Evaluating the performance and utility of regional climate models

    DEFF Research Database (Denmark)

    Christensen, Jens H.; Carter, Timothy R.; Rummukainen, Markku

    2007-01-01

    This special issue of Climatic Change contains a series of research articles documenting co-ordinated work carried out within a 3-year European Union project 'Prediction of Regional scenarios and Uncertainties for Defining European Climate change risks and Effects' (PRUDENCE). The main objective ... weather events and (7) implications of the results for policy. A paper summarising the related MICE (Modelling the Impact of Climate Extremes) project is also included. The second part of the issue contains 12 articles that focus in more detail on some of the themes summarised in the overarching papers. The PRUDENCE results represent the first comprehensive, continental-scale intercomparison and evaluation of high resolution climate models and their applications, bringing together climate modelling, impact research and social sciences expertise on climate change.

  2. Recursive inter-generational utility in global climate risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Minh, Ha-Duong [Centre International de Recherche sur l' Environnement et le Developpement (CIRED-CNRS), 75 - Paris (France); Treich, N. [Institut National de Recherches Agronomiques (INRA-LEERNA), 31 - Toulouse (France)

    2003-07-01

    This paper distinguishes relative risk aversion and resistance to inter-temporal substitution in climate risk modeling. Stochastic recursive preferences are introduced in a stylized numeric climate-economy model using preliminary IPCC 1998 scenarios. It shows that higher risk aversion increases the optimal carbon tax. Higher resistance to inter-temporal substitution alone has the same effect as increasing the discount rate, provided that the risk is not too large. We discuss implications of these findings for the debate upon discounting and sustainability under uncertainty. (author)
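    Stochastic recursive preferences of the kind described separate the risk-aversion parameter from the resistance to intertemporal substitution, which ordinary discounted expected utility forces to coincide. A minimal two-period sketch with an Epstein-Zin-style functional form, chosen by us for illustration (the paper's exact specification and parameters may differ):

```python
def crra_ce(values, probs, gamma):
    """Certainty equivalent of a lottery under relative risk aversion gamma
    (gamma != 1): inverse power utility of expected power utility."""
    eu = sum(p * v ** (1.0 - gamma) for p, v in zip(probs, values))
    return eu ** (1.0 / (1.0 - gamma))

def recursive_utility(c0, future_values, probs, beta, rho, gamma):
    """Two-period recursion: rho governs resistance to intertemporal
    substitution, gamma governs risk aversion; with rho == gamma this
    collapses to standard discounted expected utility."""
    ce = crra_ce(future_values, probs, gamma)
    return ((1.0 - beta) * c0 ** (1.0 - rho)
            + beta * ce ** (1.0 - rho)) ** (1.0 / (1.0 - rho))
```

    Raising gamma lowers the certainty equivalent of a risky future (more is paid to avoid risk) without touching how consumption is traded across time, which is exactly the separation the paper exploits when studying the optimal carbon tax.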

  3. Medical Specialty Decision Model: Utilizing Social Cognitive Career Theory

    Science.gov (United States)

    Gibson, Denise D.; Borges, Nicole J.

    2004-01-01

    Objectives: The purpose of this study was to develop a working model to explain medical specialty decision-making. Using Social Cognitive Career Theory, we examined personality, medical specialty preferences, job satisfaction, and expectations about specialty choice to create a conceptual framework to guide specialty choice decision-making.…

  4. Lunar-Forming Giant Impact Model Utilizing Modern Graphics ...

    Indian Academy of Sciences (India)

    2016-01-27

    Recent giant impact models focus on producing a circumplanetary disk of the proper composition around the Earth and defer to earlier works for the accretion of this disk into the Moon. The discontinuity between creating the circumplanetary disk and accretion of the Moon is unnatural and lacks simplicity.

  5. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  6. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  7. Asset transformation and the challenges to servitize a utility business model

    International Nuclear Information System (INIS)

    Helms, Thorsten

    2016-01-01

    The traditional energy utility business model is under pressure, and energy services are expected to play an important role in the energy transition. Experts and scholars argue that utilities need to innovate their business models and transform from commodity suppliers to service providers. The transition from a product-oriented, capital-intensive business model based on tangible assets towards a service-oriented, expense-intensive business model based on intangible assets may present great managerial and organizational challenges. Little research exists about such transitions for capital-intensive commodity providers, and particularly energy utilities, where the challenges to servitize are expected to be greatest. This qualitative paper explores the barriers to servitization within selected Swiss and German utility companies through a series of interviews with utility managers. One of them is ‘asset transformation’, the shift from tangible to intangible assets as the major input factor for the value proposition, which is proposed as a driver of the complexity of business model transitions. Managers need to carefully manage these challenges, and find ways to operate new service business models alongside established utility business models. Policy makers can support the transition of utilities through more favorable regulatory frameworks for energy services, and by supporting the exchange of knowledge in the industry. - Highlights: •The paper analyses the expected transformation of utilities into service providers. •Service and utility business models possess very different attributes. •The former is based on intangible, the latter on tangible assets. •The transformation into a service provider presents great challenges. •Asset transformation is proposed as a barrier to business model innovation.

  8. Integrating utilization-focused evaluation with business process modeling for clinical research improvement.

    Science.gov (United States)

    Kagan, Jonathan M; Rosas, Scott; Trochim, William M K

    2010-10-01

    New discoveries in basic science are creating extraordinary opportunities to design novel biomedical preventions and therapeutics for human disease. But the clinical evaluation of these new interventions is, in many instances, being hindered by a variety of legal, regulatory, policy and operational factors, few of which enhance research quality, the safety of study participants or research ethics. With the goal of helping increase the efficiency and effectiveness of clinical research, we have examined how the integration of utilization-focused evaluation with elements of business process modeling can reveal opportunities for systematic improvements in clinical research. Using data from the NIH global HIV/AIDS clinical trials networks, we analyzed the absolute and relative times required to traverse defined phases associated with specific activities within the clinical protocol lifecycle. Using simple median duration and Kaplan-Meier survival analysis, we show how such time-based analyses can provide a rationale for the prioritization of research process analysis and re-engineering, as well as a means for statistically assessing the impact of policy modifications, resource utilization, re-engineered processes and best practices. Successfully applied, this approach can help researchers be more efficient in capitalizing on new science to speed the development of improved interventions for human disease.
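    The median-duration and Kaplan-Meier analysis described here can be sketched with a small estimator. The data and helper names below are invented; censored records stand for protocols still in a phase at the time of analysis, which is why a plain median of observed durations would understate phase length.

```python
def kaplan_meier(durations, observed):
    """Survival curve S(t) from (duration, event-observed) pairs; a censored
    record (observed=False) counts in the risk set but not as an event."""
    data = sorted(zip(durations, observed))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for d, e in data if d == t and e)
        at_risk = sum(1 for d, _ in data if d >= t)
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
        i += sum(1 for d, _ in data if d == t)  # skip all ties at t
    return curve

def median_time(curve):
    """Smallest t where the survival estimate drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # more than half still 'surviving' at last observation

# Example: five protocol phases (months), all completed (no censoring).
curve = kaplan_meier([3, 5, 5, 8, 12], [True] * 5)
```

    With censoring, the estimated median shifts upward, which is the statistically honest reading of "this phase is still not done".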

  9. Decision modelling tools for utilities in the deregulated energy market

    Energy Technology Data Exchange (ETDEWEB)

    Makkonen, S. [Process Vision Oy, Helsinki (Finland)

    2005-07-01

    This thesis examines the impact of the deregulation of the energy market on decision making and optimisation in utilities, and demonstrates how decision support applications can solve specific tasks encountered in this context. The themes of the thesis are presented in different frameworks in order to clarify the complex decision making and optimisation environment, where new sources of uncertainty arise due to the convergence of energy markets, the globalisation of energy business and increasing competition. This thesis reflects the changes in the decision making and planning environment of European energy companies during the period from 1995 to 2004. It also follows the development of computational performance and the evolution of energy information systems during the same period. Specifically, this thesis consists of studies at several levels of the decision making hierarchy, ranging from top-level strategic decision problems to specific optimisation algorithms. On the other hand, the studies also follow the progress of the liberalised energy market from the monopolistic era to the fully competitive market with new trading instruments and issues like emissions trading. This thesis suggests that there is an increasing need for optimisation and multiple criteria decision making methods, and that new approaches based on the use of operations research are welcome as deregulation proceeds and uncertainties increase. Technically, the optimisation applications presented are based on Lagrangian relaxation techniques and the dedicated Power Simplex algorithm supplemented with stochastic scenario analysis for decision support, a heuristic method to allocate common benefits and potential losses of coalitions of power companies, and an advanced Branch-and-Bound algorithm to efficiently solve nonconvex optimisation problems. The optimisation problems are part of the operational and tactical decision making process that has become very complex in recent years. Similarly

  10. Decision modelling tools for utilities in the deregulated energy market

    International Nuclear Information System (INIS)

    Makkonen, S.

    2005-01-01

    This thesis examines the impact of the deregulation of the energy market on decision making and optimisation in utilities, and demonstrates how decision support applications can solve specific tasks encountered in this context. The themes of the thesis are presented in different frameworks in order to clarify the complex decision making and optimisation environment, where new sources of uncertainty arise from the convergence of energy markets, the globalisation of the energy business and increasing competition. The thesis reflects the changes in the decision making and planning environment of European energy companies during the period from 1995 to 2004. It also follows the development of computational performance and the evolution of energy information systems during the same period. Specifically, the thesis consists of studies at several levels of the decision making hierarchy, ranging from top-level strategic decision problems to specific optimisation algorithms. The studies also follow the progress of the liberalised energy market from the monopolistic era to a fully competitive market with new trading instruments and issues such as emissions trading. The thesis suggests that there is an increasing need for optimisation and multiple criteria decision making methods, and that new approaches based on operations research are welcome as deregulation proceeds and uncertainties increase. Technically, the optimisation applications presented are based on Lagrangian relaxation techniques and the dedicated Power Simplex algorithm supplemented with stochastic scenario analysis for decision support, a heuristic method to allocate the common benefits and potential losses of coalitions of power companies, and an advanced Branch-and-Bound algorithm to efficiently solve nonconvex optimisation problems. The optimisation problems are part of the operational and tactical decision making process that has become very complex in recent years. Similarly
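
The Lagrangian relaxation technique the abstract mentions can be illustrated on a toy dispatch problem; the generator costs, capacities, demand, and subgradient step below are hypothetical and are not taken from the thesis:

```python
# Toy Lagrangian relaxation of a single demand constraint:
#   min  sum_i c_i * p_i   s.t.  sum_i p_i = D,  0 <= p_i <= pmax_i
# Relaxing the demand constraint with multiplier lam makes the problem
# separate per unit; lam is then updated by a subgradient step.
costs = [12.0, 20.0, 35.0]      # EUR/MWh, hypothetical units
pmax  = [400.0, 300.0, 200.0]   # MW
D     = 600.0                   # MW demand

lam, step = 0.0, 0.05
for _ in range(2000):
    # Per-unit subproblem: run at pmax if marginal cost is below lam
    p = [pm if c < lam else 0.0 for c, pm in zip(costs, pmax)]
    lam += step * (D - sum(p))  # subgradient on the relaxed constraint

print(lam, p)   # lam settles near the marginal unit's cost
```

With a fixed step the multiplier chatters around the marginal cost rather than converging exactly; a diminishing step schedule is the usual refinement.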

  11. Assessment of energy utilization and leakages in buildings with building information model energy

    Directory of Open Access Journals (Sweden)

    Egwunatum I. Samuel

    2017-03-01

    Full Text Available Given the ability of building information models (BIM) to serve as a multidisciplinary data repository, this study attempts to explore and exploit the sustainability value of BIM in delivering buildings that require less energy for operations, emit less carbon dioxide, and provide conducive living environments for occupants. This objective was attained by a critical and extensive literature review that covers the following: (1) building energy consumption, (2) building energy performance and analysis, and (3) BIM and energy assessment. Literature cited in this paper shows that linking an energy analysis tool with a BIM model has helped project design teams to predict and optimize energy consumption by conducting building energy performance analysis using key performance indicators for average thermal transmittance, resulting heat demand, lighting power, solar heat gains, and ventilation heat losses. An in-depth analysis was conducted on a completed BIM-integrated construction project, the Arboleda Project in the Dominican Republic, to validate the aforementioned findings. Results show that the BIM-based energy analysis helped the design team attain the world's first positive-energy building. This study concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results, and deliver energy-efficient buildings. This study further recommends that building regulatory agencies demand the adoption of level 2 BIM and BIM integration in energy optimization analysis for all projects, regardless of procurement method (i.e., government funded or otherwise) or size.
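
The envelope heat-loss bookkeeping behind the key performance indicators listed above (thermal transmittance, heat demand, ventilation losses) can be sketched as follows; all U-values, areas, temperatures, and air-change figures are hypothetical, not from the Arboleda Project:

```python
# Minimal steady-state envelope heat-loss estimate of the kind a
# BIM-linked energy tool aggregates; all values are hypothetical.
# Transmission loss: Q_t = sum(U * A) * dT
# Ventilation loss:  Q_v = 0.33 * n * V * dT  (0.33 Wh/m^3K for air)
elements = [            # (name, U-value W/m^2K, area m^2)
    ("wall",   0.25, 180.0),
    ("roof",   0.15, 100.0),
    ("window", 1.10,  30.0),
]
dT = 20.0               # indoor-outdoor temperature difference, K
n, V = 0.5, 750.0       # air changes per hour, heated volume m^3

Q_trans = sum(u * a for _, u, a in elements) * dT     # W
Q_vent  = 0.33 * n * V * dT                           # W
print(Q_trans, Q_vent, Q_trans + Q_vent)
```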

  12. Building a Narrative Based Requirements Engineering Mediation Model

    Science.gov (United States)

    Ma, Nan; Hall, Tracy; Barker, Trevor

    This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact the effectiveness of software process improvement in general and the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE communities. Drawing on the conflict resolution literature and the IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address the socio-psychological aspects of conflict in RE and SPI, Winslade and Monk's (2000) narrative mediation model is introduced, justified, and translated into the context of RE.

  13. Spot markets vs. long-term contracts - modelling tools for regional electricity generating utilities

    International Nuclear Information System (INIS)

    Grohnheit, P.E.

    1999-01-01

    A properly organised market for electricity requires that certain information be available to all market participants, along with a range of generally available modelling tools. This paper describes a set of simple models, based on published data, for analysing the long-term revenues of regional utilities with combined heat and power generation (CHP) that will operate in a competitive international electricity market and a local heat market. The future revenues from trade on the spot market are analysed using a load curve model, in which marginal costs are calculated on the basis of the short-term costs of the available units and the chronological hourly variations in the demands for electricity and heat. Assumptions on prices, marginal costs and electricity generation by the different types of generating units are studied for selected types of local electricity generators. The long-term revenue requirements to be met by long-term contracts are analysed using a traditional techno-economic optimisation model focusing on technology choice and competition among technologies over 20-30 years. A possible conclusion from this discussion is that it is important for the economic and environmental efficiency of the electricity market that local or regional CHP generators, who are able to react to price signals, do not conclude long-term contracts that include fixed time-of-day tariffs for the sale of electricity. Optimisation results for a CHP region (represented by the structure of the Danish electricity and CHP market in 1995) also indicate that a market for CO2 tradable permits is unlikely to attract major non-fossil fuel technologies for electricity generation, e.g. wind power. (au)
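
The load curve model's core idea, that the hourly marginal cost is set by the last unit dispatched, can be sketched with a toy merit order; the unit costs, capacities, and hourly loads below are hypothetical:

```python
# Merit-order sketch: the hourly marginal cost (a spot-price proxy) is
# the short-run cost of the last unit dispatched to meet that hour's
# load. Data are hypothetical, not from the paper.
units = [(10.0, 500.0), (25.0, 300.0), (60.0, 200.0)]  # (cost EUR/MWh, capacity MW), sorted by cost
load  = [450.0, 620.0, 810.0, 700.0]                   # chronological hourly demand, MW

def marginal_cost(demand):
    """Cost of the marginal unit in a simple merit-order dispatch."""
    served = 0.0
    for cost, cap in units:
        served += cap
        if served >= demand:
            return cost
    raise ValueError("demand exceeds installed capacity")

prices = [marginal_cost(d) for d in load]
print(prices)  # → [10.0, 25.0, 60.0, 25.0]
```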

  14. Modelling Security Requirements Through Extending Scrum Agile Development Framework

    OpenAIRE

    Alotaibi, Minahi

    2016-01-01

    Security is today considered a basic foundation of software development, and therefore the modelling and implementation of security requirements is an essential part of producing secure software systems. Information technology organisations are moving towards agile development methods in order to satisfy customers' changing requirements under the accelerated evolution and time pressure imposed by their competitors in software production. Security engineering is considered difficult...

  15. Biomimetic peptide-based models of [FeFe]-hydrogenases: utilization of phosphine-containing peptides

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Souvik; Nguyen, Thuy-Ai D.; Gan, Lu; Jones, Anne K. (Department of Chemistry and Biochemistry, Arizona State University, Tempe, USA)

    2015-01-01

    Peptide-based models of [FeFe]-hydrogenases were synthesized utilizing unnatural phosphine amino acids, and their electrocatalytic properties were investigated in mixed aqueous-organic solvents.

  16. Computer model for estimating electric utility environmental noise

    International Nuclear Information System (INIS)

    Teplitzky, A.M.; Hahn, K.J.

    1991-01-01

    This paper reports on a computer code for estimating environmental noise emissions from the operation and construction of electric power plants. The computer code (Model) predicts octave band sound power levels for power plant operation and construction activities on the basis of equipment operating characteristics, and calculates off-site sound levels for each noise source and for the entire plant. Estimated noise levels are presented either as A-weighted sound level contours around the power plant or as octave band levels at user-defined receptor locations. Calculated sound levels can be compared with user-designated noise criteria, and the program can assist the user in analyzing alternative noise control strategies.
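
The step from octave band levels to an A-weighted level at a receptor, which such a code performs, can be sketched as follows; the band sound power levels and distance are hypothetical, the A-weighting corrections are the standard IEC octave values, and simple spherical spreading is assumed in place of the code's propagation model:

```python
import math

# Combine octave-band sound power levels into an overall A-weighted
# sound pressure level at a receptor, assuming spherical spreading
# (Lp = Lw - 20*log10(r) - 11). Band levels are hypothetical.
A_WEIGHT = {63: -26.2, 125: -16.1, 250: -8.6, 500: -3.2,
            1000: 0.0, 2000: 1.2, 4000: 1.0, 8000: -1.1}
Lw = {63: 95.0, 125: 93.0, 250: 90.0, 500: 88.0,
      1000: 85.0, 2000: 82.0, 4000: 78.0, 8000: 72.0}  # dB re 1 pW
r = 120.0  # source-receptor distance, m

total = 0.0
for band, lw in Lw.items():
    lp = lw - 20.0 * math.log10(r) - 11.0   # band pressure level at r
    total += 10.0 ** ((lp + A_WEIGHT[band]) / 10.0)  # energy sum
LA = 10.0 * math.log10(total)               # overall dB(A)
print(round(LA, 1))
```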

  17. Utilization of FEM model for steel microstructure determination

    Science.gov (United States)

    Kešner, A.; Chotěborský, R.; Linda, M.; Hromasová, M.

    2018-02-01

    Agricultural tools used in soil processing are worn by an abrasive wear mechanism caused by hard mineral particles in the soil. The wear rate is influenced by the mechanical properties of the tool material and by the mineral particle content of the soil. The mechanical properties of steel can be controlled by heat treatment technology, which leads to different microstructures. Determining these experimentally is very expensive, whereas numerical methods such as FEM allow the microstructure to be estimated at low cost; however, every numerical model must be verified. The aim of this work is to show a procedure for predicting the microstructure of steel for agricultural tools. Material characterizations of 51CrV4 grade steel, such as the TTT diagram, heat capacity, heat conduction and other physical properties, were used for the numerical simulation. The relationship between the microstructure predicted by FEM and the real microstructure after heat treatment shows a good correlation.
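
The transient heat-conduction step underlying such a heat-treatment simulation can be sketched in one dimension; this explicit finite-difference quench uses generic steel properties rather than the paper's 51CrV4 data, and the resulting cooling curve is what a TTT-diagram lookup would classify into microstructure constituents:

```python
import numpy as np

# 1D explicit finite-difference quench of a half-section: surface held
# at quenchant temperature, symmetry at the centre. Material values are
# generic for steel, not the 51CrV4 characterization used in the paper.
alpha = 8e-6          # thermal diffusivity, m^2/s
L, n  = 0.02, 41      # half-thickness (m), grid points
dx    = L / (n - 1)
dt    = 0.4 * dx * dx / alpha          # stable explicit time step
T     = np.full(n, 860.0)              # austenitising temperature, deg C
T_q   = 60.0                           # quenchant temperature

for _ in range(2000):
    T[-1] = T_q                        # surface boundary condition
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
    T[0] = T[1]                        # symmetry at the centre

print(round(T[0], 1))  # centre temperature after ~25 s of quenching
```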

  18. Utilizing Chamber Data for Developing and Validating Climate Change Models

    Science.gov (United States)

    Monje, Oscar

    2012-01-01

    Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers was found to overestimate responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD, etc) and this limits what can be measured accurately. Chambers can be used to measure canopy level energy balance under controlled conditions and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in such a manner to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

  19. A customer satisfaction model for a utility service industry

    Science.gov (United States)

    Jamil, Jastini Mohd; Nawawi, Mohd Kamal Mohd; Ramli, Razamin

    2016-08-01

    This paper explores the effect of Image, Customer Expectation, Perceived Quality and Perceived Value on Customer Satisfaction, and investigates the effect of Image and Customer Satisfaction on Customer Loyalty for a mobile phone provider in Malaysia. The results of this research are based on data gathered online from international students at a public university in Malaysia. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used to analyze the data collected on the international students' perceptions. The results show that Image and Perceived Quality have a significant impact on Customer Satisfaction. Image and Customer Satisfaction were also found to be significantly related to Customer Loyalty. However, no significant relationship was found between Customer Expectation and Customer Satisfaction, Perceived Value and Customer Satisfaction, or Customer Expectation and Perceived Value. We hope that these findings may assist the mobile phone provider in the production and promotion of their services.

  20. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.
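
The radiation interception and utilization representation shared by most of the models surveyed can be sketched as a Beer's-law / radiation-use-efficiency calculation; all parameter values below are illustrative and not calibrated for any crop:

```python
import math

# Minimal radiation-interception / radiation-use-efficiency sketch:
# intercepted PAR follows Beer's law in leaf area index (LAI), and
# biomass accumulates as RUE * intercepted PAR. Values are illustrative.
k   = 0.6     # canopy light-extinction coefficient
rue = 2.0     # radiation-use efficiency, g biomass per MJ PAR

def daily_growth(par_mj, lai):
    """Biomass increment (g/m^2/day) from incident PAR and LAI."""
    intercepted = par_mj * (1.0 - math.exp(-k * lai))
    return rue * intercepted

biomass, lai = 0.0, 0.5
for day in range(120):                 # one hypothetical growing season
    biomass += daily_growth(10.0, lai) # 10 MJ PAR/m^2/day
    lai = min(lai + 0.05, 6.0)         # crude linear canopy development

print(round(biomass / 100.0, 1))       # t/ha (1 g/m^2 == 0.01 t/ha)
```

Full crop models replace the crude LAI ramp with phenology-driven leaf area dynamics and add partitioning, water, and nutrient routines on top of this core.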

  1. Modelling of limestone injection for SO2 capture in a coal fired utility boiler

    International Nuclear Information System (INIS)

    Kovacik, G.J.; Reid, K.; McDonald, M.M.; Knill, K.

    1997-01-01

    A computer model was developed for simulating furnace sorbent injection for SO2 capture in a full scale utility boiler using TASCFlow™ computational fluid dynamics (CFD) software. The model makes use of a computational grid of the superheater section of a tangentially fired utility boiler. The computer simulations are three dimensional, so that the temperature and residence time distribution in the boiler could be realistically represented. Results of calculations of the simulated sulphur capture performance of limestone injection in a typical utility boiler operation were presented.

  2. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    Energy Technology Data Exchange (ETDEWEB)

    Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cappers, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fadrhonc, Emily Martin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    Many regulators, utilities, customer groups, and other stakeholders are reevaluating existing regulatory models and the roles and financial implications for electric utilities in the context of today’s environment of increasing distributed energy resource (DER) penetrations, forecasts of significant T&D investment, and relatively flat or negative utility sales growth. When this is coupled with predictions about fewer grid-connected customers (i.e., customer defection), there is growing concern about the potential for serious negative impacts on the regulated utility business model. Among states engaged in these issues, the range of topics under consideration is broad. Most of these states are considering whether approaches that have been applied historically to mitigate the impacts of previous “disruptions” to the regulated utility business model (e.g., energy efficiency) as well as to align utility financial interests with increased adoption of such “disruptive technologies” (e.g., shareholder incentive mechanisms, lost revenue mechanisms) are appropriate and effective in the present context. A handful of states are presently considering more fundamental changes to regulatory models and the role of regulated utilities in the ownership, management, and operation of electric delivery systems (e.g., New York “Reforming the Energy Vision” proceeding).

  3. Utility of Monte Carlo Modelling for Holdup Measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Belian, Anthony P.; Russo, P. A. (Phyllis A.); Weier, Dennis R. (Dennis Ray),

    2005-01-01

    Non-destructive assay (NDA) measurements performed to locate and quantify holdup in the Oak Ridge K-25 enrichment cascade used neutron totals counting and low-resolution gamma-ray spectroscopy. This facility housed the gaseous diffusion process for enrichment of uranium, in the form of UF{sub 6} gas, from {approx} 20% to 93%. The {sup 235}U inventory in K-25 is all holdup. These buildings have been slated for decontamination and decommissioning. The NDA measurements establish the inventory quantities and will be used to assure criticality safety and meet criteria for waste analysis and transportation. The tendency to err on the side of conservatism for the sake of criticality safety in specifying total NDA uncertainty argues, in the interests of safety and costs, for obtaining the best possible value of uncertainty at the conservative confidence level for each item of process equipment. Variable deposit distribution is a complex systematic effect (i.e., one determined by multiple independent variables) on the portable NDA results for very large and bulk converters that contributes greatly to the total uncertainty for holdup in converters measured by gamma or neutron NDA methods. Because the magnitudes of complex systematic effects are difficult to estimate, computational tools are important for evaluating those that are large. Motivated by very large discrepancies between gamma and neutron measurements of high-mass converters, with gamma results tending to dominate, the Monte Carlo code MCNP has been used to determine the systematic effects of deposit distribution on gamma and neutron results for {sup 235}U holdup mass in converters. This paper details the numerical methodology used to evaluate large systematic effects unique to each measurement type, validates the methodology by comparison with measurements, and discusses how modeling tools can supplement the calibration of instruments used for holdup measurements by providing realistic values at well
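
The geometric part of the deposit-distribution effect evaluated with MCNP can be illustrated with a far simpler toy Monte Carlo: the same holdup mass, distributed differently along a pipe, produces a different point-detector response through 1/r^2 geometry alone. Self-attenuation and scattering are ignored here, and the geometry is invented:

```python
import random

# Toy Monte Carlo of a deposit-distribution systematic effect: average
# 1/r^2 response at a point detector for two spatial distributions of
# the same deposit mass. Geometry and distributions are hypothetical.
random.seed(1)
DETECTOR = (0.0, 0.5)          # detector 0.5 m off the pipe axis midpoint

def response(sample_x, n=100_000):
    """Mean 1/r^2 response for deposit positions drawn by sample_x()."""
    total = 0.0
    for _ in range(n):
        x = sample_x()                       # deposit position along pipe
        r2 = (x - DETECTOR[0])**2 + DETECTOR[1]**2
        total += 1.0 / r2
    return total / n

uniform = response(lambda: random.uniform(-1.0, 1.0))  # spread-out deposit
lumped  = response(lambda: random.gauss(0.0, 0.1))     # deposit near midpoint

print(lumped / uniform)  # lumped deposit looks "hotter" to the detector
```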

  4. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; owing to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with the current limitations in the models, data, and computer implementation

  5. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  6. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  7. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements of ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations, and in the modeling nature (e.g., static or dynamic, mechanistic or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP, RUP, and the contribution of microbial CP, MCP) as the main sources of MP for the animal. Some models explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm), e.g., the Cornell Net Carbohydrate and Protein System (CNCPS) and the Dutch system (DVE/OEB), while others simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating the MP required for lactation
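
The MP-system bookkeeping described above can be sketched in a few lines; the coefficients and diet values are illustrative NRC-style numbers, not those of any specific model:

```python
# Back-of-envelope metabolizable-protein (MP) supply in the spirit of
# the MP systems described: MP = digestible RUP + digestible microbial
# true protein. The 0.80 x 0.80 microbial coefficients and all diet
# numbers are illustrative, not taken from any one model.
cp_intake   = 2.8    # kg/day crude protein intake
rup_frac    = 0.40   # fraction of CP escaping rumen degradation (RUP)
rup_digest  = 0.80   # intestinal digestibility of RUP
mcp         = 1.3    # kg/day microbial crude protein synthesized

mp_from_rup = cp_intake * rup_frac * rup_digest
mp_from_mcp = mcp * 0.80 * 0.80      # true-protein fraction x digestibility
mp_supply   = mp_from_rup + mp_from_mcp
print(round(mp_supply, 3))           # kg MP/day
```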

  8. Examining the utility of satellite-based wind sheltering estimates for lake hydrodynamic modeling

    Science.gov (United States)

    Van Den Hoek, Jamon; Read, Jordan S.; Winslow, Luke A.; Montesano, Paul; Markfort, Corey D.

    2015-01-01

    Satellite-based measurements of vegetation canopy structure have been in common use for the last decade but have never been used to estimate the canopy's impact on wind sheltering of individual lakes. Wind sheltering is caused by slower winds in the wake of topography and shoreline obstacles (e.g. forest canopy) and influences heat loss and the flux of wind-driven mixing energy into lakes, which control lake temperatures and indirectly structure lake ecosystem processes, including carbon cycling and thermal habitat partitioning. Lakeshore wind sheltering has often been parameterized by lake surface area, but such empirical relationships are based only on forested lakeshores and overlook the contributions of local land cover and terrain to wind sheltering. This study is the first to examine the utility of satellite imagery-derived broad-scale estimates of wind sheltering across a diversity of land covers. Using 30 m spatial resolution ASTER GDEM2 elevation data, the mean sheltering height, hs, being the combination of local topographic rise and canopy height above the lake surface, is calculated within 100 m-wide buffers surrounding 76,000 lakes in the U.S. state of Wisconsin. The uncertainty of GDEM2-derived hs is compared to SRTM-, high-resolution G-LiHT lidar-, and ICESat-derived estimates of hs; the respective influences of land cover type and buffer width on hs are examined; and the effect of including satellite-based hs on the accuracy of a statewide lake hydrodynamic model is discussed. Though GDEM2 hs uncertainty was comparable to or better than other satellite-based measures of hs, its higher spatial resolution and broader spatial coverage allowed more lakes to be included in modeling efforts. GDEM2 was shown to offer superior utility for estimating hs compared to other satellite-derived data, but was limited by its consistent underestimation of hs, inability to detect within-buffer hs variability, and differing accuracy across land cover types. Nonetheless
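
The buffer-averaged sheltering height described above can be sketched on a synthetic grid; the 5 x 5 elevation array below is invented and stands in for the 30 m GDEM2 data, with the buffer reduced to the one-cell ring around the lake:

```python
import numpy as np

# Sketch of the mean sheltering-height calculation: h_s is the mean
# height of terrain-plus-canopy above the lake surface within a
# shoreline buffer. Grid and elevations are synthetic.
dem = np.array([                 # surface elevations (terrain + canopy), m
    [310, 312, 315, 311, 309],
    [308, 300, 300, 300, 310],
    [307, 300, 300, 300, 313],
    [309, 300, 300, 300, 312],
    [311, 314, 316, 313, 310],
])
lake = dem == 300                # lake cells sit at the water surface
lake_elev = 300.0

# Buffer: non-lake cells adjacent (8-neighbourhood) to a lake cell
buffer_mask = np.zeros_like(lake)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        buffer_mask |= np.roll(np.roll(lake, dy, axis=0), dx, axis=1)
buffer_mask &= ~lake

h_s = float(np.mean(dem[buffer_mask] - lake_elev))
print(round(h_s, 2))
```

A production version would replace the roll-based dilation with a distance buffer of fixed metric width and mask by land cover.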

  9. Pseudomonas aeruginosa PA1006, Which Plays a Role in Molybdenum Homeostasis, Is Required for Nitrate Utilization, Biofilm Formation, and Virulence

    Science.gov (United States)

    Filiatrault, Melanie J.; Tombline, Gregory; Wagner, Victoria E.; Van Alst, Nadine; Rumbaugh, Kendra; Sokol, Pam; Schwingel, Johanna; Iglewski, Barbara H.

    2013-01-01

    Pseudomonas aeruginosa (Pae) is a clinically important opportunistic pathogen. Herein, we demonstrate that the PA1006 protein is critical for all nitrate reductase activities, growth as a biofilm in a continuous flow system, as well as virulence in mouse burn and rat lung model systems. Microarray analysis revealed that ΔPA1006 cells displayed extensive alterations in gene expression including nitrate-responsive, quorum sensing (including PQS production), and iron-regulated genes, as well as molybdenum cofactor and Fe-S cluster biosynthesis factors, members of the TCA cycle, and Type VI Secretion System components. Phenotype Microarray™ profiles of ΔPA1006 aerobic cultures using Biolog plates also revealed a reduced ability to utilize a number of TCA cycle intermediates as well as a failure to utilize xanthine as a sole source of nitrogen. As a whole, these data indicate that the loss of PA1006 confers extensive changes in Pae metabolism. Based upon homology of PA1006 to the E. coli YhhP protein and data from the accompanying study, loss of PA1006 persulfuration and/or molybdenum homeostasis are likely the cause of extensive metabolic alterations that impact biofilm development and virulence in the ΔPA1006 mutant. PMID:23409004

  10. Pseudomonas aeruginosa PA1006, which plays a role in molybdenum homeostasis, is required for nitrate utilization, biofilm formation, and virulence.

    Directory of Open Access Journals (Sweden)

    Melanie J Filiatrault

    Full Text Available Pseudomonas aeruginosa (Pae) is a clinically important opportunistic pathogen. Herein, we demonstrate that the PA1006 protein is critical for all nitrate reductase activities, growth as a biofilm in a continuous flow system, as well as virulence in mouse burn and rat lung model systems. Microarray analysis revealed that ΔPA1006 cells displayed extensive alterations in gene expression including nitrate-responsive, quorum sensing (including PQS production), and iron-regulated genes, as well as molybdenum cofactor and Fe-S cluster biosynthesis factors, members of the TCA cycle, and Type VI Secretion System components. Phenotype Microarray™ profiles of ΔPA1006 aerobic cultures using Biolog plates also revealed a reduced ability to utilize a number of TCA cycle intermediates as well as a failure to utilize xanthine as a sole source of nitrogen. As a whole, these data indicate that the loss of PA1006 confers extensive changes in Pae metabolism. Based upon homology of PA1006 to the E. coli YhhP protein and data from the accompanying study, loss of PA1006 persulfuration and/or molybdenum homeostasis are likely the cause of extensive metabolic alterations that impact biofilm development and virulence in the ΔPA1006 mutant.

  11. Effectiveness and Utility of a Case-Based Model for Delivering Engineering Ethics Professional Development Units

    Directory of Open Access Journals (Sweden)

    Heidi Ann Hahn

    2014-10-01

    Full Text Available This article describes an action research project conducted at Los Alamos National Laboratory (LANL) to resolve a problem with the ability of licensed and/or certified engineers to obtain the ethics-related professional development units or hours (PDUs or PDHs) needed to maintain their credentials. Because of the recurring requirement and the static nature of the information, an initial, in-depth training followed by annually updated refresher training was proposed. A case model approach, with online delivery, was selected as the optimal pedagogical model for the refresher training. In the first two years, the only data collected were throughput and information retention. Response rates indicated that the approach was effective in helping licensed professional engineers obtain the needed PDUs. The rates of correct responses suggested that knowledge transfer regarding ethical reasoning had occurred in the initial training and had been retained in the refresher. In FY13, after completing the refresher, learners received a survey asking their opinion of the effectiveness and utility of the course, as well as their impressions of the case study format vs. the typical presentation format. Results indicate that the courses have been favorably received and that the case study method supports most of the pedagogical needs of adult learners as well as, if not better than, presentation-based instruction. Future plans for improvement are focused on identifying and evaluating methods for enriching online delivery of the engineering ethics cases.

  13. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences of the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time using automated data processing techniques and statutory records. The statutory records, though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, integrating information from multiple sensors (raw data) with these qualitative maps, and visualizing the result, is challenging and requires robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model by integrating the knowledge extracted from the sensors' raw data with the available statutory records. The statutory records were combined with the hypotheses from the sensors to form an initial estimate of what might be found underground and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation maximization (EM) algorithm), performed robustly on various simulated and real sites in predicting linear/non-linear segments and constructing refined 2D/3D maps.
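The record-versus-sensor fusion step this abstract describes can be illustrated with a minimal Bayesian update. The prior, hit/miss rates, and detection sequence below are hypothetical, not taken from the MTU system; they only show how a statutory-record prior is revised by independent sensor passes.

```python
def posterior(prior, likelihood_hit, likelihood_miss, detections):
    """Update P(utility present) after independent sensor passes.

    detections: sequence of 1 (sensor reports a hit) or 0 (a miss).
    likelihood_hit:  P(hit | utility present)
    likelihood_miss: P(hit | utility absent), i.e. false-alarm rate.
    """
    p = prior
    for d in detections:
        like_present = likelihood_hit if d else (1.0 - likelihood_hit)
        like_absent = likelihood_miss if d else (1.0 - likelihood_miss)
        num = like_present * p
        p = num / (num + like_absent * (1.0 - p))
    return p

# A statutory record suggests a pipe is probably here (prior 0.6);
# three of four sensor passes detect it.
p = posterior(0.6, likelihood_hit=0.9, likelihood_miss=0.2, detections=[1, 1, 0, 1])
```

Even with one missed pass, the mostly-positive sensor evidence raises the belief well above the record-only prior.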

  14. An Analysis/Synthesis System of Audio Signal with Utilization of an SN Model

    Directory of Open Access Journals (Sweden)

    G. Rozinaj

    2004-12-01

Full Text Available An SN (sinusoids plus noise) model is a spectral model in which the periodic components of the sound are represented by sinusoids with time-varying frequencies, amplitudes and phases. The remaining non-periodic components are represented by filtered noise. The sinusoidal model utilizes physical properties of musical instruments and the noise model utilizes the human inability to perceive the exact spectral shape or the phase of stochastic signals. SN modeling can be applied in compression, transformation, separation of sounds, etc. The designed system is based on methods used in SN modeling. We have proposed a model that achieves good results in audio perception. Although many systems do not save the phases of the sinusoids, they are important for better modelling of transients, for the computation of the residual and, last but not least, for stereo signals too. One of the fundamental properties of the proposed system is the ability to reconstruct the signal not only in amplitude but in phase as well.
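The synthesis side of an SN model can be sketched in a few lines. This is a simplified stand-in, assuming fixed (rather than time-varying) frequencies, amplitudes and phases for the partials, and plain white noise in place of the filtered-noise residual:

```python
import math
import random

def synthesize_sn(partials, noise_gain, n, sr=8000):
    """Sum of sinusoids plus a noise residual.

    partials: list of (frequency_hz, amplitude, phase_rad) triples.
    noise_gain: scale of the white-noise stand-in for the residual.
    n: number of samples; sr: sample rate in Hz.
    """
    rng = random.Random(0)  # seeded so the sketch is reproducible
    out = []
    for i in range(n):
        t = i / sr
        s = sum(a * math.sin(2 * math.pi * f * t + ph) for f, a, ph in partials)
        out.append(s + noise_gain * (2 * rng.random() - 1))
    return out

# Two partials at 440 Hz and 880 Hz over a weak noise floor.
y = synthesize_sn([(440.0, 1.0, 0.0), (880.0, 0.5, math.pi / 4)],
                  noise_gain=0.05, n=256)
```

In a full SN system the triples would come from peak tracking of the analysis spectra and would be interpolated frame to frame.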

  15. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  16. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements, or statements about what is necessary for the system, are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  17. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    Science.gov (United States)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the interior of the Earth. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D viscoelastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural work flow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time-consuming steps associated with the authoring of efficient algorithms from scratch while still keeping the flexibility that may be lost with the use of commercial dedicated finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in uplift magnitude which are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models and may require the inclusion of local topographic features. We use the presented algorithm to study a near field area where field observations are abundant, namely Disko Bay in West Greenland, with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local

  18. Utilization of Integrated Assessment Modeling for determining geologic CO2 storage security

    Science.gov (United States)

    Pawar, R.

    2017-12-01

Geologic storage of carbon dioxide (CO2) has been extensively studied as a potential technology to mitigate the atmospheric concentration of CO2. Multiple international research and development efforts, large-scale demonstrations and commercial projects are helping advance the technology. One of the critical areas of active investigation is prediction of long-term CO2 storage security and risks. A quantitative methodology for predicting a storage site's long-term performance is critical for making the key decisions necessary for successful deployment of commercial-scale projects, where projects will require quantitative assessments of potential long-term liabilities. These predictions are challenging given that they require simulating CO2 and in-situ fluid movements, as well as their interactions, through the primary storage reservoir, potential leakage pathways (such as wellbores, faults, etc.) and shallow resources such as groundwater aquifers. They also need to take into account the inherent variability and uncertainties at geologic sites. This talk will provide an overview of an approach based on integrated assessment modeling (IAM) to predict the long-term performance of a geologic storage site, including the storage reservoir, potential leakage pathways and shallow groundwater aquifers. The approach utilizes reduced order models (ROMs) that capture the complex physical/chemical interactions resulting from CO2 movement but are computationally extremely efficient. Applicability of the approach will be demonstrated through examples focused on key storage security questions, such as: What is the probability of leakage of CO2 from a storage reservoir? How does storage security vary for different geologic environments and operational conditions? How do site parameter variability and uncertainties affect storage security?

  19. A choice modelling analysis on the similarity between distribution utilities' and industrial customers' price and quality preferences

    International Nuclear Information System (INIS)

    Soederberg, Magnus

    2008-01-01

    The Swedish Electricity Act states that electricity distribution must comply with both price and quality requirements. In order to maintain efficient regulation it is necessary to firstly, define quality attributes and secondly, determine a customer's priorities concerning price and quality attributes. If distribution utilities gain an understanding of customer preferences and incentives for reporting them, the regulator can save a lot of time by surveying them rather than their customers. This study applies a choice modelling methodology where utilities and industrial customers are asked to evaluate the same twelve choice situations in which price and four specific quality attributes are varied. The preferences expressed by the utilities, and estimated by a random parameter logit, correspond quite well with the preferences expressed by the largest industrial customers. The preferences expressed by the utilities are reasonably homogenous in relation to forms of association (private limited, public and trading partnership). If the regulator acts according to the preferences expressed by the utilities, smaller industrial customers will have to pay for quality they have not asked for. (author)

  20. Clinical utility of the Five-Factor Model of personality disorder.

    Science.gov (United States)

    Mullins-Sweatt, Stephanie N; Lengel, Gregory J

    2012-12-01

    There exists a great deal of research regarding the validity of the Five-Factor Model (FFM) of personality disorder. One of the most common objections to this model is concern regarding clinical utility. This article discusses clinical utility in terms of three fundamental components (i.e., ease of usage, communication, and treatment). In addition, a considerable number of recent empirical studies have examined whether the FFM compares well to personality disorder diagnostic categories with respect to all three components of clinical utility. The purpose of the current article is to provide a description of the implications of each component of clinical utility as it relates to the FFM and to acknowledge and address the empirical findings. © 2012 The Authors. Journal of Personality © 2012, Wiley Periodicals, Inc.

  1. Stock Selection for Portfolios Using Expected Utility-Entropy Decision Model

    Directory of Open Access Journals (Sweden)

    Jiping Yang

    2017-09-01

Full Text Available Yang and Qiu proposed, and recently improved, an expected utility-entropy (EU-E) measure of risk and decision model. When segregation holds, Luce et al. derived an expected utility term plus a constant multiple of the Shannon entropy as the representation of risky choices, further demonstrating the reasonability of the EU-E decision model. In this paper, we apply the EU-E decision model to selecting the set of stocks to be included in portfolios. We first select 7 and 10 stocks from the 30 component stocks of the Dow Jones Industrial Average index, and then derive and compare the efficient portfolios in the mean-variance framework. The conclusions imply that efficient portfolios composed of 7 (10) stocks selected using the EU-E model with intermediate intervals of the tradeoff coefficients are more efficient than those composed of the sets of stocks selected using the expected utility model. Furthermore, the efficient portfolios of 7 (10) stocks selected by the EU-E decision model have almost the same efficient frontier as that of the sample of all stocks. This suggests the necessity of incorporating both the expected utility and Shannon entropy together when making risky decisions, further demonstrating the importance of Shannon entropy as a measure of uncertainty, as well as the applicability of the EU-E model as a decision-making model.
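The flavor of such a measure can be sketched as expected utility penalized by Shannon entropy. The exact functional form and tradeoff coefficient in Yang and Qiu's model differ, so the score below is purely illustrative of why entropy matters: two gambles with equal expected value are separated by their uncertainty.

```python
import math

def eu_entropy_score(outcomes, utility=lambda x: x, tradeoff=0.5):
    """Illustrative EU-E-style score: expected utility of a gamble,
    penalized by tradeoff * Shannon entropy of its outcome distribution.

    outcomes: list of (payoff, probability) pairs.
    """
    eu = sum(p * utility(x) for x, p in outcomes)
    h = -sum(p * math.log(p) for _, p in outcomes if p > 0)
    return eu - tradeoff * h

safe = [(1.0, 1.0)]                  # certain payoff of 1
risky = [(0.0, 0.5), (2.0, 0.5)]     # same expected value, but uncertain

score_safe = eu_entropy_score(safe)
score_risky = eu_entropy_score(risky)
```

Under plain expected utility the two gambles tie; the entropy term breaks the tie in favor of the certain outcome.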

  2. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS raster utility coding library

    Science.gov (United States)

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  3. Assessment of the biophysical impacts of utility-scale photovoltaics through observations and modelling

    Science.gov (United States)

    Broadbent, A. M.; Georgescu, M.; Krayenhoff, E. S.; Sailor, D.

    2017-12-01

Utility-scale solar power plants are a rapidly growing component of the solar energy sector. Utility-scale photovoltaic (PV) solar power generation in the United States has increased by 867% since 2012 (EIA, 2016). This expansion is likely to continue as the cost of PV technologies decreases. While most agree that solar power can decrease greenhouse gas emissions, the biophysical effects of PV systems on the surface energy balance (SEB), and the implications for surface climate, are not well understood. To our knowledge, there has never been a detailed observational study of SEB at a utility-scale solar array. This study presents data from an eddy covariance observational tower temporarily placed above a utility-scale PV array in Southern Arizona. Comparison of the PV SEB with a reference (unmodified) site shows that solar panels can alter the SEB and near-surface climate. The SEB observations are used to develop and validate a new and more complete SEB PV model. In addition, the PV model is compared to simpler PV modelling methods. The simpler PV models produce results that differ from our newly developed model and cannot capture the more complex processes that influence the PV SEB. Finally, hypothetical scenarios of PV expansion across the continental United States (CONUS) were developed using various spatial mapping criteria. CONUS simulations of PV expansion reveal regional variability in the biophysical effects of PV expansion. The study presents the first rigorous and validated simulations of the biophysical effects of utility-scale PV arrays.
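The quantity at issue here is the surface energy balance. The standard closure check applied to eddy-covariance data is a one-line residual, Rn - (H + LE + G); the flux values below are invented for illustration, not taken from the Arizona site.

```python
def seb_residual(net_radiation, sensible, latent, ground):
    """Surface energy balance closure residual: Rn - (H + LE + G).

    All terms in W/m^2. A nonzero residual indicates the measured
    turbulent and ground fluxes do not fully balance net radiation.
    """
    return net_radiation - (sensible + latent + ground)

# Hypothetical midday fluxes over a PV array (W/m^2).
residual = seb_residual(net_radiation=500.0, sensible=220.0, latent=60.0, ground=170.0)
```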

  4. NRC review of Electric Power Research Institute`s advanced light water reactor utility requirements document. Passive plant designs, chapter 1, project number 669

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

The Electric Power Research Institute (EPRI) is preparing a compendium of technical requirements, referred to as the "Advanced Light Water Reactor [ALWR] Utility Requirements Document", that is acceptable to the design of an ALWR power plant. When completed, this document is intended to be a comprehensive statement of utility requirements for the design, construction, and performance of an ALWR power plant for the 1990s and beyond. The Requirements Document consists of three volumes. Volume I, "ALWR Policy and Summary of Top-Tier Requirements", is a management-level synopsis of the Requirements Document, including the design objectives and philosophy, the overall physical configuration and features of a future nuclear plant design, and the steps necessary to take the proposed ALWR design criteria beyond the conceptual design state to a completed, functioning power plant. Volume II consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant [approximately 1350 megawatts-electric (MWe)]. Volume III contains utility design requirements for nuclear plants for which passive features will be used in their designs (approximately 600 MWe). In April 1992, the staff of the Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, issued Volume 1 and Volume 2 (Parts 1 and 2) of its safety evaluation report (SER) to document the results of its review of Volumes 1 and 2 of the Requirements Document. Volume 1, "NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program Summary", provided a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review.

  5. NRC review of Electric Power Research Institute`s advanced light water reactor utility requirements document. Passive plant designs, chapters 2-13, project number 669

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

The Electric Power Research Institute (EPRI) is preparing a compendium of technical requirements, referred to as the "Advanced Light Water Reactor [ALWR] Utility Requirements Document", that is acceptable to the design of an ALWR power plant. When completed, this document is intended to be a comprehensive statement of utility requirements for the design, construction, and performance of an ALWR power plant for the 1990s and beyond. The Requirements Document consists of three volumes. Volume I, "ALWR Policy and Summary of Top-Tier Requirements", is a management-level synopsis of the Requirements Document, including the design objectives and philosophy, the overall physical configuration and features of a future nuclear plant design, and the steps necessary to take the proposed ALWR design criteria beyond the conceptual design state to a completed, functioning power plant. Volume II consists of 13 chapters and contains utility design requirements for an evolutionary nuclear power plant [approximately 1350 megawatts-electric (MWe)]. Volume III contains utility design requirements for nuclear plants for which passive features will be used in their designs (approximately 600 MWe). In April 1992, the staff of the Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, issued Volume 1 and Volume 2 (Parts 1 and 2) of its safety evaluation report (SER) to document the results of its review of Volumes 1 and 2 of the Requirements Document. Volume 1, "NRC Review of Electric Power Research Institute's Advanced Light Water Reactor Utility Requirements Document - Program Summary", provided a discussion of the overall purpose and scope of the Requirements Document, the background of the staff's review, the review approach used by the staff, and a summary of the policy and technical issues raised by the staff during its review.

  6. Using Random Utility Models to Estimate the Recreational Value of Estuarine Resources

    OpenAIRE

    Yoshiaki Kaoru; V. Kerry Smith; Jin Long Liu

    1995-01-01

    In this paper we describe a model using a household production framework to link measures of nonpoint source pollution to fishing quality and a random utility model to describe how that quality influences sport fishing parties' decisions in North Carolina. The results provide clear support for using a model that evaluates the effects of pollution on the activities and decisions associated with the fishing activity once a trip is taken. Site selection decisions are then conditioned on the anti...
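The site-selection component of a random utility model is typically a conditional (multinomial) logit. The utilities below are hypothetical, merely showing the mechanism the abstract relies on: degrading quality at one site shifts choice probabilities toward the others.

```python
import math

def site_choice_probs(utilities):
    """Conditional logit: probability of choosing each site given its
    deterministic utility (softmax over the utility vector)."""
    exps = [math.exp(u) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]

# Three fishing sites; in the second scenario the third site's
# quality (and hence utility) is reduced, e.g. by pollution.
clean = site_choice_probs([1.0, 1.0, 1.0])
polluted = site_choice_probs([1.0, 1.0, 0.2])
```

With equal utilities each site is chosen with probability 1/3; lowering one utility reallocates probability to the unaffected sites.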

  7. Utilizing Gaze Behavior for Inferring Task Transitions Using Abstract Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Daniel Fernando Tello Gamarra

    2016-12-01

Full Text Available We demonstrate an improved method for utilizing observed gaze behavior and show that it is useful in inferring hand movement intent during goal-directed tasks. The task dynamics and the relationship between hand and gaze behavior are learned using an Abstract Hidden Markov Model (AHMM). We show that the predicted hand movement transitions occur consistently earlier in AHMM models with gaze than in those models that do not include gaze observations.
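The core inference step can be conveyed with a flat HMM forward filter (an AHMM adds a hierarchy of abstract states on top of this). The two task phases, gaze regions, and probabilities below are invented for illustration, not taken from the paper's learned model.

```python
def forward_filter(prior, trans, emit, observations):
    """Flat-HMM forward filtering: posterior belief over hidden task
    states after each observation.

    prior: initial state distribution.
    trans[i][j]: P(next state j | current state i).
    emit[j][o]: P(observation o | state j).
    """
    belief = list(prior)
    n = len(belief)
    for obs in observations:
        predicted = [sum(belief[i] * trans[i][j] for i in range(n)) for j in range(n)]
        unnorm = [predicted[j] * emit[j][obs] for j in range(n)]
        z = sum(unnorm)
        belief = [u / z for u in unnorm]
    return belief

# Two task phases ("reach", "grasp"); gaze lands in one of two regions (0, 1).
prior = [0.9, 0.1]
trans = [[0.8, 0.2], [0.1, 0.9]]
emit = [[0.9, 0.1], [0.2, 0.8]]   # phase 0 mostly produces gaze region 0
belief = forward_filter(prior, trans, emit, [1, 1, 1])
```

Repeated gaze fixations in region 1 shift the belief toward the "grasp" phase, which is how gaze evidence lets the model anticipate a hand movement transition.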

  8. An explanatory model of the dental care utilization of low-income children.

    Science.gov (United States)

    Milgrom, P; Mancl, L; King, B; Weinstein, P; Wells, N; Jeffcott, E

    1998-04-01

    Factors related to the utilization of dental care by 5- to 11-year-old children from low-income households were investigated using a comprehensive multivariate model that assessed the contribution of structure, history, cognition, and expectations. The influence of dentist-patient interactions, psychosocial and health beliefs, particularly fear of the dentist, on utilization were investigated. Children were chosen randomly from public schools, and 895 mothers were surveyed and their children were interviewed in the home. Utilization was studied during the 1991-1992 school year, including a 6-month follow-up period after the interview. The overall utilization rate was 63.2%, and the rate for nonemergent (preventive) visits was 59.9%. Utilization was unrelated to actual oral health status. Race and years the guardian lived in the United States were predictive of an episode of care. Preventive medical visits and perceived need were strong predictors of a visit to the dentist, as were beliefs in the efficacy of dental care. Mothers who were satisfied with their own care and oral health and whose children were covered by insurance were more likely to utilize children's dental care. In contrast, child dental fear and absences from school for family problems were associated with lower rates of utilization. Mutable factors that govern the use of care in this population were identified. These findings have implications for the design of dental care delivery systems for children and their families.
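Multivariate models of utilization like the one described are commonly logistic regressions. The sketch below shows only the functional form, with invented coefficients; the study's actual predictors and estimates are not reproduced here.

```python
import math

def visit_probability(coefs, intercept, features):
    """Logistic model: probability a child had a (preventive) dental
    visit, given coded predictors. Coefficients are hypothetical."""
    z = intercept + sum(coefs[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative effects: insurance and perceived need raise the odds of
# a visit; child dental fear lowers them.
coefs = {"insured": 0.8, "perceived_need": 0.9, "child_fear": -0.7}
p_low_fear = visit_probability(coefs, -0.3, {"insured": 1, "perceived_need": 1, "child_fear": 0})
p_high_fear = visit_probability(coefs, -0.3, {"insured": 1, "perceived_need": 1, "child_fear": 1})
```

Holding the other predictors fixed, the negative fear coefficient reduces the predicted visit probability, mirroring the direction of effect reported in the abstract.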

  11. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries, based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs of present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long-term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  12. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

…This is important because it represents the identification of what is being designed (the reactive system), and what is given and being made assumptions about (the environment). The representation of the environment is further partitioned to distinguish human actors from non-human actors. This allows the modeler… …to addressing the problem of validating formal requirements models through interactive graphical animations is presented. Executable Use Cases (EUCs) provide a framework for integrating three tiers of descriptions of specifications and environment assumptions: the lower tier is an informal description… …to distinguish the modeling artifacts describing the environment from those describing the specifications for a reactive system. The formalization allows for clear identification of interfaces between interacting domains, where the interaction takes place through an abstraction of possibly parameterized states…

  13. The transparency, reliability and utility of tropical rainforest land-use and land-cover change models.

    Science.gov (United States)

    Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M

    2014-06-01

    Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models in the tropics and the modelling approaches and paradigms in other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers may emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and impact the development of land-use policies based on plausible future scenarios, but, to do that reliably may require further improvements in the

  14. Consumer preferences for alternative fuel vehicles: Comparing a utility maximization and a regret minimization model

    International Nuclear Information System (INIS)

    Chorus, Caspar G.; Koetse, Mark J.; Hoen, Anco

    2013-01-01

    This paper presents a utility-based and a regret-based model of consumer preferences for alternative fuel vehicles, based on a large-scale stated choice-experiment held among company car leasers in The Netherlands. Estimation and application of random utility maximization and random regret minimization discrete choice models shows that while the two models achieve almost identical fit with the data and differ only marginally in terms of predictive ability, they generate rather different choice probability-simulations and policy implications. The most eye-catching difference between the two models is that the random regret minimization model accommodates a compromise-effect, as it assigns relatively high choice probabilities to alternative fuel vehicles that perform reasonably well on each dimension instead of having a strong performance on some dimensions and a poor performance on others. - Highlights: • Utility- and regret-based models of preferences for alternative fuel vehicles. • Estimation based on stated choice-experiment among Dutch company car leasers. • Models generate rather different choice probabilities and policy implications. • Regret-based model accommodates a compromise-effect
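
    The compromise effect described above can be illustrated with a small sketch. The attribute scores and coefficients below are invented, and the linear-in-attributes RUM and pairwise log-sum RRM specifications follow the standard textbook forms, not the exact models estimated in the study:

```python
import math

def rum_probs(X, beta):
    # Random utility maximization: utility is linear in attributes,
    # choice probabilities follow the multinomial logit form.
    U = [sum(b * x for b, x in zip(beta, row)) for row in X]
    e = [math.exp(u) for u in U]
    s = sum(e)
    return [v / s for v in e]

def rrm_probs(X, beta):
    # Random regret minimization: the regret of alternative i sums
    # pairwise attribute comparisons against every competitor j.
    n = len(X)
    R = []
    for i in range(n):
        r = 0.0
        for j in range(n):
            if j == i:
                continue
            r += sum(math.log(1.0 + math.exp(b * (X[j][m] - X[i][m])))
                     for m, b in enumerate(beta))
        R.append(r)
    e = [math.exp(-r) for r in R]
    s = sum(e)
    return [v / s for v in e]

# Three hypothetical vehicles scored on (range, price attractiveness):
# A strong/weak, B weak/strong, C the middling compromise.
X = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
beta = [1.0, 1.0]
print(rum_probs(X, beta))  # RUM: all three tie (equal total utility)
print(rrm_probs(X, beta))  # RRM: the compromise C gets the highest probability
```

    With equal coefficients the three alternatives have identical RUM utility, so the logit model splits probability evenly; the RRM model favors the compromise alternative because it never incurs a large pairwise regret on any dimension.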

  15. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  16. Estimating health state utility values from discrete choice experiments--a QALY space model approach.

    Science.gov (United States)

    Gu, Yuanyuan; Norman, Richard; Viney, Rosalie

    2014-09-01

    Using discrete choice experiments (DCEs) to estimate health state utility values has become an important alternative to the conventional methods of Time Trade-Off and Standard Gamble. Studies using DCEs have typically used the conditional logit to estimate the underlying utility function. The conditional logit is known for several limitations. In this paper, we propose two types of models based on the mixed logit: one using preference space and the other using quality-adjusted life year (QALY) space, a concept adapted from the willingness-to-pay literature. These methods are applied to a dataset collected using the EQ-5D. The results showcase the advantages of using QALY space and demonstrate that the preferred QALY space model provides lower estimates of the utility values than the conditional logit, with the divergence increasing with worsening health states. Copyright © 2014 John Wiley & Sons, Ltd.
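
    The QALY-space idea (utility = quality weight times duration) can be sketched with a plain binary logit. The EQ-5D-style level decrements below are invented for illustration, and this is a conditional-logit stand-in, not the mixed logit models estimated in the paper:

```python
import math

# Invented EQ-5D-style level decrements (level 0 = no problems);
# the actual estimated values in the paper differ.
decrements = {
    "mobility":         [0.0, 0.05, 0.15],
    "self_care":        [0.0, 0.04, 0.12],
    "usual_activities": [0.0, 0.04, 0.10],
    "pain":             [0.0, 0.06, 0.20],
    "anxiety":          [0.0, 0.05, 0.15],
}

def quality_weight(levels):
    # levels maps dimension -> level index; a weight of 1.0 is full health
    return 1.0 - sum(decrements[d][lvl] for d, lvl in levels.items())

def choice_prob(profile_a, profile_b, scale=1.0):
    # QALY-space utility of a profile = quality weight * duration (years);
    # binary logit probability of choosing profile A over profile B.
    ua = quality_weight(profile_a[0]) * profile_a[1]
    ub = quality_weight(profile_b[0]) * profile_b[1]
    return 1.0 / (1.0 + math.exp(-scale * (ua - ub)))

mild = {d: 0 for d in decrements}
mild["pain"] = 1                      # slight pain, otherwise full health
severe = {d: 2 for d in decrements}   # worst level on every dimension
p = choice_prob((mild, 5.0), (severe, 10.0))
print(round(p, 2))  # mild state for 5 years is preferred over severe for 10
```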

  17. Implications of Model Structure and Detail for Utility Planning: Scenario Case Studies Using the Resource Planning Model

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barrows, Clayton [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, Elaine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-04-01

    In this report, we analyze the impacts of model configuration and detail in capacity expansion models, computational tools used by utility planners looking to find the least cost option for planning the system and by researchers or policy makers attempting to understand the effects of various policy implementations. The present analysis focuses on the importance of model configurations — particularly those related to capacity credit, dispatch modeling, and transmission modeling — to the construction of scenario futures. Our analysis is primarily directed toward advanced tools used for utility planning and is focused on those impacts that are most relevant to decisions with respect to future renewable capacity deployment. To serve this purpose, we develop and employ the NREL Resource Planning Model to conduct a case study analysis that explores 12 separate capacity expansion scenarios of the Western Interconnection through 2030.

  18. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  19. Modeling Water Utility Investments and Improving Regulatory Policies using Economic Optimisation in England and Wales

    Science.gov (United States)

    Padula, S.; Harou, J. J.

    2012-12-01

    Water utilities in England and Wales are regulated natural monopolies called 'water companies'. Water companies must obtain periodic regulatory approval for all investments (new supply infrastructure or demand management measures). Both water companies and their regulators use results from least economic cost capacity expansion optimisation models to develop or assess water supply investment plans. This presentation first describes the formulation of a flexible supply-demand planning capacity expansion model for water system planning. The model uses a mixed integer linear programming (MILP) formulation to choose the least-cost schedule of future supply schemes (reservoirs, desalination plants, etc.) and demand management (DM) measures (leakage reduction, water efficiency and metering options) and bulk transfers. Decisions include what schemes to implement, when to do so, how to size schemes and how much to use each scheme during each year of an n-year long planning horizon (typically 30 years). In addition to capital and operating (fixed and variable) costs, the estimated social and environmental costs of schemes are considered. Each proposed scheme is costed discretely at one or more capacities following regulatory guidelines. The model uses a node-link network structure: water demand nodes are connected to supply and demand management (DM) options (represented as nodes) or to other demand nodes (transfers). Yields from existing and proposed schemes are estimated separately using detailed water resource system simulation models evaluated over the historical period. The model simultaneously considers multiple demand scenarios to ensure demands are met at required reliability levels; use levels of each scheme are evaluated for each demand scenario and weighted by scenario likelihood so that operating costs are accurately evaluated. Multiple interdependency relationships between schemes (pre-requisites, mutual exclusivity, start dates, etc.) can be accounted for by
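
    The scheme-selection core of such a model can be sketched in miniature. A real planning model is a multi-year MILP with transfers, reliability constraints and scheme interdependencies; the toy below simply enumerates scheme subsets for a single target year, and every name, capacity and cost is invented:

```python
from itertools import combinations

# Hypothetical candidate schemes: (name, capacity in Ml/d, annualised cost).
schemes = [
    ("reservoir",    60.0, 9.0),
    ("desalination", 40.0, 8.0),
    ("leakage_fix",  25.0, 2.0),
    ("metering",     15.0, 1.5),
]

def least_cost_plan(schemes, demand_gap):
    # Brute-force search: cheapest subset whose total capacity covers
    # the supply-demand gap (a one-period stand-in for the MILP).
    best = (float("inf"), None)
    for r in range(len(schemes) + 1):
        for combo in combinations(schemes, r):
            cap = sum(s[1] for s in combo)
            cost = sum(s[2] for s in combo)
            if cap >= demand_gap and cost < best[0]:
                best = (cost, [s[0] for s in combo])
    return best

cost, plan = least_cost_plan(schemes, demand_gap=70.0)
print(cost, plan)  # cheaper DM options displace the expensive desalination plant
```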

  20. Work-Life Benefits and Organizational Attachment: Self-Interest Utility and Signaling Theory Models

    Science.gov (United States)

    Casper, Wendy J.; Harris, Christopher M.

    2008-01-01

    This study examines two competing theoretical explanations for why work-life policies such as dependent care assistance and flexible schedules influence organizational attachment. The self-interest utility model posits that work-life policies influence organizational attachment because employee use of these policies facilitates attachment. The…

  1. IAPCS: A COMPUTER MODEL THAT EVALUATES POLLUTION CONTROL SYSTEMS FOR UTILITY BOILERS

    Science.gov (United States)

    The IAPCS model, developed by U.S. EPA's Air and Energy Engineering Research Laboratory and made available to the public through the National Technical Information Service, can be used by utility companies, architectural and engineering companies, and regulatory agencies at all l...

  2. Sample Size and Item Parameter Estimation Precision When Utilizing the One-Parameter "Rasch" Model

    Science.gov (United States)

    Custer, Michael

    2015-01-01

    This study examines the relationship between sample size and item parameter estimation precision when utilizing the one-parameter model. Item parameter estimates are examined relative to "true" values by evaluating the decline in root mean squared deviation (RMSD) and the number of outliers as sample size increases. This occurs across…

  3. The Dynamics of Mobile Learning Utilization in Vocational Education: Frame Model Perspective Review

    Science.gov (United States)

    Mahande, Ridwan Daud; Susanto, Adhi; Surjono, Herman Dwi

    2017-01-01

    This study aimed to describe the dynamics of content aspects, user aspects and social aspects of mobile learning utilization (m-learning) in vocational education from the FRAME Model perspective review. This study was quantitative descriptive research. The population in this study was teachers and students of state vocational school and private…

  4. Antidepressant-like Effects of Electroconvulsive Seizures Require Adult Neurogenesis in a Neuroendocrine Model of Depression.

    Science.gov (United States)

    Schloesser, Robert J; Orvoen, Sophie; Jimenez, Dennisse V; Hardy, Nicholas F; Maynard, Kristen R; Sukumar, Mahima; Manji, Husseini K; Gardier, Alain M; David, Denis J; Martinowich, Keri

    2015-01-01

    Neurogenesis continues throughout life in the hippocampal dentate gyrus. Chronic treatment with monoaminergic antidepressant drugs stimulates hippocampal neurogenesis, and new neurons are required for some antidepressant-like behaviors. Electroconvulsive seizures (ECS), a laboratory model of electroconvulsive therapy (ECT), robustly stimulate hippocampal neurogenesis. ECS requires newborn neurons to improve behavioral deficits in a mouse neuroendocrine model of depression. We utilized immunohistochemistry for doublecortin (DCX), a marker of migrating neuroblasts, to assess the impact of Sham or ECS treatments (1 treatment per day, 7 treatments over 15 days) on hippocampal neurogenesis in animals receiving 6 weeks of either vehicle or chronic corticosterone (CORT) treatment in the drinking water. We conducted tests of anxiety- and depressive-like behavior to investigate the ability of ECS to reverse CORT-induced behavioral deficits. We also determined whether adult neurons are required for the effects of ECS. For these studies we utilized a pharmacogenetic model (hGFAPtk) to conditionally ablate adult born neurons. We then evaluated behavioral indices of depression after Sham or ECS treatments in CORT-treated wild-type animals and CORT-treated animals lacking neurogenesis. ECS is able to rescue CORT-induced behavioral deficits in indices of anxiety- and depressive-like behavior. ECS increases both the number and dendritic complexity of adult-born migrating neuroblasts. The ability of ECS to promote antidepressant-like behavior is blocked in mice lacking adult neurogenesis. ECS ameliorates a number of anxiety- and depressive-like behaviors caused by chronic exposure to CORT. ECS requires intact hippocampal neurogenesis for its efficacy in these behavioral indices. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    Science.gov (United States)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data comes from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms-SNPs), and mammographic features in Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for three models. After we assigned utility values for each category of findings (true negative, false positive, false negative and true positive), we pursued optimal operating points on ROC curves to achieve maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under ROC curve (AUC) and MEU. SNPs improved sensitivity of the Gail model (0.276 vs. 0.147) and reduced specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel framework to evaluate prediction models in the realm of radiogenomics.
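
    The maximum-expected-utility search over ROC operating points can be sketched as follows. The prevalence, utility values and ROC points below are illustrative placeholders, not those of the study:

```python
# Expected utility of operating a classifier at one ROC point:
# EU = P(disease)    * [TPR * U_TP + (1 - TPR) * U_FN]
#    + P(no disease) * [FPR * U_FP + (1 - FPR) * U_TN]
def expected_utility(tpr, fpr, prev, u_tn, u_fp, u_fn, u_tp):
    return (prev * (tpr * u_tp + (1 - tpr) * u_fn)
            + (1 - prev) * (fpr * u_fp + (1 - fpr) * u_tn))

def max_expected_utility(roc_points, prev, u_tn, u_fp, u_fn, u_tp):
    # roc_points: list of (FPR, TPR) pairs along the empirical ROC curve
    return max(expected_utility(tpr, fpr, prev, u_tn, u_fp, u_fn, u_tp)
               for fpr, tpr in roc_points)

roc = [(0.0, 0.0), (0.1, 0.45), (0.2, 0.7), (0.5, 0.9), (1.0, 1.0)]
meu = max_expected_utility(roc, prev=0.1,
                           u_tn=1.0, u_fp=0.9, u_fn=0.0, u_tp=1.0)
print(meu)  # interior operating point (FPR 0.2, TPR 0.7) maximizes EU
```

    With these utilities the optimum falls at an interior operating point rather than at either ROC endpoint, which is the sense in which MEU selects a clinically meaningful threshold.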

  6. Utility of Social Modeling in Assessment of a State’s Propensity for Nuclear Proliferation

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Whitney, Paul D.; Dalton, Angela C.; Olson, Jarrod; White, Amanda M.; Cooley, Scott K.; Youchak, Paul M.; Stafford, Samuel V.

    2011-06-01

    This report is the third and final in a set of three reports documenting research for the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA) Office of Nonproliferation Research and Development NA-22 Simulations, Algorithms, and Modeling program, which investigates how social modeling can be used to improve proliferation assessment for informing nuclear security, policy, safeguards, design of nuclear systems and research decisions. Social modeling has not been used to any significant extent in proliferation studies. This report focuses on the utility of social modeling as applied to the assessment of a State's propensity to develop a nuclear weapons program.

  7. Requirements traceability in model-driven development: Applying model and transformation conformance

    NARCIS (Netherlands)

    Andrade Almeida, João; Iacob, Maria Eugenia; van Eck, Pascal

    The variety of design artifacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship, which helps in assessing the quality of

  8. Testing alternative kinetic models for utilization of crystalline cellulose (Avicel) by batch cultures of Clostridium thermocellum.

    Science.gov (United States)

    Holwerda, Evert K; Lynd, Lee R

    2013-09-01

    Descriptive kinetics of batch cellulose (Avicel) and cellobiose fermentation by Clostridium thermocellum were examined with residual substrate and biosynthate concentrations inferred based on elemental analysis. Biosynthate was formed in constant proportion to substrate consumption until substrate was exhausted for cellobiose fermentation, and until near the point of substrate exhaustion for cellulose fermentation. Cell yields (g pellet biosynthate carbon/g substrate carbon) of 0.214 and 0.200 were obtained for cellulose and cellobiose, respectively. For cellulose fermentation a sigmoidal curve fit was applied to substrate and biosynthate concentrations over time, which was then differentiated to calculate instantaneous rates of growth and substrate consumption. Three models were tested to describe the kinetics of Avicel utilization by C. thermocellum: (A) first order in cells, (B) first order in substrate, and (C) first order in cells and substrate, and second order overall. Models (A) and (B) have been proposed in the literature to describe cultures of cellulolytic microorganisms, whereas model (C) has not. Of the three models tested, model (C) provided by far the best fit to batch culture data. A second order rate constant equal to 0.735 L (g C)⁻¹ h⁻¹ was found for utilization of Avicel by C. thermocellum. Adding an endogenous metabolism term improved the descriptive quality of the model as substrate exhaustion was approached. Such rate constants may in the future find utility for describing and comparing cellulose fermentation involving other microbes and other substrates. Copyright © 2013 Wiley Periodicals, Inc.
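
    A minimal sketch of model (C) is easy to integrate numerically: dS/dt = -k·X·S, with biosynthate formed in constant proportion to substrate consumed. The rate constant and cell yield below are the values reported in the abstract; the initial substrate and inoculum concentrations are invented:

```python
# Forward-Euler integration of the second-order model (C).
def simulate_avicel(S0, X0, Y, k, dt=0.01, t_end=100.0):
    # S: residual substrate (g C/L), X: cell biosynthate (g C/L)
    S, X, t = S0, X0, 0.0
    while t < t_end and S > 1e-6:
        consumed = min(k * X * S * dt, S)  # rate first order in cells AND substrate
        S -= consumed
        X += Y * consumed                  # biosynthate in constant proportion
        t += dt
    return S, X, t

# 5 g C/L Avicel, small inoculum; Y = 0.214 g biosynthate C per g substrate C,
# k = 0.735 L (g C)^-1 h^-1 as reported in the abstract.
S, X, t = simulate_avicel(S0=5.0, X0=0.05, Y=0.214, k=0.735)
print(round(S, 3), round(X, 3))  # substrate exhausted; X approaches X0 + Y*S0
```

    The autocatalytic form gives the sigmoidal substrate curve described in the abstract: consumption accelerates as cells accumulate, then halts abruptly at substrate exhaustion.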

  9. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist which take into consideration various aspects and approaches focused on understanding customer preferences and identifying customer priorities for a product being sold. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the inclusion of individual quality attributes of a product in the mentioned categories depends, among other things, on the life-cycle costs of the product and on its market price. Findings: In practice, we often encounter products classified into different price categories: lower, middle and upper class. For certain types of products the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by means of assessment of available market prices. To each of those groups of products different customer expectations can be assigned

  10. Hydrological Modeling in the Bull Run Watershed in Support of a Piloting Utility Modeling Applications (PUMA) Project

    Science.gov (United States)

    Nijssen, B.; Chiao, T. H.; Lettenmaier, D. P.; Vano, J. A.

    2016-12-01

    Hydrologic models with varying complexities and structures are commonly used to evaluate the impact of climate change on future hydrology. While the uncertainties in future climate projections are well documented, uncertainties in streamflow projections associated with hydrologic model structure and parameter estimation have received less attention. In this study, we implemented and calibrated three hydrologic models (the Distributed Hydrology Soil Vegetation Model (DHSVM), the Precipitation-Runoff Modeling System (PRMS), and the Variable Infiltration Capacity model (VIC)) for the Bull Run watershed in northern Oregon using consistent data sources and best practice calibration protocols. The project was part of a Piloting Utility Modeling Applications (PUMA) project with the Portland Water Bureau (PWB) under the umbrella of the Water Utility Climate Alliance (WUCA). Ultimately PWB would use the model evaluation to select a model to perform in-house climate change analysis for Bull Run Watershed. This presentation focuses on the experimental design of the comparison project, project findings and the collaboration between the team at the University of Washington and at PWB. After calibration, the three models showed similar capability to reproduce seasonal and inter-annual variations in streamflow, but differed in their ability to capture extreme events. Furthermore, the annual and seasonal hydrologic sensitivities to changes in climate forcings differed among models, potentially attributable to different model representations of snow and vegetation processes.

  11. Survey review of models for use in market penetration analysis: utility sector focus

    Energy Technology Data Exchange (ETDEWEB)

    Groncki, P.J.; Kydes, A.S.; Lamontagne, J.; Marcuse, W.; Vinjamuri, G.

    1980-11-01

    The ultimate benefits of federal expenditures in research and development for new technologies are dependent upon the degree of acceptance of these technologies. Market penetration considerations are central to the problem of quantifying the potential benefits. These benefits are inputs to the selection process of projects competing for finite R and D funds. Market penetration is the gradual acceptance of a new commodity or technology. The Office of Coal Utilization is concerned with the specialized area of market penetration of new electric power generation technologies for both replacement and new capacity. The common measure of market penetration is the fraction of the market serviced by the challenging technology for each time point considered. The methodologies for estimating market penetration are divided into three generic classes: integrated energy/economy modeling systems, utility capacity expansion models, and technology substitution models. In general, the integrated energy/economy modeling systems have three advantages: they provide internally consistent macro energy-economy scenarios, they account for the effect of prices on demand by fuel form, and they explicitly capture the effects of population growth and the level and structure of economic activity on energy demand. A variety of deficiencies appear in most energy-economy systems models. All of the methodologies may be applied at some level to questions of market penetration of new technologies in the utility sector; the choice of methods for a particular analysis must be conditioned by the scope of the analysis, data availability, and the relative cost of alternative analyses.

  12. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
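
    In its simplest one-dimensional form, the allocation idea reduces to a 0-1 knapsack over task utilities. The sketch below uses invented tasks and a single resource dimension rather than the paper's multichoice multidimensional formulation; the "credit-value" weighting is reflected only informally in the utility numbers:

```python
# 0-1 knapsack via dynamic programming over integer resource demands:
# each task has a demand and a policy-defined utility, and the grid
# capacity is the resource budget.
def allocate(tasks, capacity):
    # tasks: list of (name, demand, utility); returns (total utility, names)
    best = [(0.0, [])] * (capacity + 1)
    for name, demand, utility in tasks:
        for c in range(capacity, demand - 1, -1):  # reverse: each task used once
            u, chosen = best[c - demand]
            if u + utility > best[c][0]:
                best[c] = (u + utility, chosen + [name])
    return best[capacity]

tasks = [("urgent_analysis", 4, 9.0),   # high user-assigned credit value
         ("batch_sim", 5, 6.0),
         ("low_prio_scan", 3, 2.5),
         ("archive_job", 2, 1.5)]
print(allocate(tasks, capacity=9))  # urgent and high-utility tasks win the budget
```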

  13. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g⁻¹ Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)

  14. Modeling Substrate Utilization, Metabolite Production, and Uranium Immobilization in Shewanella oneidensis Biofilms

    Directory of Open Access Journals (Sweden)

    Ryan S. Renslow

    2017-06-01

    Full Text Available In this study, we developed a two-dimensional mathematical model to predict substrate utilization and metabolite production rates in Shewanella oneidensis MR-1 biofilm in the presence and absence of uranium (U. In our model, lactate and fumarate are used as the electron donor and the electron acceptor, respectively. The model includes the production of extracellular polymeric substances (EPS. The EPS bound to the cell surface and distributed in the biofilm were considered bound EPS (bEPS and loosely associated EPS (laEPS, respectively. COMSOL® Multiphysics finite element analysis software was used to solve the model numerically (model file provided in the Supplementary Material. The input variables of the model were the lactate, fumarate, cell, and EPS concentrations, half saturation constant for fumarate, and diffusion coefficients of the substrates and metabolites. To estimate unknown parameters and calibrate the model, we used a custom designed biofilm reactor placed inside a nuclear magnetic resonance (NMR microimaging and spectroscopy system and measured substrate utilization and metabolite production rates. From these data we estimated the yield coefficients, maximum substrate utilization rate, half saturation constant for lactate, stoichiometric ratio of fumarate and acetate to lactate and stoichiometric ratio of succinate to fumarate. These parameters are critical to predicting the activity of biofilms and are not available in the literature. Lastly, the model was used to predict uranium immobilization in S. oneidensis MR-1 biofilms by considering reduction and adsorption processes in the cells and in the EPS. We found that the majority of immobilization was due to cells, and that EPS was less efficient at immobilizing U. Furthermore, most of the immobilization occurred within the top 10 μm of the biofilm. To the best of our knowledge, this research is one of the first biofilm immobilization mathematical models based on experimental

  15. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.

  16. On the Path to SunShot - Utility Regulatory Business Model Reforms for Addressing the Financial Impacts of Distributed Solar on Utilities

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-05-01

    Net-energy metering (NEM) with volumetric retail electricity pricing has enabled rapid proliferation of distributed photovoltaics (DPV) in the United States. However, this transformation is raising concerns about the potential for higher electricity rates and cost-shifting to non-solar customers, reduced utility shareholder profitability, reduced utility earnings opportunities, and inefficient resource allocation. Although DPV deployment in most utility territories remains too low to produce significant impacts, these concerns have motivated real and proposed reforms to utility regulatory and business models, with profound implications for future DPV deployment. This report explores the challenges and opportunities associated with such reforms in the context of the U.S. Department of Energy’s SunShot Initiative. As such, the report focuses on a subset of a broader range of reforms underway in the electric utility sector. Drawing on original analysis and existing literature, we analyze the significance of DPV’s financial impacts on utilities and non-solar ratepayers under current NEM rules and rate designs, the projected effects of proposed NEM and rate reforms on DPV deployment, and alternative reforms that could address utility and ratepayer concerns while supporting continued DPV growth. We categorize reforms into one or more of four conceptual strategies. Understanding how specific reforms map onto these general strategies can help decision makers identify and prioritize options for addressing specific DPV concerns that balance stakeholder interests.

  17. The utility of comparative models and the local model quality for protein crystal structure determination by Molecular Replacement

    Directory of Open Access Journals (Sweden)

    Pawlowski Marcin

    2012-11-01

    Full Text Available Abstract Background Computational models of protein structures have proved useful as search models in Molecular Replacement (MR), a common method to solve the phase problem faced by macromolecular crystallography. The success of MR depends on the accuracy of a search model. Unfortunately, this parameter remains unknown until the final structure of the target protein is determined. During the last few years, several Model Quality Assessment Programs (MQAPs) that predict the local accuracy of theoretical models have been developed. In this article, we analyze whether the application of MQAPs improves the utility of theoretical models in MR. Results For our dataset of 615 search models, using the real local accuracy of a model increases the MR success ratio by 101% compared to corresponding polyalanine templates. In contrast, when local model quality is not utilized in MR, the computational models solved only 4.5% more MR searches than polyalanine templates. For the same dataset of 615 models, a workflow combining MR with the predicted local accuracy of a model found 45% more correct solutions than polyalanine templates. To predict such accuracy, MetaMQAPclust, a “clustering MQAP”, was used. Conclusions Using comparative models only marginally increases the MR success ratio in comparison to polyalanine structures of templates. However, the situation changes dramatically once comparative models are used together with their predicted local accuracy. A new functionality was added to the GeneSilico Fold Prediction Metaserver in order to build models that are more useful for MR searches. Additionally, we have developed a simple method, AmIgoMR (Am I good for MR?), to predict whether an MR search with a template-based model for a given template is likely to find the correct solution.

  18. Federal and State Structures to Support Financing Utility-Scale Solar Projects and the Business Models Designed to Utilize Them

    Energy Technology Data Exchange (ETDEWEB)

    Mendelsohn, M.; Kreycik, C.

    2012-04-01

    Utility-scale solar projects have grown rapidly in number and size over the last few years, driven in part by strong renewable portfolio standards (RPS) and federal incentives designed to stimulate investment in renewable energy technologies. This report provides an overview of such policies, as well as the project financial structures they enable, based on industry literature, publicly available data, and questionnaires conducted by the National Renewable Energy Laboratory (NREL).

  19. Changes in fibrinogen availability and utilization in an animal model of traumatic coagulopathy

    DEFF Research Database (Denmark)

    Hagemo, Jostein S; Jørgensen, Jørgen; Ostrowski, Sisse R

    2013-01-01

    Impaired haemostasis following shock and tissue trauma is frequently detected in the trauma setting. These changes occur early, and are associated with increased mortality. The mechanism behind trauma-induced coagulopathy (TIC) is not clear. Several studies highlight the crucial role of fibrinogen ... in posttraumatic haemorrhage. This study explores the coagulation changes in a swine model of early TIC, with emphasis on fibrinogen levels and utilization of fibrinogen...

  20. Utility Function for modeling Group Multicriteria Decision Making problems as games

    OpenAIRE

    Alexandre Bevilacqua Leoneti

    2016-01-01

    To assist in the decision making process, several multicriteria methods have been proposed. However, the existing methods assume a single decision-maker and do not consider decision under risk, which is better addressed by Game Theory. Hence, the aim of this research is to propose a Utility Function that makes it possible to model Group Multicriteria Decision Making problems as games. The advantage of using Game Theory for solving Group Multicriteria Decision Making problems is to evaluate th...

  1. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcoming of using AD or TRIZ alone, and to solve the problems that currently exist in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ, and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model, and a requirement plan optimization model. Finally, we construct the computer-aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  2. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, and it is independent of type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...

  3. The Min system and nucleoid occlusion are not required for identifying the division site in Bacillus subtilis but ensure its efficient utilization.

    Directory of Open Access Journals (Sweden)

    Christopher D A Rodrigues

    Full Text Available Precise temporal and spatial control of cell division is essential for progeny survival. The current general view is that precise positioning of the division site at midcell in rod-shaped bacteria is a result of the combined action of the Min system and nucleoid (chromosome) occlusion. Both systems prevent assembly of the cytokinetic Z ring at inappropriate places in the cell, restricting Z rings to the correct site at midcell. Here we show that in the bacterium Bacillus subtilis Z rings are positioned precisely at midcell in the complete absence of both these systems, revealing the existence of a mechanism independent of Min and nucleoid occlusion that identifies midcell in this organism. We further show that Z ring assembly at midcell is delayed in the absence of Min and Noc proteins, while at the same time FtsZ accumulates at other potential division sites. This suggests that a major role for Min and Noc is to ensure efficient utilization of the midcell division site by preventing Z ring assembly at potential division sites, including the cell poles. Our data lead us to propose a model in which spatial regulation of division in B. subtilis involves identification of the division site at midcell that requires Min and nucleoid occlusion to ensure efficient Z ring assembly there and only there, at the right time in the cell cycle.

  4. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-06-01

    Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked together in series configuration, utilizing Markov model and Monte Carlo (MC) simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, with as well as without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov model. Validation of these results is later carried out with the help of MC simulation. In addition, MC simulation based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov model.
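
The article's workflow of checking an analytical Markov result against Monte Carlo simulation can be illustrated with a deliberately simplified case: two-state (up/down) units instead of the paper's three-state deterioration model, with hypothetical exponential failure and repair rates:

```python
import random

def analytic_availability(lam, mu):
    # Steady-state availability of a two-state Markov unit:
    # A = mu / (lam + mu), with failure rate lam and repair rate mu
    return mu / (lam + mu)

def mc_availability(lam, mu, horizon=500000.0, seed=1):
    # Monte Carlo check: alternate exponential up and down periods
    rng = random.Random(seed)
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.expovariate(lam)       # time to failure
        up_time += min(up, horizon - t) # count only time inside the horizon
        t += up + rng.expovariate(mu)   # add the repair time
    return up_time / horizon

lam, mu = 0.01, 0.1                     # hypothetical failure/repair rates
# Two independent units in series: the system is up only if both are up
a_series = analytic_availability(lam, mu) ** 2
a_series_mc = mc_availability(lam, mu, seed=1) * mc_availability(lam, mu, seed=2)
print(round(a_series, 4), round(a_series_mc, 4))
```

For non-exponential failure or repair distributions the closed form no longer applies, but the same simulation loop works with any sampler in place of `expovariate`, which is the advantage of the MC-based codes noted in the abstract.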

  5. Business model innovation for Local Energy Management: a perspective from Swiss utilities

    Directory of Open Access Journals (Sweden)

    Emanuele Facchinetti

    2016-08-01

    Full Text Available The successful deployment of the energy transition relies on a deep reorganization of the energy market. Business model innovation is recognized as a key driver of this process. This work contributes to this topic by providing potential Local Energy Management stakeholders and policy makers with a conceptual framework guiding Local Energy Management business model innovation. The main determinants characterizing Local Energy Management concepts and impacting its business model innovation are identified through literature reviews on distributed generation typologies and customer/investor preferences related to new business opportunities emerging with the energy transition. Afterwards, the relation between the identified determinants and the Local Energy Management business model solution space is analyzed based on semi-structured interviews with managers of Swiss utility companies. The collected managers’ preferences serve as explorative indicators supporting the business model innovation process and provide insights to policy makers on challenges and opportunities related to Local Energy Management.

  6. Business Model Innovation for Local Energy Management: A Perspective from Swiss Utilities

    International Nuclear Information System (INIS)

    Facchinetti, Emanuele; Eid, Cherrelle; Bollinger, Andrew; Sulzer, Sabine

    2016-01-01

    The successful deployment of the energy transition relies on a deep reorganization of the energy market. Business model innovation is recognized as a key driver of this process. This work contributes to this topic by providing potential local energy management (LEM) stakeholders and policy makers with a conceptual framework guiding the LEM business model innovation. The main determinants characterizing LEM concepts and impacting its business model innovation are identified through literature reviews on distributed generation typologies and customer/investor preferences related to new business opportunities emerging with the energy transition. Afterwards, the relation between the identified determinants and the LEM business model solution space is analyzed based on semi-structured interviews with managers of Swiss utility companies. The collected managers’ preferences serve as explorative indicators supporting the business model innovation process and provide insights to policy makers on challenges and opportunities related to LEM.

  7. Modelling and multi objective optimization of laser peening process using Taguchi utility concept

    Science.gov (United States)

    Ranjith Kumar, G.; Rajyalakshmi, G.

    2017-11-01

    Laser peening is considered one of the innovative surface treatment techniques. This work focuses on determining the optimal peening parameters for optimal responses such as residual stresses and deformation. The modelling was done using ANSYS and the values were optimized using the Taguchi utility concept for simultaneous optimization of responses. Three parameters, viz. overlap, pulse duration and pulse density, were considered as process parameters for modelling and optimization. The multi-objective optimization shows that overlap has the greatest influence on stress and deformation, followed by power density and pulse duration.
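
As a sketch of the utility concept used for simultaneous optimization, each response is mapped onto a common 0–9 preference scale via U = A + B·ln(y) and the weighted overall utility is maximized. The trial values, the equal weights, and the smaller-is-better treatment of both responses below are hypothetical, for illustration only:

```python
import math

def utility_value(y, y_worst, y_best, u_best=9.0):
    # Taguchi utility concept: U = A + B*ln(y), scaled so that
    # U(y_worst) = 0 and U(y_best) = u_best
    b = u_best / math.log(y_best / y_worst)
    return b * math.log(y / y_worst)

# Hypothetical trial results: (residual stress in MPa, deformation in um),
# both treated as smaller-is-better for this illustration
trials = [(820.0, 4.1), (760.0, 3.2), (900.0, 5.0)]
stress = [t[0] for t in trials]
deform = [t[1] for t in trials]
weights = (0.5, 0.5)                     # assumed equal importance

overall = [weights[0] * utility_value(s, max(stress), min(stress))
           + weights[1] * utility_value(d, max(deform), min(deform))
           for s, d in trials]
best = overall.index(max(overall))       # index of the preferred setting
print(best, [round(u, 2) for u in overall])
```

Collapsing the responses into one overall utility is what lets a standard single-response Taguchi analysis rank the parameter settings.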

  8. The episodic random utility model unifies time trade-off and discrete choice approaches in health state valuation

    NARCIS (Netherlands)

    B.M. Craig (Benjamin); J.J. van Busschbach (Jan)

    2009-01-01

    ABSTRACT: BACKGROUND: To present an episodic random utility model that unifies time trade-off and discrete choice approaches in health state valuation. METHODS: First, we introduce two alternative random utility models (RUMs) for health preferences: the episodic RUM and the more common...

  9. Exploring New Models for Utility Distributed Energy Resource Planning and Integration: SMUD and Con Edison

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    As a result of the rapid growth of renewable energy in the United States, the U.S. electric grid is undergoing a monumental shift away from its historical status quo. These changes are occurring at both the centralized and local levels and have been driven by a number of different factors, including large declines in renewable energy costs, federal and state incentives and mandates, and advances in the underlying technology. Higher levels of variable-generation renewable energy, however, may require new and increasingly complex methods for utilities to operate and maintain the grid while also attempting to limit the costly build-out of supporting grid infrastructure.

  10. Requirements, model and prototype for a multi-utility locational and security information hub.

    Science.gov (United States)

    2015-11-01

    This project lays the foundation for building an exchange hub for locational and security data and risk assessment of potential excavation work. It acts primarily at 2 stages: upstream of the mark-out process, as a decision support tool to help strea...

  11. Region-specific study of the electric utility industry: financial history and future power requirements for the VACAR region

    Energy Technology Data Exchange (ETDEWEB)

    Pochan, M.J.

    1985-07-01

    Financial data for the period 1966 to 1981 are presented for the four investor-owned electric utilities in the VACAR (Virginia-Carolinas) region. This region was selected as representative for the purpose of assessing the availability, reliability, and cost of electric power for the future in the United States. The estimated demand for power and planned additions to generating capacity for the region through the year 2000 are also given.

  12. Region-specific study of the electric utility industry: financial history and future power requirements for the VACAR region

    International Nuclear Information System (INIS)

    Pochan, M.J.

    1985-07-01

    Financial data for the period 1966 to 1981 are presented for the four investor-owned electric utilities in the VACAR (Virginia-Carolinas) region. This region was selected as representative for the purpose of assessing the availability, reliability, and cost of electric power for the future in the United States. The estimated demand for power and planned additions to generating capacity for the region through the year 2000 are also given

  13. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of the nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and insights from probabilistic safety assessments (PSA's), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA) was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  14. Utilization of Model Predictive Control to Balance Power Absorption Against Load Accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Abbas, Nikhar [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Tom, Nathan M [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-06-03

    Wave energy converter (WEC) control strategies have been primarily focused on maximizing power absorption. The use of model predictive control strategies allows for a finite-horizon, multiterm objective function to be solved. This work utilizes a multiterm objective function to maximize power absorption while minimizing the structural loads on the WEC system. Furthermore, a Kalman filter and autoregressive model were used to estimate and forecast the wave exciting force and predict the future dynamics of the WEC. The WEC's power-take-off time-averaged power and structural loads under a perfect forecast assumption in irregular waves were compared against results obtained from the Kalman filter and autoregressive model to evaluate model predictive control performance.
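
The multiterm objective can be miniaturized as follows: a toy one-degree-of-freedom oscillator stands in for the WEC, absorbed power is traded against a quadratic load penalty over a short horizon, and a brute-force search over a coarse control grid stands in for a real optimizer. The dynamics, weights, and the assumed-perfect forecast are all illustrative:

```python
import itertools

# Toy 1-DOF WEC surrogate: m*x'' = -k*x - c*x' + f_exc + u (forward Euler)
m, k, c, dt = 1.0, 1.0, 0.2, 0.1
alpha = 0.05                       # weight on the structural-load term

def step(x, v, u, f_exc):
    a = (-k * x - c * v + f_exc + u) / m
    return x + v * dt, v + a * dt

def objective(x, v, u_seq, f_seq):
    # Multiterm finite-horizon objective: power absorbed by the PTO is -u*v,
    # so minimizing J = sum(u*v + alpha*u^2) maximizes power minus load penalty
    j = 0.0
    for u, f in zip(u_seq, f_seq):
        j += u * v + alpha * u * u
        x, v = step(x, v, u, f)
    return j

def mpc_control(x, v, f_forecast, candidates=(-1.0, 0.0, 1.0), horizon=4):
    # Receding horizon: evaluate every candidate sequence, apply the first move
    best_seq = min(itertools.product(candidates, repeat=horizon),
                   key=lambda seq: objective(x, v, seq, f_forecast[:horizon]))
    return best_seq[0]

x, v = 0.0, 0.5
f_forecast = [0.3] * 8             # assumed (perfect) wave-excitation forecast
u0 = mpc_control(x, v, f_forecast)
print(u0)
```

With the float moving upward (v > 0), the controller opposes the motion because the power term dominates the load penalty at this alpha; raising alpha shifts the optimum toward smaller control forces, which is exactly the power-versus-load trade the abstract describes.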

  15. Clinical Utility of the DSM-5 Alternative Model of Personality Disorders

    DEFF Research Database (Denmark)

    Bach, Bo; Markon, Kristian; Simonsen, Erik

    2015-01-01

    In Section III, Emerging Measures and Models, DSM-5 presents an Alternative Model of Personality Disorders, which is an empirically based model of personality pathology measured with the Level of Personality Functioning Scale (LPFS) and the Personality Inventory for DSM-5 (PID-5). These novel ... (involving a comparison of presenting problems, history, and diagnoses) and used to formulate treatment considerations. We also considered 6 specific personality disorder types that could be derived from the profiles as defined in the DSM-5 Section III criteria. Results. Using the LPFS and PID-5, we were ... evaluation generally supported the utility for clinical purposes of the Alternative Model for Personality Disorders in Section III of the DSM-5, although it also identified some areas for refinement...

  16. Utilization of Model Predictive Control to Balance Power Absorption Against Load Accumulation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Abbas, Nikhar; Tom, Nathan

    2017-09-01

    Wave energy converter (WEC) control strategies have been primarily focused on maximizing power absorption. The use of model predictive control strategies allows for a finite-horizon, multiterm objective function to be solved. This work utilizes a multiterm objective function to maximize power absorption while minimizing the structural loads on the WEC system. Furthermore, a Kalman filter and autoregressive model were used to estimate and forecast the wave exciting force and predict the future dynamics of the WEC. The WEC's power-take-off time-averaged power and structural loads under a perfect forecast assumption in irregular waves were compared against results obtained from the Kalman filter and autoregressive model to evaluate model predictive control performance.

  17. Evaluating components of dental care utilization among adults with diabetes and matched controls via hurdle models

    Science.gov (United States)

    2012-01-01

    Background About one-third of adults with diabetes have severe oral complications. However, limited previous research has investigated dental care utilization associated with diabetes. This project had two purposes: to develop a methodology to estimate dental care utilization using claims data and to use this methodology to compare utilization of dental care between adults with and without diabetes. Methods Data included secondary enrollment and demographic data from Washington Dental Service (WDS) and Group Health Cooperative (GH), clinical data from GH, and dental-utilization data from WDS claims during 2002–2006. Dental and medical records from WDS and GH were linked for enrolees continuously and dually insured during the study. We employed hurdle models in a quasi-experimental setting to assess differences between adults with and without diabetes in 5-year cumulative utilization of dental services. Propensity score matching adjusted for differences in baseline covariates between the two groups. Results We found that adults with diabetes had lower odds of visiting a dentist (OR = 0.74, p < 0.001). Among those with a dental visit, diabetes patients had lower odds of receiving prophylaxes (OR = 0.77), fillings (OR = 0.80) and crowns (OR = 0.84) (p < 0.005 for all), and higher odds of receiving periodontal maintenance (OR = 1.24), non-surgical periodontal procedures (OR = 1.30), extractions (OR = 1.38) and removable prosthetics (OR = 1.36). Conclusions Patients with diabetes are less likely to use dental services. Those who do are less likely to use preventive care and more likely to receive periodontal care and tooth-extractions. Future research should address the possible effectiveness of additional prevention in reducing subsequent severe oral disease in patients with diabetes. PMID:22776352
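
A hurdle model separates the decision to use any care at all from the amount of care used by those who do. A minimal simulation of that two-part structure, with a Bernoulli "visit" gate and a zero-truncated Poisson count of services (parameters illustrative, not the study's estimates):

```python
import math
import random

def zt_poisson(lam, rng):
    # Zero-truncated Poisson draw (Knuth sampler, rejecting zeros)
    while True:
        big_l, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= big_l:
                break
            k += 1
        if k > 0:
            return k

def sample_utilization(p_visit, lam, rng):
    # Part 1 (the hurdle): does the person visit a dentist at all?
    if rng.random() >= p_visit:
        return 0
    # Part 2: number of services among those who visit
    return zt_poisson(lam, rng)

rng = random.Random(42)
p_visit, lam = 0.6, 2.0            # hypothetical parameters
draws = [sample_utilization(p_visit, lam, rng) for _ in range(100000)]
# Hurdle decomposition: E[Y] = P(Y > 0) * E[Y | Y > 0]
expected = p_visit * lam / (1.0 - math.exp(-lam))
observed = sum(draws) / len(draws)
print(round(expected, 3), round(observed, 3))
```

In a fitted hurdle model the fixed `p_visit` and `lam` are replaced by logistic and truncated-count regressions on covariates; the decomposition E[Y] = P(Y > 0) · E[Y | Y > 0] is what lets the two parts be estimated and interpreted separately, as in the study's odds ratios for visiting versus for services received.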

  18. Evaluating components of dental care utilization among adults with diabetes and matched controls via hurdle models

    Directory of Open Access Journals (Sweden)

    Chaudhari Monica

    2012-07-01

    Full Text Available Abstract Background About one-third of adults with diabetes have severe oral complications. However, limited previous research has investigated dental care utilization associated with diabetes. This project had two purposes: to develop a methodology to estimate dental care utilization using claims data and to use this methodology to compare utilization of dental care between adults with and without diabetes. Methods Data included secondary enrollment and demographic data from Washington Dental Service (WDS) and Group Health Cooperative (GH), clinical data from GH, and dental-utilization data from WDS claims during 2002–2006. Dental and medical records from WDS and GH were linked for enrolees continuously and dually insured during the study. We employed hurdle models in a quasi-experimental setting to assess differences between adults with and without diabetes in 5-year cumulative utilization of dental services. Propensity score matching adjusted for differences in baseline covariates between the two groups. Results We found that adults with diabetes had lower odds of visiting a dentist (OR = 0.74, p < 0.001). Among those with a dental visit, diabetes patients had lower odds of receiving prophylaxes (OR = 0.77), fillings (OR = 0.80) and crowns (OR = 0.84) (p < 0.005 for all), and higher odds of receiving periodontal maintenance (OR = 1.24), non-surgical periodontal procedures (OR = 1.30), extractions (OR = 1.38) and removable prosthetics (OR = 1.36). Conclusions Patients with diabetes are less likely to use dental services. Those who do are less likely to use preventive care and more likely to receive periodontal care and tooth-extractions. Future research should address the possible effectiveness of additional prevention in reducing subsequent severe oral disease in patients with diabetes.

  19. Modeling and Optimizing Energy Utilization of Steel Production Process: A Hybrid Petri Net Approach

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2013-01-01

    Full Text Available The steel industry is responsible for nearly 9% of anthropogenic energy utilization in the world. It is urgent to reduce the total energy utilization of the steel industry under the huge pressure to reduce energy consumption and CO2 emissions. Meanwhile, steel manufacturing is a typical continuous-discrete process in which multiple procedures, objects, constraints, and machines are coupled, which makes energy management rather difficult. In order to study the energy flow within the real steel production process, this paper presents a new modeling and optimization method for the process based on Hybrid Petri Nets (HPN). Firstly, we introduce a detailed description of HPN. Then the real steel production process from one typical integrated steel plant is transformed into a Hybrid Petri Net model as a case study. Furthermore, we obtain a series of constraints for our optimization model from this model. In consideration of the real process situation, we select steel production, energy efficiency and the surplus of self-produced gas as the main optimization objectives. Afterwards, a fuzzy linear programming method is applied to obtain the multiobjective optimization results. Finally, some measures are suggested to improve the efficiency and reduce the overall cost of the process.

  20. The Aspergillus nidulans acuL gene encodes a mitochondrial carrier required for the utilization of carbon sources that are metabolized via the TCA cycle.

    Science.gov (United States)

    Flipphi, Michel; Oestreicher, Nathalie; Nicolas, Valérie; Guitton, Audrey; Vélot, Christian

    2014-07-01

    In Aspergillus nidulans, the utilization of acetate as sole carbon source requires several genes (acu). Most of them are also required for the utilization of fatty acids. This is the case for acuD and acuE, which encode the two glyoxylate cycle-specific enzymes, isocitrate lyase and malate synthase, respectively, but also for acuL that we have identified as AN7287, and characterized in this study. Deletion of acuL resulted in the same phenotype as the original acuL217 mutant. acuL encodes a 322-amino acid protein which displays all the structural features of a mitochondrial membrane carrier, and shares 60% identity with the Saccharomyces cerevisiae succinate/fumarate mitochondrial antiporter Sfc1p (also named Acr1p). Consistently, the AcuL protein was shown to localize in mitochondria, and partial cross-complementation was observed between the S. cerevisiae and A. nidulans homologues. Extensive phenotypic characterization suggested that the acuL gene is involved in the utilization of carbon sources that are catabolized via the TCA cycle, and therefore require gluconeogenesis. In addition, acuL proves to be co-regulated with acuD and acuE. Overall, our data suggest that AcuL could link the glyoxylate cycle to gluconeogenesis by exchanging cytoplasmic succinate for mitochondrial fumarate. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. On the Need/Utility of Procedural Flexibilization by the Judge in the Brazilian Procedural System: A Necessary Analysis

    OpenAIRE

    Bruna Rocha Passos

    2016-01-01

    This study addresses the phenomenon of procedural flexibility, one of the innovations provided for in the New Civil Procedure Code, introduced in Brazil by Law No. 13,105/2015. The study begins with the presentation of models of procedural systems and their characteristics. It then moves to the study of existing procedural flexibility models in other jurisdictions, with a special analysis of the main specificities of the flexible models adopted by England, the USA and Portugal. Then, it is the analys...

  2. Bayesian statistical models to estimate EQ-5D utility scores from EORTC QLQ data in myeloma.

    Science.gov (United States)

    Kharroubi, Samer A; Edlin, Richard; Meads, David; McCabe, Christopher

    2018-02-20

    It is well documented that the modelling of health-related quality of life data is difficult, as the distribution of such data is often strongly right/left skewed and includes a significant percentage of observations at one. The objective of this study is to develop a series of two-part models (TPMs) that deal with these issues. Data from the UK Medical Research Council Myeloma IX trial were used to examine the relationship between the European Organization for Research and Treatment of Cancer (EORTC) QLQ-C30/QLQ-MY20 scores and the European QoL-5 Dimensions (EQ-5D) utility score. Four different TPMs were developed. The models fitted included a TPM with normal regression, a TPM with normal regression with variance a function of participant characteristics, a TPM with log-transformed data, and a TPM with gamma regression and a log link. The cohort of 1839 patients was divided into a 75% derivation sample, to fit the different models, and a 25% validation sample to assess the predictive ability of these models by comparing predicted and observed mean EQ-5D scores in the validation set, unadjusted R2, and root mean square error. Predictive performance in the derivation dataset depended on the criterion used, with R2/adjusted R2 favouring the TPM with normal regression and mean predicted error favouring the TPM with gamma regression. The TPM with gamma regression performs best within the validation dataset under all criteria. TPM regression models provide flexible approaches to estimate mean EQ-5D utility weights from the EORTC QLQ-C30/QLQ-MY20 for use in economic evaluation. Copyright © 2018 John Wiley & Sons, Ltd.

  3. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

    The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs, with initial body weight (BW) of 14.77 ± 1.26 kg at approximately two months of age, were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for the estimation of empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22 and 2.60 Mcal/kg DM) and six repetitions. The requirement of metabolizable energy for maintenance was 78.53 kcal/kg EBW^0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the increase in BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. It is concluded that the values of km and kf are consistent with those observed in several studies with lambs raised in the tropics. The dietary requirements of TDN and MP of Santa Ines lambs for different BW and ADG are approximately 42% and 24% lower, respectively, than those suggested by the American system of evaluation of food and nutrient requirements of small ruminants. The SRNS model predicted the DMI of Santa Ines lambs adequately; however, for ADG, more studies are needed, since the model underestimated the response of the animals in this study.
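    The reported maintenance requirement lends itself to a one-line calculation. A minimal sketch assuming only the 78.53 kcal/kg EBW^0.75 coefficient quoted above; the animal's weight is hypothetical.

    ```python
    def me_maintenance_kcal_per_day(ebw_kg, coeff=78.53):
        """Daily metabolizable energy for maintenance, using the coefficient
        reported above: 78.53 kcal per kg of metabolic empty body weight
        (EBW^0.75)."""
        return coeff * ebw_kg ** 0.75

    # a lamb with 16 kg empty body weight (hypothetical animal): 16^0.75 = 8
    print(me_maintenance_kcal_per_day(16.0))  # ~628 kcal/day
    ```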

  4. Prioritization of EGFR/IGF-IR/VEGFR2 combination targeted therapies utilizing cancer models.

    Science.gov (United States)

    Tonra, James R; Corcoran, Erik; Deevi, Dhanvanthri S; Steiner, Philipp; Kearney, Jessica; Li, Huiling; Ludwig, Dale L; Zhu, Zhenping; Witte, Larry; Surguladze, David; Hicklin, Daniel J

    2009-06-01

    Rational strategies utilizing anticancer efficacy and biological principles are needed for the prioritization of specific combination targeted therapy approaches for clinical development, from among the many with experimental support. Antibodies targeting epidermal growth factor receptor (EGFR) (cetuximab), insulin-like growth factor-1 receptor (IGF-IR) (IMC-A12) or vascular endothelial growth factor receptor 2 (VEGFR2) (DC101), were dosed alone or in combination, in 11 human tumor xenograft models established in mice. Efficacy readouts included the tumor burden and incidence of metastasis, as well as tumor active hypoxia inducible factor-1 (HIF-1), human VEGF and blood vessel density. Cetuximab and DC101 contributed potent and non-overlapping benefits to the combination approach. Moreover, DC101 prevented escape from IMC-A12 + cetuximab in a colorectal cancer model and cetuximab prevented escape from DC101 therapy in a pancreatic cancer model. Targeting VEGFR2 + EGFR was prioritized over other treatment strategies utilizing EGFR, IGF-IR and VEGFR2 antibodies. The criteria that proved to be valuable were a non-overlapping spectrum of anticancer activity and the prevention of resistance to another therapy in the combination.

  5. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    Science.gov (United States)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, which is guided by ensembles of short simulations. Global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choices that reduce large model biases dramatically improves the turnaround time for tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former used to survey the likely responses and narrow the parameter space, and the latter used to verify the results in a climate context, along with assessment in greater detail once an educated set of parameter choices is selected. Limitations of using short-term simulations for tuning climate models are also discussed.

  6. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required...

  7. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model, incorporating a model of the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can result in significant performance and/or workload penalties.

  8. On the use of prior information in modelling metabolic utilization of energy in growing pigs

    DEFF Research Database (Denmark)

    Strathe, Anders Bjerring; Jørgensen, Henry; Fernández, José Adalberto

    2011-01-01

    Construction of models that provide a realistic representation of metabolic utilization of energy in growing animals tends to be over-parameterized, because data generated from individual metabolic studies are often sparse. In the Bayesian framework, prior information can enter the data analysis... curves, resulting from a metabolism study on growing pigs of high genetic potential. A total of 17 crossbred pigs of three genders (barrows, boars and gilts) were used. Pigs were fed four diets based on barley, wheat and soybean meal, supplemented with crystalline amino acids to meet Danish nutrient

  9. Transaction-based building controls framework, Volume 2: Platform descriptive model and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Carpenter, Brandon J. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Lutes, Robert G. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Hernandez, George [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2015-07-31

    Transaction-based Building Controls (TBC) offer a control systems platform that provides an agent execution environment that meets the growing requirements for security, resource utilization, and reliability. This report outlines the requirements for a platform to meet these needs and describes an illustrative/exemplary implementation.

  10. Computational model for microstructure and effective thermal conductivity of ash deposits in utility boilers

    Science.gov (United States)

    Kweon, Soon-Cheol

    The ash deposits formed in pulverized-coal-fired power plants reduce the heat transfer rate to furnace walls, superheater tubes, and other heat transfer surfaces. The thermal properties that strongly influence this heat transfer depend mainly on the microstructure of the ash deposit. This dissertation examines three issues associated with ash deposits in utility boilers: (1) a three-dimensional model for characterizing ash deposit microstructures from sample ash deposits, (2) a computational model for the effective thermal conductivity of sintered packed beds with low-conductivity stagnant fluids, and (3) the application of a thermal resistor network model to the effective thermal conductivity of ash deposits in utility boilers. SEM image analysis was conducted on two sample ash deposits to characterize the three-dimensional microstructure of the ash deposit with several structural parameters using stereology. A ballistic deposition model was adopted to simulate the deposit structure defined by the structural parameters. The inputs for the deposition model were chosen from predicted and measured physical parameters, such as the size distribution, the probability of particle rolling, and the degree of particle sintering. The difference between the microstructure of the sample deposits and the simulated deposits was investigated and compared quantitatively based on the structural parameters defined. Both the sample and the simulated deposits agree in terms of the structural parameters. A computational model for predicting the effective thermal conductivity of sintered packed beds with low-conductivity stagnant fluid was built; heat conduction through the contact area among sintered particles is the dominant mode of heat transfer. A thermal resistor network is used to model heat conduction among the sintered particles, and the thermal resistance among contacting particles is estimated from both the contact area and the contact
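    The resistor-network idea can be illustrated with the standard series/parallel composition rules for thermal resistance. This is a generic sketch, not the dissertation's actual model, and all resistance values are hypothetical.

    ```python
    def series(resistances):
        """Thermal resistances along a single conduction path add."""
        return sum(resistances)

    def parallel(resistances):
        """Independent parallel conduction paths combine through reciprocals."""
        return 1.0 / sum(1.0 / r for r in resistances)

    # two identical particle-contact-particle chains side by side
    # (all resistance values hypothetical, in K/W)
    chain = series([0.5, 2.0, 0.5])   # bulk + contact neck + bulk -> 3.0
    print(parallel([chain, chain]))   # -> 1.5
    ```

    In a full network model, the contact-neck resistance would be derived from the contact area between sintered particles, which the abstract identifies as the dominant conduction bottleneck.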

  11. Vitamin D Signaling in the Bovine Immune System: A Model for Understanding Human Vitamin D Requirements

    Directory of Open Access Journals (Sweden)

    Corwin D. Nelson

    2012-03-01

    The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates intracrine and paracrine vitamin D signaling mechanisms in the immune system that regulate innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D mediated suppression of infection. Epidemiological evidence indicates that circulating 25-hydroxyvitamin D concentrations above 32 ng/mL are necessary for optimal vitamin D signaling in the immune system, but experimental evidence for that value is lacking. Experiments in cattle can provide that evidence, as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.

  12. Case studies of community relations on DOE's Formerly Utilized Sites Remedial Action Program as models for Superfund sites

    International Nuclear Information System (INIS)

    Plant, S.W.; Adler, D.G.

    1995-01-01

    Ever since the US Department of Energy (DOE) created its Formerly Utilized Sites Remedial Action Program (FUSRAP) in 1974, there has been a community relations program. The community relations effort has grown as FUSRAP has grown. With 20 of 46 sites now cleaned up, considerable experience in working with FUSRAP stakeholders has been gained. Why not share that experience with others who labor on the Superfund sites? Many similarities exist between the Superfund sites and FUSRAP. FUSRAP is a large, multiple-site environmental restoration program. The challenges range from small sites requiring remedial actions measurable in weeks to major sites requiring the full remedial investigation/feasibility study process. The numerous Superfund sites throughout the United States offer the same diversity, both geographically and technically. But before DOE offers FUSRAP's community relations experience as a model, it needs to make clear that this will be a realistic model. As experiences are shared, DOE will certainly speak of the efforts that achieved its goals. But many of the problems that DOE encountered along the way will also be related. FUSRAP relies on a variety of one- and two-way communication techniques for involving stakeholders in the DOE decision-making process. Some of the techniques and experiences from the case studies are presented

  13. On the Path to SunShot. Utility Regulatory and Business Model Reforms for Addressing the Financial Impacts of Distributed Solar on Utilities

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Miller, John [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sigrin, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States); Reiter, Emerson [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cory, Karlynn [National Renewable Energy Lab. (NREL), Golden, CO (United States); McLaren, Joyce [National Renewable Energy Lab. (NREL), Golden, CO (United States); Seel, Joachim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Darghouth, Naim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    Net-energy metering (NEM) has helped drive the rapid growth of distributed PV (DPV) but has raised concerns about electricity cost shifts, utility financial losses, and inefficient resource allocation. These concerns have motivated real and proposed reforms to utility regulatory and business models. This report explores the challenges and opportunities associated with such reforms in the context of the U.S. Department of Energy's SunShot Initiative. Most of the reforms to date address NEM concerns by reducing the benefits provided to DPV customers and thus constraining DPV deployment. Eliminating NEM nationwide, by compensating exports of PV electricity at wholesale rather than retail rates, could cut cumulative DPV deployment by 20% in 2050 compared with a continuation of current policies. This would slow the PV cost reductions that arise from larger scale and market certainty. It could also thwart achievement of the SunShot deployment goals even if the initiative's cost targets are achieved. This undesirable prospect is stimulating the development of alternative reform strategies that address concerns about distributed PV compensation without inordinately harming PV economics and growth. These alternatives fall into the categories of facilitating higher-value DPV deployment, broadening customer access to solar, and aligning utility profits and earnings with DPV. Specific strategies include utility ownership and financing of DPV, community solar, distribution network operators, services-driven utilities, performance-based incentives, enhanced utility system planning, pricing structures that incentivize high-value DPV configurations, and decoupling and other ratemaking reforms that reduce regulatory lag. These approaches represent near- and long-term solutions for preserving the legacy of the SunShot Initiative.

  14. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  15. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  16. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    ... Requirements Determination process. To help clarify the process, Phase II has guiding principles and core assumptions that direct the phase. Three of the four guiding principles are ... the analyst is determining for the first time what manpower is required. The second notable guiding principle is that "MRD analysts shall identify and ...

  17. Analytic model comparing the cost utility of TVT versus duloxetine in women with urinary stress incontinence.

    Science.gov (United States)

    Jacklin, Paul; Duckett, Jonathan; Renganathan, Arasee

    2010-08-01

    The purpose of this study was to assess the cost utility of duloxetine versus tension-free vaginal tape (TVT) as a second-line treatment for urinary stress incontinence. A Markov model was used to compare cost utility over a 2-year follow-up period. Quality-adjusted life year (QALY) estimation was performed by assuming a disutility rate of 0.05. Under base-case assumptions, although duloxetine was the cheaper option, TVT gave a considerably higher QALY gain. When a longer follow-up period was considered, TVT had an incremental cost-effectiveness ratio (ICER) of £7,710 ($12,651) at 10 years. If the QALY gain from cure was 0.09, the ICERs for duloxetine and TVT would both fall within the indicative National Institute for Health and Clinical Excellence willingness-to-pay threshold at 2 years, but TVT would be the cost-effective option, having extended dominance over duloxetine. This model suggests that TVT is a cost-effective treatment for stress incontinence.
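    The ICER reported above is the standard incremental ratio of extra cost to extra QALYs. A minimal sketch with hypothetical costs and QALYs, not the study's inputs:

    ```python
    def icer(cost_new, cost_comparator, qaly_new, qaly_comparator):
        """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
        return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

    # illustrative 10-year totals (hypothetical numbers, not from the study)
    print(icer(cost_new=9000.0, cost_comparator=5000.0,
               qaly_new=7.0, qaly_comparator=6.5))  # -> 8000.0
    ```

    A treatment is then judged cost-effective when this ratio falls below the decision maker's willingness-to-pay threshold per QALY.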

  18. Modeling and design of light powered biomimicry micropump utilizing transporter proteins

    Science.gov (United States)

    Liu, Jin; Sze, Tsun-Kay Jackie; Dutta, Prashanta

    2014-11-01

    The creation of compact micropumps that provide steady flow has been an ongoing challenge in the field of microfluidics. We present a mathematical model for a micropump utilizing bacteriorhodopsin and sugar transporter proteins. This micropump uses transporter proteins to drive fluid flow by converting light energy into chemical potential. The fluid flow through a microchannel is simulated using the Nernst-Planck, Navier-Stokes, and continuity equations. Numerical results show that the micropump is capable of generating usable pressure. Design parameters influencing the performance of the micropump are investigated, including membrane fraction, lipid proton permeability, illumination, and channel height. The results show that there is a substantial membrane-fraction region in which fluid flow is maximized. The use of lipids with low membrane proton permeability allows illumination to be used as a method to turn the pump on and off. This capability allows the micropump to be activated and shut off remotely without bulky support equipment. This modeling work provides new insights on mechanisms potentially useful for fluidic pumping in self-sustained biomimetic microfluidic pumps. This work is supported in part by the National Science Foundation Grant CBET-1250107.

  19. Utilization of building information modeling in infrastructure’s design and construction

    Science.gov (United States)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction and maintenance of buildings in the Czech Republic in recent years. This paper describes the usage, applications, and potential benefits and disadvantages connected with the implementation of BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, and there is a review of several virtual design and construction practices in the Czech Republic. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experiences with new technologies gained from the application of BIM-related workflows. The focus is on BIM model utilization for machine control systems on site, quality assurance, quality management and construction management.

  20. Grid-connection of large offshore windfarms utilizing VSC-HVDC: Modeling and grid impact

    DEFF Research Database (Denmark)

    Xue, Yijing; Akhmatov, Vladislav

    2009-01-01

    Utilization of Voltage Source Converter (VSC) – High Voltage Direct Current (HVDC) systems for grid-connection of large offshore windfarms becomes relevant as installed power capacities as well as distances to the connection points of the on-land transmission systems increase. At the same time...... for grid-connection of large offshore windfarms. The VSC-HVDC model is implemented using a general approach of independent control of active and reactive power in normal operation situations. The on-land VSC inverter, which is also called a grid-side inverter, provides voltage support to the transmission...... system and comprises a LVFRT solution in short-circuit faults. The presented model, LVFRT solution and impact on the system stability are investigated as a case study of a 1,000 MW offshore windfarm grid-connected through four parallel VSC-HVDC systems each with a 280 MVA power rating. The investigation...

  1. A new model for evaluating maintenance energy requirements in dogs: allometric equation from 319 pet dogs.

    Science.gov (United States)

    Divol, Guilhem; Priymenko, Nathalie

    2017-01-01

    Reports concerning maintenance energy requirements (MER) in dogs are common, but most of the data cover laboratory or utility dogs. This study establishes the MER of healthy adult pet dogs and the factors that cause these energy requirements to vary. Within the framework of a nutrition teaching exercise, each student followed a pet from his or her entourage and gathered accurate records of its feeding habits. Data were restricted to healthy adult dogs with an ideal body weight (BW) which did not vary by more than 5% during the study period. A total of 319 eligible records were analysed using multiple linear regression. Variation factors such as ownership, breed, sex and neutered status, bedding location, temperament and feeding habits were then analysed individually using a non-parametric model. Two models result from this study: one excluding age (r² = 0.813) and a more accurate one which takes the age in years into consideration (r² = 0.816). The second model was assessed with the main variation factors and shows that MER (kcal/d) = k1 × k2 × k3 × k4 × k5 × 128 × BW^0.740 × age^-0.050 (r² = 0.836), with k1 the effect of breed, k2 the effect of sex and neutered status, k3 the effect of bedding location, k4 the effect of temperament and k5 the effect of the type of feed. The resulting model is very similar to the recommendations made by the National Research Council (2006), but greater accuracy was obtained by raising age to a negative power, as demonstrated in human nutrition.
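    Assuming the published allometric equation with all correction factors set to neutral (k1 through k5 equal to 1), the model can be evaluated directly; the example dog below is hypothetical.

    ```python
    def mer_kcal_per_day(bw_kg, age_years, k1=1.0, k2=1.0, k3=1.0, k4=1.0, k5=1.0):
        """MER = k1*k2*k3*k4*k5 * 128 * BW^0.740 * age^-0.050 (kcal/day), with the
        k factors defaulting to 1 (no breed/sex/bedding/temperament/feed effect)."""
        return k1 * k2 * k3 * k4 * k5 * 128.0 * bw_kg ** 0.740 * age_years ** -0.050

    # a hypothetical 20 kg, 5-year-old dog with all correction factors neutral
    print(round(mer_kcal_per_day(20.0, 5.0)))  # roughly 1080 kcal/day
    ```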

  2. Effects of atmospheric variability on energy utilization and conservation. [Space heating energy demand modeling; Program HEATLOAD

    Energy Technology Data Exchange (ETDEWEB)

    Reiter, E.R.; Johnson, G.R.; Somervell, W.L. Jr.; Sparling, E.W.; Dreiseitly, E.; Macdonald, B.C.; McGuirk, J.P.; Starr, A.M.

    1976-11-01

    Research conducted between 1 July 1975 and 31 October 1976 is reported. A "physical-adaptive" model of the space-conditioning demand for energy and its response to changes in weather regimes was developed. This model includes parameters pertaining to engineering factors of building construction, to weather-related factors, and to socio-economic factors. Preliminary testing of several components of the model on the city of Greeley, Colorado, yielded most encouraging results. Other components, especially those pertaining to socio-economic factors, are still under development. Expansion of model applications to different types of structures and larger regions is presently underway. A CRT-display model for energy demand within the conterminous United States has also passed preliminary tests. A major effort was expended to obtain disaggregated data on energy use from utility companies throughout the United States. The study of atmospheric variability revealed that the 22- to 26-day vacillation in the potential and kinetic energy modes of the Northern Hemisphere is related to the behavior of the planetary long-waves, and that the midwinter dip in zonal available potential energy is reflected in the development of blocking highs. Attempts to classify weather patterns over the eastern and central United States have proceeded satisfactorily to the point where testing of our method for longer time periods appears desirable.

  3. Improved utilization of ADAS-cog assessment data through item response theory based pharmacometric modeling.

    Science.gov (United States)

    Ueckert, Sebastian; Plan, Elodie L; Ito, Kaori; Karlsson, Mats O; Corrigan, Brian; Hooker, Andrew C

    2014-08-01

    This work investigates improved utilization of ADAS-cog data (the primary outcome in trials of mild and moderate Alzheimer's disease (AD)) by combining pharmacometric modeling and item response theory (IRT). A baseline IRT model characterizing the ADAS-cog was built based on data from 2,744 individuals. Pharmacometric methods were used to extend the baseline IRT model to describe longitudinal ADAS-cog scores from an 18-month clinical study with 322 patients. The sensitivity of the ADAS-cog items in different patient populations, as well as the power to detect a drug effect relative to total-score-based methods, were assessed with the IRT-based model. IRT analysis was able to describe both total and item-level baseline ADAS-cog data. Longitudinal data were also well described. Differences in the information content of the item-level components could be quantitatively characterized and ranked for the mild cognitive impairment and mild AD populations. Based on clinical trial simulations with a theoretical drug effect, the IRT method demonstrated significantly higher power to detect a drug effect compared to the traditional method of analysis. A combined framework of IRT and pharmacometric modeling permits a more effective and precise analysis than total-score-based methods and therefore increases the value of ADAS-cog data.
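    As a generic building block of IRT (not the paper's full longitudinal model), a two-parameter logistic item characteristic curve maps latent severity to the probability of endorsing an item; all parameter values below are illustrative.

    ```python
    import math

    def irt_2pl(theta, discrimination, difficulty):
        """Two-parameter logistic item response: probability of endorsing an
        item given latent trait level theta."""
        return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

    # at theta equal to the item's difficulty the probability is exactly 0.5
    print(irt_2pl(theta=0.0, discrimination=1.5, difficulty=0.0))  # -> 0.5
    ```

    The discrimination parameter plays the role of the item-level "information content" ranked in the abstract: steeper items separate nearby severity levels more sharply.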

  4. Brain in flames – animal models of psychosis: utility and limitations

    Directory of Open Access Journals (Sweden)

    Mattei D

    2015-05-01

    Daniele Mattei,¹ Regina Schweibold,¹﹐² Susanne A Wolf¹ (¹Department of Cellular Neuroscience, Max-Delbrueck-Center for Molecular Medicine, Berlin, Germany; ²Department of Neurosurgery, Helios Clinics, Berlin, Germany). Abstract: The neurodevelopmental hypothesis of schizophrenia posits that schizophrenia is a psychopathological condition resulting from aberrations in neurodevelopmental processes caused by a combination of environmental and genetic factors, which proceed long before the onset of clinical symptoms. Many studies discuss an immunological component in the onset and progression of schizophrenia. Here we review studies utilizing animal models of schizophrenia with manipulations of genetic, pharmacologic, and immunological origin. We focus on the immunological component to bridge the studies in terms of evaluation and treatment options for negative, positive, and cognitive symptoms. Throughout the review we link certain aspects of each model to the situation in human schizophrenic patients. In conclusion, we suggest a combination of existing models to better represent the human situation. Moreover, we emphasize that animal models represent defined single or multiple symptoms or hallmarks of a given disease. Keywords: inflammation, schizophrenia, microglia, animal models

  5. On the utility of land surface models for agricultural drought monitoring

    Directory of Open Access Journals (Sweden)

    W. T. Crow

    2012-09-01

    The lagged rank cross-correlation between model-derived root-zone soil moisture estimates and remotely sensed vegetation indices (VI) is examined between January 2000 and December 2010 to quantify the skill of various soil moisture models for agricultural drought monitoring. Examined modeling strategies range from a simple antecedent precipitation index to the application of modern land surface models (LSMs) based on complex water and energy balance formulations. A quasi-global evaluation of lagged VI/soil moisture cross-correlation suggests that, when globally averaged across the entire annual cycle, soil moisture estimates obtained from complex LSMs provide little added skill (<5% in relative terms) in anticipating variations in vegetation condition relative to a simplified water accounting procedure based solely on observed precipitation. However, larger amounts of added skill (5–15% in relative terms) can be identified when focusing exclusively on the extra-tropical growing season and/or utilizing soil moisture values acquired by averaging across a multi-model ensemble.
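    The lagged rank cross-correlation used as the skill metric above can be sketched as follows; this is a simplified illustration on synthetic series, not the study's actual processing chain, and it omits tie handling.

    ```python
    import numpy as np

    def _rank(x):
        # rank transform via double argsort (no tie handling; ties are rare
        # in continuous soil moisture / vegetation index series)
        return np.argsort(np.argsort(x)).astype(float)

    def lagged_rank_xcorr(soil_moisture, vi, lag):
        """Rank correlation between soil moisture at time t and VI at time t + lag."""
        sm = np.asarray(soil_moisture, dtype=float)
        v = np.asarray(vi, dtype=float)
        n = len(sm) - lag
        a, b = _rank(sm[:n]), _rank(v[lag:])
        a -= a.mean()
        b -= b.mean()
        return float(a @ b / np.sqrt((a @ a) * (b @ b)))

    # a VI series that tracks soil moisture with a 3-step delay correlates
    # perfectly at lag 3 (synthetic data)
    sm = np.arange(12.0)
    vi = np.concatenate([np.zeros(3), np.arange(9.0)])
    print(lagged_rank_xcorr(sm, vi, lag=3))  # -> 1.0
    ```

    In the study's setting, a soil moisture product whose lagged correlation with subsequent VI is higher is judged more skillful at anticipating vegetation stress.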

  6. Computational model of precision grip in Parkinson’s disease: A Utility based approach

    Directory of Open Access Journals (Sweden)

    Ankur eGupta

    2013-12-01

    Full Text Available We propose a computational model of Precision Grip (PG) performance in normal subjects and Parkinson’s Disease (PD) patients. Prior studies on grip force generation in PD patients show an increase in grip force during ON medication and an increase in the variability of the grip force during OFF medication (Fellows et al., 1998; Ingvarsson et al., 1997). Changes in grip force generation under dopamine-deficient PD conditions strongly suggest a contribution of the Basal Ganglia, a deep brain system with a crucial role in translating dopamine signals into decision making. The present approach is to treat the problem of modeling grip force generation as a problem of action selection, which is one of the key functions of the Basal Ganglia. The model consists of two components: 1) the sensory-motor loop component, and 2) the Basal Ganglia component. The sensory-motor loop component converts a reference position and a reference grip force into lift force and grip force profiles, respectively. These two forces cooperate in grip-lifting a load. The sensory-motor loop component also includes a plant model that represents the interaction between the two fingers involved in PG and the object to be lifted. The Basal Ganglia component is modeled using Reinforcement Learning, with the significant difference that action selection is performed using a utility distribution instead of a purely value-based distribution, thereby incorporating risk-based decision making. The proposed model accurately accounts for the precision grip results from normal subjects and PD patients (Fellows et al., 1998; Ingvarsson et al., 1997). To our knowledge, this is the first model of precision grip under PD conditions.
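
    The utility-based selection idea — penalizing expected payoff by outcome risk before a softmax choice — can be sketched as follows. The penalty form, gains, and all numbers are illustrative assumptions, not parameters from the model:

```python
import numpy as np

def risk_sensitive_utility(mean_value, value_variance, alpha=0.5):
    """Utility = expected value minus a risk penalty on outcome
    variability (a common risk-sensitive formulation)."""
    return mean_value - alpha * np.sqrt(value_variance)

def select_action(utilities, beta=5.0, rng=None):
    """Softmax (Boltzmann) action selection over utilities."""
    if rng is None:
        rng = np.random.default_rng()
    p = np.exp(beta * (utilities - utilities.max()))
    p /= p.sum()
    return rng.choice(len(utilities), p=p)

q = np.array([0.8, 1.0])      # expected payoffs of two grip-force levels
var = np.array([0.01, 0.50])  # outcome variability of each level
u = risk_sensitive_utility(q, var)
# With the risk penalty, the lower-variance action wins despite its
# lower mean payoff — value-based selection alone would pick action 1.
```

    This is the qualitative difference between a utility distribution and a purely value-based one: variability of the outcome, not just its mean, shapes the choice.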

  7. A combined model to assess technical and economic consequences of changing conditions and management options for wastewater utilities.

    Science.gov (United States)

    Giessler, Mathias; Tränckner, Jens

    2018-02-01

    The paper presents a simplified model that quantifies the economic and technical consequences of changing conditions in wastewater systems at utility level. It has been developed based on data from stakeholders and ministries, collected by a survey that determined resulting effects and adopted measures. The model comprises all substantial cost-relevant assets and activities of a typical German wastewater utility. It consists of three modules: i) Sewer, for describing the state development of sewer systems; ii) WWTP, for process parameter consideration of wastewater treatment plants (WWTP); and iii) Cost Accounting, for calculation of expenses in the cost categories and resulting charges. The validity and accuracy of this model were verified using historical data from an exemplary wastewater utility. Calculated process and economic parameters show high accuracy compared to measured parameters and actual expenses. Thus, the model is proposed to support strategic, process-oriented decision making at utility level. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  9. Modelling Imperfect Product Line Requirements with Fuzzy Feature Diagrams

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Weston, Nathan; Rashid, Awais

    In this article, we identify that partial, vague and conflicting information can severely limit the effectiveness of approaches that derive feature trees from textual requirement specifications. We examine the impact such imperfect information has on feature tree extraction and we propose the use of

  10. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) modeling is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses a Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge; in this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created, in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov Chain Monte Carlo sampling algorithm is adopted, drawing samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including data from a yeast cell cycle study and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge, BN modeling is not always better than random selection, demonstrating the necessity in network modeling of supplementing the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves performance.
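
    The candidate edge reservoir idea — copy numbers proportional to prior-derived likelihoods, so that MCMC proposals favor well-supported edges — can be sketched as below. The gene names and likelihood values are invented for illustration:

```python
import random
from collections import Counter

# Hypothetical prior-derived likelihoods of functional linkage per edge.
edge_likelihood = {("A", "B"): 0.9, ("A", "C"): 0.3, ("B", "C"): 0.1}

# Candidate-edge reservoir: each edge's copy number is proportional to
# its estimated likelihood (at least one copy so no edge is excluded).
reservoir = []
for edge, lik in edge_likelihood.items():
    reservoir.extend([edge] * max(1, round(lik * 10)))

# Uniform draws from the reservoir yield likelihood-weighted proposals,
# so high-likelihood edges are proposed more often during MCMC moves.
random.seed(0)
draws = Counter(random.choice(reservoir) for _ in range(10_000))
```

    Uniform sampling from a duplicated pool is a simple way to approximate weighted sampling; a production implementation could instead sample with explicit weights.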

  11. From requirement document to formal modelling and decomposition of control systems

    OpenAIRE

    Yeganefard, Sanaz

    2014-01-01

    Formal modelling of control systems can help with identifying missing requirements and design flaws before implementing them. However, modelling using formal languages can be challenging and time consuming. Therefore intermediate steps may be required to simplify the transition from informal requirements to a formal model. In this work we firstly provide a four-stage approach for structuring and formalising requirements of a control system. This approach is based on monitored, controlled, mode...

  12. Three-dimensional cell culture model utilization in cancer stem cell research.

    Science.gov (United States)

    Bielecka, Zofia F; Maliszewska-Olejniczak, Kamila; Safir, Ilan J; Szczylik, Cezary; Czarnecka, Anna M

    2017-08-01

    Three-dimensional (3D) cell culture models are becoming increasingly popular in contemporary cancer research and drug resistance studies. Recently, scientists have begun incorporating cancer stem cells (CSCs) into 3D models and modifying culture components in order to mimic in vivo conditions better. Currently, the global cell culture market is primarily focused on either 3D cancer cell cultures or stem cell cultures, with less focus on CSCs. This is evident in the low product availability officially indicated for 3D CSC model research. This review discusses the currently available commercial products for CSC 3D culture model research. Additionally, we discuss different culture media and components that result in higher levels of stem cell subpopulations while better recreating the tumor microenvironment. In summary, although progress has been made applying 3D technology to CSC research, this technology could be further utilized and a greater number of 3D kits dedicated specifically to CSCs should be implemented. © 2016 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.

  13. Model of sustainable utilization of organic solids waste in Cundinamarca, Colombia

    Directory of Open Access Journals (Sweden)

    Solanyi Castañeda Torres

    2017-05-01

    Full Text Available Introduction: This article proposes a model for the utilization of organic solid waste in the department of Cundinamarca, responding to the need for a tool to support decision-making in the planning and management of organic solid waste. Objective: To develop an approximation of a conceptual, technical, and mathematical optimization model to support decision-making in order to minimize environmental impacts. Materials and methods: A descriptive study was applied, since some fundamental characteristics of the homogeneous phenomenon under study are presented; the design is also considered quasi-experimental. The calculation of the model for plants in the department is based on three axes (environmental, economic, and social) that are present in the general optimization equation. Results: A model is obtained for harnessing organic solid waste through the biological treatment techniques of aerobic composting and vermiculture, optimizing the system for the savings in greenhouse-gas emissions released into the atmosphere and for the reduction in the overall cost of final disposal of organic solid waste in sanitary landfills. Based on the economic principle of utility, which determines the environmental feasibility and sustainability of the department's organic solid waste utilization plants, organic fertilizers such as compost and humus capture the carbon and nitrogen that reduce the tons of CO2.

  14. Neuro-fuzzy inverse model control structure of robotic manipulators utilized for physiotherapy applications

    Directory of Open Access Journals (Sweden)

    A.A. Fahmy

    2013-12-01

    Full Text Available This paper presents a new neuro-fuzzy controller for robot manipulators. First, an inductive learning technique is applied to generate the required inverse modeling rules from input/output data recorded in the off-line structure learning phase. Second, a fully differentiable fuzzy neural network is developed to construct the inverse dynamics part of the controller for the online parameter learning phase. Finally, a fuzzy-PID-like incremental controller is employed as the feedback servo controller. The proposed control system was tested using the dynamic model of a six-axis industrial robot. The control system showed good results compared to a conventional PID individual-joint controller.
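
    A minimal sketch of the incremental (velocity-form) PID idea behind such a feedback servo loop is shown below; the fuzzy adaptation of the gains is omitted, and the gains and plant model are illustrative assumptions, not from the paper:

```python
class IncrementalPID:
    """Velocity-form PID: outputs a control increment rather than an
    absolute command, so the integral accumulates in the actuator state."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = self.e2 = 0.0  # previous two tracking errors

    def step(self, error):
        du = (self.kp * (error - self.e1)        # proportional difference
              + self.ki * error                   # integral contribution
              + self.kd * (error - 2 * self.e1 + self.e2))  # derivative
        self.e2, self.e1 = self.e1, error
        return du

# Drive a simple first-order plant toward a setpoint of 1.0.
pid, y, u = IncrementalPID(0.5, 0.2, 0.05), 0.0, 0.0
for _ in range(80):
    u += pid.step(1.0 - y)   # accumulate the control increment
    y += 0.2 * (u - y)       # assumed first-order plant response
```

    In the paper's scheme, a fuzzy rule base would modulate the three gains online; the increment structure itself is what makes the controller bumpless when gains change.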

  15. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio...... ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level...... of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design....

  16. Context analysis for a new regulatory model for electric utilities in Brazil

    International Nuclear Information System (INIS)

    El Hage, Fabio S.; Rufín, Carlos

    2016-01-01

    This article examines what would have to change in the Brazilian regulatory framework in order to make utilities profit from energy efficiency and the integration of resources, instead of doing so from traditional consumption growth, as it happens at present. We argue that the Brazilian integrated electric sector resembles a common-pool resources problem, and as such it should incorporate, in addition to the centralized operation for power dispatch already in place, demand side management, behavioral strategies, and smart grids, attained through a new business and regulatory model for utilities. The paper proposes several measures to attain a more sustainable and productive electricity distribution industry: decoupling revenues from volumetric sales through a fixed maximum load fee, which would completely offset current disincentives for energy efficiency; the creation of a market for negawatts (saved megawatts) using the current Brazilian mechanism of public auctions for the acquisition of wholesale energy; and the integration of technologies, especially through the growth of unregulated products and services. Through these measures, we believe that Brazil could improve both energy security and overall sustainability of its power sector in the long run. - Highlights: • Necessary changes in the Brazilian regulatory framework towards energy efficiency. • How to incorporate demand side management, behavioral strategies, and smart grids. • Proposition of a market for negawatts at public auctions. • Measures to attain a more sustainable electricity distribution industry in Brazil.

  17. Utilizing evolutionary information and gene expression data for estimating gene networks with bayesian network models.

    Science.gov (United States)

    Tamada, Yoshinori; Bannai, Hideo; Imoto, Seiya; Katayama, Toshiaki; Kanehisa, Minoru; Miyano, Satoru

    2005-12-01

    Since microarray gene expression data do not contain sufficient information for estimating accurate gene networks, other biological information has been considered to improve the estimated networks. Recent studies have revealed that highly conserved proteins that exhibit similar expression patterns in different organisms, have almost the same function in each organism. Such conserved proteins are also known to play similar roles in terms of the regulation of genes. Therefore, this evolutionary information can be used to refine regulatory relationships among genes, which are estimated from gene expression data. We propose a statistical method for estimating gene networks from gene expression data by utilizing evolutionarily conserved relationships between genes. Our method simultaneously estimates two gene networks of two distinct organisms, with a Bayesian network model utilizing the evolutionary information so that gene expression data of one organism helps to estimate the gene network of the other. We show the effectiveness of the method through the analysis on Saccharomyces cerevisiae and Homo sapiens cell cycle gene expression data. Our method was successful in estimating gene networks that capture many known relationships as well as several unknown relationships which are likely to be novel. Supplementary information is available at http://bonsai.ims.u-tokyo.ac.jp/~tamada/bayesnet/.

  18. Explaining regional variations in health care utilization between Swiss cantons using panel econometric models.

    Science.gov (United States)

    Camenzind, Paul A

    2012-03-13

    Despite a detailed, nationwide legislative framework, there are large cantonal disparities in the consumed quantities of health care services in Switzerland. In this study, the most important factors of influence causing these regional disparities are determined. The findings can also be instructive for discussions of containing health care consumption in other countries. Based on the literature, relevant factors that cause geographic disparities in quantities and costs in western health care systems are identified. Using a selected set of these factors, individual panel econometric models are calculated to explain the variation in utilization for each of the six largest health care service groups (general practitioners, specialist doctors, hospital inpatient, hospital outpatient, medication, and nursing homes) in Swiss mandatory health insurance (MHI). The main data source is 'Datenpool santésuisse', a database of Swiss health insurers. For all six health care service groups, significant factors influencing the utilization frequency over time and across cantons are found. A greater supply of service providers tends to be strongly associated with per capita consumption of MHI services. On the demand side, older populations and higher population densities are the clearest driving factors. Strategies to contain consumption and costs in health care should include several elements. In the federalist Swiss system, the structure of regional health care supply seems to generate significant effects. However, the extent of driving factors on the demand side (e.g., social deprivation) or financing instruments (e.g., high deductibles) should also be considered.
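
    A panel model with canton fixed effects can be sketched via the within (demeaning) transformation. The data below are simulated under assumed parameters, purely to illustrate the estimator, not to reproduce the study's results:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cantons, n_years = 26, 10

# Simulated panel: utilization driven by provider density plus an
# unobserved, time-invariant canton effect (all parameters assumed).
canton_effect = rng.normal(0.0, 2.0, n_cantons)
density = rng.normal(5.0, 1.0, (n_cantons, n_years))
beta_true = 0.8
utilization = (canton_effect[:, None] + beta_true * density
               + rng.normal(0.0, 0.1, (n_cantons, n_years)))

# Within transformation: demean each canton's series, which sweeps out
# the fixed effect and leaves only within-canton variation over time.
x = density - density.mean(axis=1, keepdims=True)
y = utilization - utilization.mean(axis=1, keepdims=True)
beta_hat = (x * y).sum() / (x ** 2).sum()  # pooled OLS on demeaned data
```

    The estimated slope recovers the supply effect even though the canton-level heterogeneity is never observed directly.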

  19. Safety Culture: A Requirement for New Business Models — Lessons Learned from Other High Risk Industries

    International Nuclear Information System (INIS)

    Kecklund, L.

    2016-01-01

    Low-cost subcontractors can turn out to be much more expensive due to interface proliferation. Other negative effects are social dumping by external contractors and loss of competence if procurement requirements do not take quality and safety issues into account. Based on MTO Safety's extensive experience in the nuclear domain and work on safety management and safety culture in the aviation, railway, and maritime domains, the paper presents lessons learned that are applicable to the nuclear industry in facing the major challenges ahead. Assuring safety is a fundamental requirement for obtaining a licence to operate a business in nuclear power, aviation, and railways; safety culture is thus an essential requirement for a successful business, and must be part of any new business model in high risk industries. In the future, safety culture, leadership commitment, and skills in creating safety culture will be even more important. The paper discusses how companies and public utilities are to achieve this and how regulators are to assess it, with learning across industries as a key success factor. (author)

  20. Glucose is required to maintain high ATP-levels for the energy utilizing steps during PDT-induced apoptosis

    International Nuclear Information System (INIS)

    Oberdanner, C.; Plaetzer, K.; Kiesslich, T.; Krammer, B.

    2003-01-01

    Full text: Photodynamic therapy (PDT) may trigger apoptosis or necrosis in cancer cells. Several steps in the induction and execution of apoptosis require high amounts of adenosine-5'-triphosphate (ATP). Since the mitochondrial membrane potential (ΔΨ) decreases early in apoptosis, we raised the question of how a sufficiently high ATP level is maintained. We therefore monitored ΔΨ and the intracellular ATP level of apoptotic human epidermoid carcinoma cells (A431) after photodynamic treatment with aluminium (III) phthalocyanine tetrasulfonate chloride. A maximum of caspase-3 activation and nuclear fragmentation was found at fluences of about 4 J·cm⁻². Under these conditions apoptotic cells reduced ΔΨ rapidly, while the ATP level remained high for 4 to 6 hours after treatment in cells supplied with glucose. To analyze the contribution of glycolysis to the energy supply during apoptosis, experiments were carried out with cells deprived of glucose. These cells showed a rapid drop in ATP content, and neither caspase activation nor nuclear fragmentation could be detected. We conclude that the use of glucose as a source of ATP is obligatory for the execution of PDT-induced apoptosis. (author)

  1. 4M Overturned Pyramid (MOP) Model Utilization: Case Studies on Collision in Indonesian and Japanese Maritime Traffic Systems (MTS)

    OpenAIRE

    Wanginingastuti Mutmainnah; Masao Furusho

    2016-01-01

    The 4M Overturned Pyramid (MOP) model is a new model, proposed by the authors, to characterize MTS; it adopts an epidemiological model that determines the causes of accidents, including not only active failures but also latent failures and barriers. The model is still being developed. One utilization of the MOP model is characterizing accidents in MTS, i.e., the collisions in Indonesia and Japan described in this paper. The aim of this paper is to show the characteristics of ship collision accidents...

  2. Modeling menopause: The utility of rodents in translational behavioral endocrinology research.

    Science.gov (United States)

    Koebele, Stephanie V; Bimonte-Nelson, Heather A

    2016-05-01

    The human menopause transition and aging are each associated with an increase in a variety of health risk factors including, but not limited to, cardiovascular disease, osteoporosis, cancer, diabetes, stroke, sexual dysfunction, affective disorders, sleep disturbances, and cognitive decline. It is challenging to systematically evaluate the biological underpinnings associated with the menopause transition in the human population. For this reason, rodent models have been invaluable tools for studying the impact of gonadal hormone fluctuations and eventual decline on a variety of body systems. While it is essential to keep in mind that some of the mechanisms associated with aging and the transition into a reproductively senescent state can differ when translating from one species to another, animal models provide researchers with opportunities to gain a fundamental understanding of the key elements underlying reproduction and aging processes, paving the way to explore novel pathways for intervention associated with known health risks. Here, we discuss the utility of several rodent models used in the laboratory for translational menopause research, examining the benefits and drawbacks in helping us to better understand aging and the menopause transition in women. The rodent models discussed are ovary-intact, ovariectomy, and 4-vinylcylohexene diepoxide for the menopause transition. We then describe how these models may be implemented in the laboratory, particularly in the context of cognition. Ultimately, we aim to use these animal models to elucidate novel perspectives and interventions for maintaining a high quality of life in women, and to potentially prevent or postpone the onset of negative health consequences associated with these significant life changes during aging. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Modeling menopause: The utility of rodents in translational behavioral endocrinology research

    Science.gov (United States)

    Koebele, Stephanie V.; Bimonte-Nelson, Heather A.

    2016-01-01

    The human menopause transition and aging are each associated with an increase in a variety of health risk factors including, but not limited to, cardiovascular disease, osteoporosis, cancer, diabetes, stroke, sexual dysfunction, affective disorders, sleep disturbances, and cognitive decline. It is often challenging to systematically evaluate the biological underpinnings associated with the menopause transition in the human population. For this reason, rodent models have been invaluable tools for studying the impact of gonadal hormone fluctuations and eventual decline on a variety of body systems. While it is essential to keep in mind that some of the mechanisms associated with aging and the transition into a reproductively senescent state can differ when translating from one species to another, animal models provide researchers with opportunities to gain a fundamental understanding of the key elements underlying reproduction and aging processes, paving the way to explore novel pathways for intervention associated with known health risks. Here, we discuss the utility of several rodent models used in the laboratory for translational menopause research, examining the benefits and drawbacks in helping us to better understand aging and the menopause transition in women. The rodent models discussed are ovary-intact, ovariectomy, and 4-vinylcylohexene diepoxide for the menopause transition. We then describe how these models may be implemented in the laboratory, particularly in the context of cognition. Ultimately, we aim to use these animal models to elucidate novel perspectives and interventions for maintaining a high quality of life in women, and to potentially prevent or postpone the onset of negative health consequences associated with these significant life changes during aging. PMID:27013283

  4. OPTIMIZATION OF ATM AND BRANCH CASH OPERATIONS USING AN INTEGRATED CASH REQUIREMENT FORECASTING AND CASH OPTIMIZATION MODEL

    Directory of Open Access Journals (Sweden)

    Canser BİLİR

    2018-04-01

    Full Text Available In this study, an integrated cash requirement forecasting and cash inventory optimization model is implemented in both the branch and automated teller machine (ATM) networks of a mid-sized bank in Turkey to optimize the bank's cash supply chain. The implemented model's objective is to minimize idle cash levels at both branches and ATMs without decreasing the customer service level (CSL), by providing the correct amount of cash at the correct location and time. To the best of our knowledge, the model is the first integrated model in the literature to be applied to both ATMs and branches simultaneously. The results demonstrated that the integrated model dramatically decreased idle cash levels at both branches and ATMs without degrading the availability of cash, and hence customer satisfaction. An in-depth analysis of the results also indicated that the results were more remarkable for branches. The results also demonstrated that the utilization of appropriate seasonal indices plays a very critical role in forecasting the cash requirements of a bank. Another unique feature of the study is that the model is the first to include the recycling feature of ATMs. The results demonstrated that, with the appropriate seasonal indices included in the forecasting model, the integrated cash optimization models can be used to estimate the cash requirements of recycling ATMs.
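
    The role of seasonal indices in cash-demand forecasting can be illustrated with a weekday-index sketch; the withdrawal figures below are invented, and the method shown is a generic multiplicative index, not necessarily the paper's exact procedure:

```python
import statistics

# Hypothetical daily ATM withdrawals over three weeks (Mon..Sun).
weeks = [
    [40, 42, 41, 45, 60, 80, 55],
    [38, 41, 43, 44, 62, 82, 53],
    [42, 40, 42, 46, 61, 78, 56],
]
overall = statistics.mean(v for w in weeks for v in w)

# Seasonal index per weekday: that weekday's average / overall average.
indices = [statistics.mean(w[d] for w in weeks) / overall
           for d in range(7)]

# Naive forecast for next week: overall level scaled by each index.
forecast = [overall * i for i in indices]
```

    An index above 1 flags a day needing extra cash loading (here the weekend peak); the optimization layer would then trade this forecast off against holding and replenishment costs.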

  5. Mathematical model of a utility firm. Final technical report, Part I

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

    Utility companies are in the predicament of having to make forecasts, and draw up plans for the future, in an increasingly fluid and volatile socio-economic environment. The project reported here contributes to an understanding of the economic and behavioral processes that take place within a firm and outside it. Three main topics are treated. The first is the representation of the characteristics of the members of an organization, to the extent that those characteristics seem pertinent to the processes of interest. The second is the appropriate management of the processes of change by an organization. The third deals with the competitive striving towards an economic equilibrium among the members of a society at large, on the theory that this process might be modeled in a way similar to that used for the intra-organizational processes. This volume covers mainly the first topic.

  6. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Full Text Available Nearly 30 years after introducing its first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand of running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also, the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  7. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's Collect Requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring model's findings in order to select an appropriate requirements negotiation model. Finally, the results are presented with the help of statistical pie charts. On the basis of these results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some required external parameters such as MBTI and opportunity analysis.
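
    A first-tier weighted scoring model of the kind described can be sketched in a few lines; the criteria, weights, and scores below are invented placeholders, not the paper's actual data (the candidate names are illustrative labels for negotiation approaches):

```python
# Hypothetical evaluation criteria with weights summing to 1.0.
weights = {"ease_of_use": 0.3, "stakeholder_coverage": 0.4, "tool_support": 0.3}

# Hypothetical scores (1-5 scale) for two candidate negotiation models.
scores = {
    "WinWin":     {"ease_of_use": 3, "stakeholder_coverage": 5, "tool_support": 4},
    "EasyWinWin": {"ease_of_use": 4, "stakeholder_coverage": 4, "tool_support": 3},
}

def weighted_score(model):
    """Sum of criterion scores weighted by criterion importance."""
    return sum(weights[c] * scores[model][c] for c in weights)

best = max(scores, key=weighted_score)
```

    The per-model totals then feed the second tier, where strengths and weaknesses behind each score are expanded into SWOT matrices.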

  8. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health

    Directory of Open Access Journals (Sweden)

    Ward Paul R

    2011-08-01

    Full Text Available Abstract Background In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory in addition to findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Methods Data were collected using a national random postal survey of 1044 respondents in September, 2009. Multivariate logistic regression analysis was conducted. Results Statistical analysis revealed that people on lower incomes (less than $45000) experience worse social quality across all of the four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion) and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security, but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion) and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although also experienced more discrimination (lower social inclusion). Conclusions Applying social quality theory allows

  9. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health.

    Science.gov (United States)

    Ward, Paul R; Meyer, Samantha B; Verity, Fiona; Gill, Tiffany K; Luong, Tini C N

    2011-08-05

    In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory in addition to findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Data were collected using a national random postal survey of 1044 respondents in September, 2009. Multivariate logistic regression analysis was conducted. Statistical analysis revealed that people on lower incomes (less than $45000) experience worse social quality across all of the four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion) and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security, but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion) and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although also experienced more discrimination (lower social inclusion). Applying social quality theory allows researchers and policy makers to measure and respond to the

  10. Prediction of Adequate Prenatal Care Utilization Based on the Extended Parallel Process Model.

    Science.gov (United States)

    Hajian, Sepideh; Imani, Fatemeh; Riazi, Hedyeh; Salmani, Fatemeh

    2017-10-01

    Pregnancy complications are one of the major public health concerns. One of the main causes of preventable complications is the absence of or inadequate provision of prenatal care. The present study was conducted to investigate whether the Extended Parallel Process Model's constructs can predict the utilization of prenatal care services. The present longitudinal prospective study was conducted on 192 pregnant women selected through the multi-stage sampling of health facilities in Qeshm, Hormozgan province, from April to June 2015. Participants were followed up from the first half of pregnancy until their childbirth to assess adequate or inadequate/non-utilization of prenatal care services. Data were collected using the structured Risk Behavior Diagnosis Scale. The analysis of the data was carried out in SPSS-22 using one-way ANOVA, linear regression and logistic regression analysis. The level of significance was set at 0.05. In total, 178 pregnant women with a mean age of 25.31±5.42 years completed the study. Perceived self-efficacy (OR=25.23) predicted adequate utilization of prenatal care. Husband's occupation in the labor market (OR=0.43; P=0.02), unwanted pregnancy (OR=0.352) and caring for minors or the elderly at home (OR=0.35; P=0.045) were associated with lower odds of receiving prenatal care. The model showed that when the perceived efficacy of the prenatal care services overcame the perceived threat, the likelihood of prenatal care usage increased. This study identified some modifiable factors associated with prenatal care usage by women, providing key targets for appropriate clinical interventions.
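    The odds ratios reported above come from multivariate logistic regression. As a simplified, hypothetical illustration of what an odds ratio expresses, the snippet below computes a univariate OR from a 2×2 table; the counts are invented and are not the study's data.

```python
# Univariate odds ratio from a 2x2 table.
# Exposure: unwanted pregnancy; outcome: adequate prenatal care utilization.
# All counts below are hypothetical, for illustration only.
adequate_exposed, inadequate_exposed = 20, 30      # unwanted pregnancy
adequate_unexposed, inadequate_unexposed = 90, 38  # wanted pregnancy

odds_exposed = adequate_exposed / inadequate_exposed        # odds of adequate care, exposed
odds_unexposed = adequate_unexposed / inadequate_unexposed  # odds of adequate care, unexposed
odds_ratio = odds_exposed / odds_unexposed                  # OR < 1 means lower odds of care
```

    A multivariate model such as the one in the study adjusts each OR for the other predictors simultaneously, which a raw 2×2 table cannot do.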

  11. Modeling and optimization of processes for clean and efficient pulverized coal combustion in utility boilers

    Directory of Open Access Journals (Sweden)

    Belošević Srđan V.

    2016-01-01

    Full Text Available Pulverized coal-fired power plants should provide higher efficiency of energy conversion, flexibility in terms of boiler loads and fuel characteristics and emission reduction of pollutants like nitrogen oxides. Modification of combustion process is a cost-effective technology for NOx control. For optimization of complex processes, such as turbulent reactive flow in coal-fired furnaces, mathematical modeling is regularly used. The NOx emission reduction by combustion modifications in the 350 MWe Kostolac B boiler furnace, tangentially fired by pulverized Serbian lignite, is investigated in the paper. Numerical experiments were done by an in-house developed three-dimensional differential comprehensive combustion code, with fuel- and thermal-NO formation/destruction reactions model. The code was developed to be easily used by engineering staff for process analysis in boiler units. A broad range of operating conditions was examined, such as fuel and preheated air distribution over the burners and tiers, operation mode of the burners, grinding fineness and quality of coal, boiler loads, cold air ingress, recirculation of flue gases, water-walls ash deposition and combined effect of different parameters. The predictions show that the NOx emission reduction of up to 30% can be achieved by a proper combustion organization in the case-study furnace, with the flame position control. Impact of combustion modifications on the boiler operation was evaluated by the boiler thermal calculations suggesting that the facility was to be controlled within narrow limits of operation parameters. Such a complex approach to pollutants control enables evaluating alternative solutions to achieve efficient and low emission operation of utility boiler units. [Projekat Ministarstva nauke Republike Srbije, br. TR-33018: Increase in energy and ecology efficiency of processes in pulverized coal-fired furnace and optimization of utility steam boiler air preheater by using in

  12. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. 
The graded and iterative approach to assessments naturally

  13. Modeling Utility Load and Temperature Relationships for Use with Long-Lead Forecasts.

    Science.gov (United States)

    Robinson, Peter J.

    1997-05-01

    Models relating system-wide average temperature to total system load were developed for the Virginia Power and Duke Power service areas in the southeastern United States. Daily data for the 1985-91 period were used. The influence of temperature on load was at a minimum around 18°C and increased more rapidly with increasing temperatures than with decreasing ones. The response was sensitive to the day of the week, and models using separate weekdays as well as one using pooled data were created. None adequately accounted for civic holidays or for extreme temperatures. Estimates of average loads over a 3-month period, however, were accurate to within ±3%. The models were used to transform the probability distribution of 3-month average temperatures for each system, derived from the historical record, into load probabilities. These were used with the categorical temperature probabilities given by the National Weather Service long-lead forecasts to estimate the forecast load probabilities. In summer and winter the resultant change in distribution is sufficient to have an impact on the advance fuel purchase decisions of the utilities. Results in spring and fall are more ambiguous.
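    The abstract does not state the model's exact functional form; a common way to capture a balance point near 18°C with a steeper response on the warm side is a piecewise-linear (degree-day) regression. The sketch below, on synthetic data with assumed coefficients, shows the idea.

```python
import numpy as np

# Piecewise-linear load model with a balance point at 18 C:
#   load = base + b_cool * max(T - 18, 0) + b_heat * max(18 - T, 0)
# The cooling slope is set steeper than the heating slope, mirroring the
# asymmetry described in the abstract. All coefficients are illustrative.
T = np.linspace(-5.0, 35.0, 81)            # daily mean temperatures (deg C)
cdd = np.maximum(T - 18.0, 0.0)            # cooling degrees above balance point
hdd = np.maximum(18.0 - T, 0.0)            # heating degrees below balance point
load = 500.0 + 30.0 * cdd + 12.0 * hdd     # synthetic system load (arbitrary units)

# Ordinary least squares recovers the base load and both slopes.
X = np.column_stack([np.ones_like(T), cdd, hdd])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
```

    In practice one would fit separate models per weekday, as the paper does, and add noise terms; the point here is only the asymmetric two-slope structure around the balance temperature.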

  14. Evaluation of remedial alternative of a LNAPL plume utilizing groundwater modeling

    International Nuclear Information System (INIS)

    Johnson, T.; Way, S.; Powell, G.

    1997-01-01

    The TIMES model was utilized to evaluate remedial options for a large LNAPL spill that was impacting the North Platte River in Glenrock, Wyoming. LNAPL was found discharging into the river from the adjoining alluvial aquifer. Subsequent investigations discovered an 18-hectare plume extending across the alluvium and into a sandstone bedrock outcrop to the south of the river. The TIMES model was used to estimate the LNAPL volume and to evaluate options for optimizing LNAPL recovery. Data collected from recovery and monitoring wells were used for model calibration. A LNAPL volume of 5.5 million L was estimated, over 3.0 million L of which is in the sandstone bedrock. An existing product recovery system was evaluated for its effectiveness. Three alternative recovery scenarios were also evaluated to aid in selecting the most cost-effective and efficient recovery system for the site. An active wellfield hydraulically upgradient of the existing recovery system was selected as most appropriate to augment the existing system in recovering LNAPL efficiently.

  15. Classification models of child molesters utilizing the Abel Assessment for sexual interest.

    Science.gov (United States)

    Abel, G G; Jordan, A; Hand, C G; Holland, L A; Phipps, A

    2001-05-01

    The aims of this study are to demonstrate 1) the criterion validity of the Abel Assessment for sexual interest (AASI) based on its ability to discriminate between non child molesters and admitting child molesters, and 2) its resistance to falsification based on its ability to discriminate between liar-denier child molesters and non child molesters. A group of 747 participants matched by age, race, and income was used to develop three logistic regression equations. The models compare a group of non child molesting patients under evaluation for other paraphilias to three groups: 1) a group of admitting molesters of girls under 14 years of age, 2) a group of admitting molesters of boys under 14 years of age, and 3) a group believed to be concealing or denying having molested. Both of the equations designed to discriminate between admitting child molesters and non child molesters were statistically significant. The equation contrasting child molesters attempting to conceal or deny their behavior and non child molesting patients was also statistically significant. The models classifying admitting child molesters versus non child molesters demonstrate criterion validity, while the third model provides evidence of the AASI's resistance to falsification and its utility as a tool in the detection of child molesters who deny the behavior. Results of the equations are reported and suggestions for their use are discussed.

  16. A clinically relevant model of osteoinduction: a process requiring calcium phosphate and BMP/Wnt signalling.

    Science.gov (United States)

    Eyckmans, J; Roberts, S J; Schrooten, J; Luyten, F P

    2010-06-01

    In this study, we investigated a clinically relevant model of in vivo ectopic bone formation utilizing human periosteum derived cells (HPDCs) seeded in a Collagraft carrier and explored the mechanisms by which this process is driven. Bone formation occurred after eight weeks when a minimum of one million HPDCs was loaded on Collagraft carriers and implanted subcutaneously in NMRI nu/nu mice. De novo bone matrix, mainly secreted by the HPDCs, was found juxta-proximal of the calcium phosphate (CaP) granules suggesting that CaP may have triggered the 'osteoinductive program'. Indeed, removal of the CaP granules by ethylenediaminetetraacetic acid decalcification prior to cell seeding and implantation resulted in loss of bone formation. In addition, inhibition of endogenous bone morphogenetic protein and Wnt signalling by overexpression of the secreted antagonists Noggin and Frzb, respectively, also abrogated osteoinduction. Proliferation of the engrafted HPDCs was strongly reduced in the decalcified scaffolds or when seeded with adenovirus-Noggin/Frzb transduced HPDCs indicating that cell division of the engrafted HPDCs is required for the direct bone formation cascade. These data suggest that this model of bone formation is similar to that observed during physiological intramembranous bone development and may be of importance when investigating tissue engineering strategies.

  17. Modelling production of field crops and its requirements

    NARCIS (Netherlands)

    Wit, de C.T.; Keulen, van H.

    1987-01-01

    Simulation models are being developed that enable quantitative estimates of the growth and production of the main agricultural crops under a wide range of weather and soil conditions. For this purpose, several hierarchically ordered production situations are distinguished in such a way that the

  18. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  19. Modelling of landfill gas adsorption with bottom ash for utilization of renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Chen

    2011-10-06

    Energy crisis, environmental pollution and climate change are serious challenges to people worldwide. In the 21st century, research has turned to new renewable energy technologies, so as to slow down global warming and develop society in an environmentally sustainable manner. Landfill gas, produced by biodegradable municipal solid waste in landfills, is a renewable energy source. In this work, landfill gas utilization for energy generation is introduced. Landfill gas is able to produce hydrogen by steam reforming reactions. A steam reformer is part of the fuel cell system. A sewage plant in Cologne, Germany, has run a phosphoric acid fuel cell power station on biogas for more than 50,000 hours successfully. Landfill gas thus may be used as fuel for electricity generation via a fuel cell system. For the purpose of explaining the possibility of landfill gas utilization via fuel cells, the thermodynamics of landfill gas steam reforming are discussed by simulations. In practice, methane-rich gas can be obtained by landfill gas purification and upgrading. This work experimentally investigates a new method for upgrading: landfill gas adsorption with bottom ash. Bottom ash is a by-product of municipal solid waste incineration, and some of its physical and chemical properties are analysed in this work. The landfill gas adsorption experimental data show that bottom ash can be used as a potential adsorbent for landfill gas adsorption to remove CO{sub 2}. In addition, the alkalinity of bottom ash eluate can be reduced in these adsorption processes. Therefore, the interactions between landfill gas and bottom ash can be explained by a series of reactions. Furthermore, a conceptual model involving landfill gas adsorption with bottom ash is developed. In this thesis, the parameters of the landfill gas adsorption equilibrium equations are obtained by fitting experimental data. 
    On the other hand, these functions can be deduced with a theoretical approach
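    The thesis fits adsorption-equilibrium equations to measured data; the specific isotherm is not stated above, so the sketch below assumes a Langmuir form and recovers its parameters from synthetic equilibrium data via the standard linearization C/q = 1/(qm·K) + C/qm. All numbers are invented for illustration.

```python
import numpy as np

# Hypothetical Langmuir fit for CO2 uptake on bottom ash:
#   q = qm * K * C / (1 + K * C)
# "True" parameters used only to synthesize equilibrium data.
qm_true, K_true = 2.0, 0.5                      # capacity (mmol/g), affinity (1/bar)
C = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 8.0])    # equilibrium CO2 pressure (bar)
q = qm_true * K_true * C / (1.0 + K_true * C)   # adsorbed amount (mmol/g)

# Linearized Langmuir: C/q = 1/(qm*K) + C/qm  ->  a straight line in C.
slope, intercept = np.polyfit(C, C / q, 1)
qm_fit = 1.0 / slope          # estimated monolayer capacity
K_fit = slope / intercept     # estimated affinity constant
```

    With noisy laboratory data the fitted slope and intercept would carry uncertainty, and a nonlinear least-squares fit of the original isotherm is often preferred over the linearization.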

  20. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    Science.gov (United States)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission from commercial space tourism, to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting edge rocketry with the assumption that the astronauts could be trained and will adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. 
While the model is the primary tangible product from this research, the more interesting outcome of

  1. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  2. Utility of large-animal models of BPD: chronically ventilated preterm lambs.

    Science.gov (United States)

    Albertine, Kurt H

    2015-05-15

    This paper is focused on unique insights provided by the preterm lamb physiological model of bronchopulmonary dysplasia (BPD). Connections are also made to insights provided by the former preterm baboon model of BPD, as well as to rodent models of lung injury to the immature, postnatal lung. The preterm lamb and baboon models recapitulate the clinical setting of preterm birth and respiratory failure that require prolonged ventilation support for days or weeks with oxygen-rich gas. An advantage of the preterm lamb model is the large size of preterm lambs, which facilitates physiological studies for days or weeks during the evolution of neonatal chronic lung disease (CLD). To this advantage is linked an integrated array of morphological, biochemical, and molecular analyses that are identifying the role of individual genes in the pathogenesis of neonatal CLD. Results indicate that the mode of ventilation, invasive mechanical ventilation vs. less invasive high-frequency nasal ventilation, is related to outcomes. Our approach also includes pharmacological interventions that test causality of specific molecular players, such as vitamin A supplementation in the pathogenesis of neonatal CLD. The new insights that are being gained from our preterm lamb model may have important translational implications about the pathogenesis and treatment of BPD in preterm human infants. Copyright © 2015 the American Physiological Society.

  3. Modeling Late-State Serpentinization on Enceladus and Implications for Methane-Utilizing Microbial Metabolisms

    Science.gov (United States)

    Hart, R.; Cardace, D.

    2017-12-01

    Modeling investigations of Enceladus and other icy satellites have included physicochemical properties (Sohl et al., 2010; Glein et al., 2015; Neveu et al., 2015), geophysical prospects of serpentinization (Malamud and Prialnik, 2016; Vance et al., 2016), and aqueous geochemistry across different antifreeze fluid-rock scenarios (Neveu et al., 2017). To more effectively evaluate the habitability of Enceladus, in the context of recent observations (Waite et al., 2017), we model the potential bioenergetic pathways that would be thermodynamically favorable at the interface of hydrothermal water-rock reactions resulting from late-stage serpentinization (>90% serpentinized), hypothesized on Enceladus. Building on previous geochemical model outputs of Enceladus (Neveu et al., 2017), and bioenergetic modeling (as in Amend and Shock, 2001; Cardace et al., 2015), we present a model of late-stage serpentinization possible at the water-rock interface of Enceladus, and report changing activities of chemical species related to methane utilization by microbes over the course of serpentinization using the Geochemist's Workbench REACT code [modified extended Debye-Hückel (Helgeson, 1969) using the thermodynamic database of SUPCRT92 (Johnson et al., 1992)]. Using a model protolith speculated to exist at Enceladus's water-rock boundary, constrained by extraterrestrial analog analytical data for subsurface serpentinites of the Coast Range Ophiolite (Lower Lake, CA, USA) mélange rocks, we deduce evolving habitability conditions as the model protolith reacts with feasible, though hypothetical, planetary ocean chemistries (from Glein et al., 2015, and Neveu et al., 2017). Major components of modeled oceans, Na-Cl, Mg-Cl, and Ca-Cl, show shifts in the feasibility of CO2-CH4-H2 driven microbial habitability, occurring early in the reaction progress, with methanogenesis being bioenergetically favored. 
Methanotrophy was favored late in the reaction progress of some Na-Cl systems and in the
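    The activity corrections mentioned above follow the extended Debye-Hückel equation (Helgeson, 1969). A minimal sketch at 25°C is shown below, using commonly tabulated solvent constants (A ≈ 0.509, B ≈ 0.328 in Å⁻¹·(mol/kg)⁻¹ᐟ²) and an assumed ion-size parameter for Na⁺; this is an illustration of the activity model, not the REACT code itself.

```python
import math

# Extended Debye-Hueckel activity coefficient at 25 C:
#   log10(gamma) = -A * z**2 * sqrt(I) / (1 + B * a * sqrt(I))
# A and B are temperature-dependent solvent constants; a is the ion-size
# parameter in Angstroms (the value below is an assumed one for Na+).
A, B = 0.509, 0.328

def log10_gamma(z, a, ionic_strength):
    """Log10 activity coefficient for an ion of charge z at ionic strength I (mol/kg)."""
    s = math.sqrt(ionic_strength)
    return -A * z**2 * s / (1.0 + B * a * s)

# Activity coefficient of Na+ (z = 1, a ~ 4 Angstroms) in a 0.1 mol/kg solution.
gamma_na = 10 ** log10_gamma(z=1, a=4.0, ionic_strength=0.1)
```

    Helgeson's modified form adds an extended-term correction at higher ionic strengths; the simple ratio above is adequate only for dilute solutions.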

  4. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country...... of the Danish VAT law in Web Ontology Language (OWL) and in Con¿git Product Modeling Language (CPML)....

  5. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    triangles (.raw) to the native triangular facet file (.facet). The software vendors recommend the use of McNeel and Associates’ Rhinoceros 3D for all...surface modeling and export. Rhinoceros has the capability and precision to create highly detailed 3D surface geometry suitable for radar cross section... white before ending up at blue as the temperature increases [27]. IR radiation was discovered in 1800 but its application is still limited in

  6. COMPLEAT (Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies): A planning tool for publicly owned electric utilities. [Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies (Compleat)

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    COMPLEAT takes its name, as an acronym, from Community-Oriented Model for Planning Least-Cost Energy Alternatives and Technologies. It is an electric utility planning model designed for use principally by publicly owned electric utilities and agencies serving such utilities. As a model, COMPLEAT is significantly more full-featured and complex than called out in APPA's original plan and proposal to DOE. The additional complexity grew out of a series of discussions early in the development schedule, in which it became clear to APPA staff and advisors that the simplicity characterizing the original plan, while highly desirable in terms of utility applications, was not achievable if practical utility problems were to be addressed. The project teams settled on Energy 20/20, an existing model developed by Dr. George Backus of Policy Assessment Associates, as the best candidate for the kinds of modifications and extensions that would be required. The remainder of the project effort was devoted to designing specific input data files, output files, and user screens and to writing and testing the computer programs that would properly implement the desired features around Energy 20/20 as a core program. This report presents, in outline form, the features and user interface of COMPLEAT.

  7. Performance Requirements Modeling and Assessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future...... ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies....

  8. Protein (multi-)location prediction: utilizing interdependencies via a generative model

    Science.gov (United States)

    Shatkay, Hagit

    2015-01-01

    Motivation: Proteins are responsible for a multitude of vital tasks in all living organisms. Given that a protein’s function and role are strongly related to its subcellular location, protein location prediction is an important research area. While proteins move from one location to another and can localize to multiple locations, most existing location prediction systems assign only a single location per protein. A few recent systems attempt to predict multiple locations for proteins; however, their performance leaves much room for improvement. Moreover, such systems do not capture dependencies among locations and usually consider locations as independent. We hypothesize that a multi-location predictor that captures location inter-dependencies can improve location predictions for proteins. Results: We introduce a probabilistic generative model for protein localization, and develop a system based on it—which we call MDLoc—that utilizes inter-dependencies among locations to predict multiple locations for proteins. The model captures location inter-dependencies using Bayesian networks and represents dependency between features and locations using a mixture model. We use iterative processes for learning model parameters and for estimating protein locations. We evaluate our classifier, MDLoc, on a dataset of single- and multi-localized proteins derived from the DBMLoc dataset, which is the most comprehensive protein multi-localization dataset currently available. Our results, obtained by using MDLoc, significantly improve upon results obtained by an initial simpler classifier, as well as on results reported by other top systems. Availability and implementation: MDLoc is available at: http://www.eecis.udel.edu/∼compbio/mdloc. Contact: shatkay@udel.edu. PMID:26072505

  9. Decision model incorporating utility theory and measurement of social values applied to nuclear waste management

    International Nuclear Information System (INIS)

    Litchfield, J.W.; Hansen, J.V.; Beck, L.C.

    1975-07-01

    A generalized computer-based decision analysis model was developed and tested. Several alternative concepts for ultimate disposal have already been developed; however, significant research is still required before any of these can be implemented. To make a choice based on technical estimates of the costs, short-term safety, long-term safety, and accident detection and recovery requires estimating the relative importance of each of these factors or attributes. These relative importance estimates primarily involve social values and therefore vary from one individual to the next. The approach used was to sample various public groups to determine the relative importance of each of the factors to the public. These estimates of importance weights were combined in a decision analysis model with estimates, furnished by technical experts, of the degree to which each alternative concept achieves each of the criteria. This model then integrates the two separate and unique sources of information and provides the decision maker with information as to the preferences and concerns of the public as well as the technical areas within each concept which need further research. The model can rank the alternatives using sampled public opinion and techno-economic data. This model provides a decision maker with a structured approach to subdividing complex alternatives into a set of more easily considered attributes, measuring the technical performance of each alternative relative to each attribute, estimating relevant social values, and assimilating quantitative information in a rational manner to estimate total value for each alternative. Because of the explicit nature of this decision analysis, the decision maker can select a specific alternative supported by clear documentation and justification for his assumptions and estimates. (U.S.)
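The aggregation step described above, combining public importance weights with expert performance estimates into a total value per alternative, is a standard weighted additive value model. A minimal sketch follows; all attribute names, weights, and scores are illustrative placeholders, not values from the study.

```python
# Sketch of the weighted additive value model at the core of the decision
# analysis described above: public importance weights are combined with
# expert performance scores to rank alternatives. All names and numbers
# below are hypothetical.

def total_value(weights, scores):
    """Aggregate normalized attribute scores into one value per alternative."""
    return {alt: sum(weights[a] * s[a] for a in weights)
            for alt, s in scores.items()}

# Hypothetical importance weights for the four attributes (sum to 1).
weights = {"cost": 0.20, "short_term_safety": 0.35,
           "long_term_safety": 0.30, "detection_recovery": 0.15}

# Hypothetical expert ratings of two disposal concepts on a 0-1 scale.
scores = {
    "geologic": {"cost": 0.6, "short_term_safety": 0.8,
                 "long_term_safety": 0.9, "detection_recovery": 0.4},
    "seabed":   {"cost": 0.7, "short_term_safety": 0.6,
                 "long_term_safety": 0.5, "detection_recovery": 0.3},
}

ranked = sorted(total_value(weights, scores).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked)  # highest total value first
```

Because both the weights and the scores are explicit, a decision maker can trace exactly which attribute drives each alternative's total value, which is the documentation benefit the abstract emphasizes.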

  10. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  11. Modeling strategy to identify patients with primary immunodeficiency utilizing risk management and outcome measurement.

    Science.gov (United States)

    Modell, Vicki; Quinn, Jessica; Ginsberg, Grant; Gladue, Ron; Orange, Jordan; Modell, Fred

    2017-06-01

    This study seeks to generate analytic insights into risk management and probability of an identifiable primary immunodeficiency defect. The Jeffrey Modell Centers Network database, Jeffrey Modell Foundation's 10 Warning Signs, the 4 Stages of Testing Algorithm, physician-reported clinical outcomes, programs of physician education and public awareness, the SPIRIT® Analyzer, and newborn screening, taken together, generate P values of less than 0.05. This indicates that the data results do not occur by chance, and that there is a better than 95% probability that the data are valid. The objectives are to improve patients' quality of life, while generating significant reduction of costs. The advances of the world's experts aligned with these JMF programs can generate analytic insights as to risk management and probability of an identifiable primary immunodeficiency defect. This strategy reduces the uncertainties related to primary immunodeficiency risks, as we can screen, test, identify, and treat undiagnosed patients. We can also address regional differences and prevalence, age, gender, treatment modalities, and sites of care, as well as economic benefits. These tools support high net benefits, substantial financial savings, and significant reduction of costs. All stakeholders, including patients, clinicians, pharmaceutical companies, third party payers, and government healthcare agencies, must address the earliest possible precise diagnosis, appropriate intervention and treatment, as well as stringent control of healthcare costs through risk assessment and outcome measurement. An affected patient is entitled to nothing less, and stakeholders are responsible to utilize tools currently available. Implementation offers a significant challenge to the entire primary immunodeficiency community.

  12. Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling

    Directory of Open Access Journals (Sweden)

    Marcello Lucchese

    2017-06-01

    Objective: To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. Methods: A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. Results: In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. Conclusion: In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis.

  13. Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling.

    Science.gov (United States)

    Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola

    2017-01-01

    To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. © 2017 The Author(s) Published by S. Karger GmbH, Freiburg.
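The headline figures above follow from simple incremental cost-effectiveness ratio (ICER) arithmetic. A minimal sketch; the deltas are the ones reported in the abstract, while the `icer` helper is our own illustrative function, not from the study.

```python
# Minimal sketch of the ICER arithmetic behind the figures above. The
# 10-year deltas (EUR 2,661 extra cost, 1.1 extra QALYs) and the lifetime
# deltas (EUR 8,649 saved, 3.2 QALYs gained) come from the abstract.

def icer(delta_cost, delta_qaly):
    """Incremental cost per QALY gained; an intervention that saves money
    and adds QALYs is 'dominant' and needs no ratio."""
    if delta_cost < 0 and delta_qaly > 0:
        return "dominant"
    return delta_cost / delta_qaly

print(icer(2661, 1.1))   # about EUR 2,419/QALY from these rounded deltas
                         # (the study reports 2,412 from unrounded inputs)
print(icer(-8649, 3.2))  # lifetime horizon: surgery dominates
```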

  14. Practical utilization of modeling and simulation in laboratory process waste assessments

    International Nuclear Information System (INIS)

    Lyttle, T.W.; Smith, D.M.; Weinrach, J.B.; Burns, M.L.

    1993-01-01

    At Los Alamos National Laboratory (LANL), facility waste streams tend to be small but highly diverse. Initial characterization of such waste streams is difficult in part due to a lack of tools to assist the waste generators in completing such assessments. A methodology has been developed at LANL to allow process-knowledgeable field personnel to develop baseline waste generation assessments and to evaluate potential waste minimization technology. This process waste assessment (PWA) system is an application constructed within the Process Modeling System (PMS), an object-oriented, mass-balance-based, discrete-event simulation using the Common Lisp Object System (CLOS). Analytical capabilities supported within the PWA system include: complete mass balance specifications, historical characterization of selected waste streams and generation of facility profiles for materials consumption, resource utilization and worker exposure. Anticipated development activities include provisions for a best available technologies (BAT) database and integration with the LANL facilities management Geographic Information System (GIS). The environments used to develop these assessment tools will be discussed in addition to a review of initial implementation results.

  15. Optimal energy-utilization ratio for long-distance cruising of a model fish

    Science.gov (United States)

    Liu, Geng; Yu, Yong-Liang; Tong, Bing-Gang

    2012-07-01

    The efficiency of total energy utilization and its optimization for long-distance migration of fish have attracted much attention in the past. This paper presents theoretical and computational research, clarifying the above well-known classic questions. Here, we specify the energy-utilization ratio (fη) as a scale of cruising efficiency, which consists of the swimming speed over the sum of the standard metabolic rate and the energy consumption rate of muscle activities per unit mass. Theoretical formulation of the function fη is made and it is shown that based on a basic dimensional analysis, the main dimensionless parameters for our simplified model are the Reynolds number (Re) and the dimensionless quantity of the standard metabolic rate per unit mass (Rpm). The swimming speed and the hydrodynamic power output in various conditions can be computed by solving the coupled Navier-Stokes equations and the fish locomotion dynamic equations. The energy consumption rate of muscle activities can then be estimated as the hydrodynamic power divided by the muscle efficiency studied by previous researchers. The present results show the following: (1) When the value of fη attains a maximum, the dimensionless parameter Rpm keeps almost constant for the same fish species in different sizes. (2) In the above cases, the tail beat period is an exponential function of the fish body length when cruising is optimal, e.g., the optimal tail beat period of Sockeye salmon is approximately proportional to the body length to the power of 0.78. Moreover, larger fish are better suited to long-distance cruising than smaller fish. (3) The optimal swimming speed we obtained is consistent with previous researchers' estimates.
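The ratio defined above can be written as fη = U / (P_metabolic + P_hydro/η_muscle). A minimal sketch of that definition; the paper obtains the powers from coupled Navier-Stokes and locomotion simulations, whereas all numbers below are purely illustrative.

```python
# Sketch of the energy-utilization ratio defined above:
#   f_eta = U / (P_metabolic + P_hydro / eta_muscle)
# swimming speed over the sum of the standard metabolic rate and the
# muscle energy-consumption rate per unit mass, with muscle power taken
# as hydrodynamic power divided by muscle efficiency. Numbers are made up.

def energy_utilization_ratio(speed, metabolic_rate, hydro_power, muscle_eff):
    muscle_power = hydro_power / muscle_eff
    return speed / (metabolic_rate + muscle_power)

# Hypothetical cruising states (speed, metabolic rate, hydro power, efficiency):
# the optimum maximizes f_eta, which need not be the fastest speed.
candidates = [(0.5, 1.0, 0.2, 0.25),
              (1.0, 1.0, 0.8, 0.25),
              (1.5, 1.0, 2.5, 0.25)]
best = max(candidates, key=lambda c: energy_utilization_ratio(*c))
print(best[0])  # cruising speed with the highest energy-utilization ratio
```

The point the toy numbers make is the one in the abstract: because hydrodynamic power grows faster than speed, the ratio peaks at an intermediate cruising speed.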

  16. A choice modelling analysis on the similarity between distribution utilities' and industrial customers' price and quality preferences

    Energy Technology Data Exchange (ETDEWEB)

    Soederberg, Magnus [Gothenburg School of Business, Economics and Law, Department of Business Administration, Industrial and Financial Management, PO Box 610, SE-405 30 Gothenburg (Sweden)

    2008-05-15

    The Swedish Electricity Act states that electricity distribution must comply with both price and quality requirements. In order to maintain efficient regulation it is necessary to firstly, define quality attributes and secondly, determine a customer's priorities concerning price and quality attributes. If distribution utilities gain an understanding of customer preferences and incentives for reporting them, the regulator can save a lot of time by surveying them rather than their customers. This study applies a choice modelling methodology where utilities and industrial customers are asked to evaluate the same twelve choice situations in which price and four specific quality attributes are varied. The preferences expressed by the utilities, and estimated by a random parameter logit, correspond quite well with the preferences expressed by the largest industrial customers. The preferences expressed by the utilities are reasonably homogenous in relation to forms of association (private limited, public and trading partnership). If the regulator acts according to the preferences expressed by the utilities, smaller industrial customers will have to pay for quality they have not asked for. (author)
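A random parameter logit like the one estimated above generalizes the multinomial logit by drawing taste coefficients from a distribution across respondents; its kernel is the standard logit choice probability. A minimal sketch of that kernel, with hypothetical attributes (price, outage hours) and made-up coefficient values:

```python
import math

def choice_probabilities(alternatives, coefs):
    """Multinomial logit: P(j) = exp(V_j) / sum_k exp(V_k), linear utility V."""
    v = [sum(coefs[a] * alt[a] for a in coefs) for alt in alternatives]
    m = max(v)                          # subtract max for numerical stability
    e = [math.exp(vi - m) for vi in v]
    total = sum(e)
    return [ei / total for ei in e]

# Two hypothetical tariff offers: cheaper but less reliable vs. pricier but
# more reliable. Attribute names and coefficients are illustrative only; the
# random parameter logit averages these probabilities over random coefficient
# draws per respondent.
offers = [{"price": 100, "outage_hours": 8},
          {"price": 120, "outage_hours": 1}]
coefs = {"price": -0.05, "outage_hours": -0.4}

print(choice_probabilities(offers, coefs))  # the reliable offer is preferred
```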

  17. Balancing energy development and conservation: A method utilizing species distribution models

    Science.gov (United States)

    Jarnevich, C.S.; Laubhan, M.K.

    2011-01-01

    Alternative energy development is increasing, potentially leading to negative impacts on wildlife populations already stressed by other factors. Resource managers require a scientifically based methodology to balance energy development and species conservation, so we investigated modeling habitat suitability using Maximum Entropy to develop maps that could be used with other information to help site energy developments. We selected one species of concern, the Lesser Prairie-Chicken (LPCH; Tympanuchus pallidicinctus) found on the southern Great Plains of North America, as our case study. LPCH populations have been declining and are potentially further impacted by energy development. We used LPCH lek locations in the state of Kansas along with several environmental and anthropogenic parameters to develop models that predict the probability of lek occurrence across the landscape. The models all performed well as indicated by the high test area under the curve (AUC) scores (all >0.9). The inclusion of anthropogenic parameters in models resulted in slightly better performance based on AUC values, indicating that anthropogenic features may impact LPCH lek habitat suitability. Given the positive model results, this methodology may provide additional guidance in designing future survey protocols, as well as siting of energy development in areas of marginal or unsuitable habitat for species of concern. This technique could help to standardize and quantify the impacts various developments have upon at-risk species. © 2011 Springer Science+Business Media, LLC (outside the USA).
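The AUC scores reported above (>0.9) measure how well predicted habitat suitability ranks known lek locations above background points. A minimal rank-based AUC sketch; the suitability scores are made up, not the study's data.

```python
# Rank-based AUC: the probability that a randomly chosen presence point
# receives a higher predicted suitability than a randomly chosen
# background point (ties count half). Scores below are hypothetical.

def auc(presence_scores, background_scores):
    wins = ties = 0
    for p in presence_scores:
        for b in background_scores:
            if p > b:
                wins += 1
            elif p == b:
                ties += 1
    return (wins + 0.5 * ties) / (len(presence_scores) * len(background_scores))

print(auc([0.9, 0.8, 0.7], [0.6, 0.4, 0.75]))  # 8/9: good but imperfect ranking
```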

  18. Formal Requirements Modeling with Executable Use Cases and Coloured Petri Nets

    OpenAIRE

    Jørgensen, Jens Bæk; Tjell, Simon; Fernandes, Joao Miguel

    2009-01-01

    This paper presents executable use cases (EUCs), which constitute a model-based approach to requirements engineering. EUCs may be used as a supplement to model-driven development (MDD) and can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described, since usually it is sufficient that one provides a specification, or platform-independent model, of the software that is to be developed. …

  19. Comparisons of Academic Researchers' and Physical Education Teachers' Perspectives on the Utilization of the Tactical Games Model

    Science.gov (United States)

    Harvey, Stephen; Pill, Shane

    2016-01-01

    Research commentary suggests the utilization of Tactical Games Models (TGMs) only exists in isolated instances, particularly where teachers demonstrate true fidelity to these models. In contrast, many academics have adopted TGMs into their courses. Consequently, the purpose of this study was to investigate reasons for this disparity. Participants…

  20. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated to a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on solid understanding and knowledge on how these bioaerosols are created and dispersed, and which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, specifically taking into account the specific space conditions.

  1. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    DEFF Research Database (Denmark)

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge of their decomposition rate … was combined with a detailed gas-phase kinetic model of KCl sulfation and a model of K2SO4 condensation to simulate the sulfation of KCl by ferric sulfate addition. The simulation results showed good agreement with the experiments conducted in a biomass grate-firing combustor, where ferric sulfate … and elemental sulfur were used as additives. The results indicated that the SO3 released from ferric sulfate decomposition was the main contributor to KCl sulfation and that the effectiveness of ferric sulfate addition was sensitive to the applied temperature conditions. Comparison of the effectiveness …

  2. Resource planning for gas utilities: Using a model to analyze pivotal issues

    Energy Technology Data Exchange (ETDEWEB)

    Busch, J.F.; Comnes, G.A.

    1995-11-01

    With the advent of wellhead price decontrols that began in the late 1970s and the development of open-access pipelines in the 1980s and 90s, gas local distribution companies (LDCs) now have increased responsibility for their gas supplies and face an increasingly complex array of supply and capacity choices. Heretofore this responsibility had been shared with the interstate pipelines that provide bundled firm gas supplies. Moreover, gas supply and deliverability (capacity) options have multiplied as the pipeline network becomes increasingly interconnected and as new storage projects are developed. There is now a fully functioning financial market for commodity price hedging instruments and, on interstate pipelines, a secondary market (called capacity release) now exists. As a result of these changes in the natural gas industry, interest in resource planning and computer modeling tools for LDCs is increasing. Although in some ways the planning time horizon has become shorter for the gas LDC, the responsibility conferred on the LDC and the complexity of the planning problem have increased. We examine current gas resource planning issues in the wake of the Federal Energy Regulatory Commission's (FERC) Order 636. Our goal is twofold: (1) to illustrate the types of resource planning methods and models used in the industry and (2) to illustrate some of the key tradeoffs among types of resources, reliability, and system costs. To assist us, we utilize a commercially available dispatch and resource planning model and examine four types of resource planning problems: the evaluation of new storage resources, the evaluation of buyback contracts, the computation of avoided costs, and the optimal tradeoff between reliability and system costs. To make the illustration of methods meaningful yet tractable, we developed a prototype LDC and used it for the majority of our analysis.

  3. Modeling invasion of metastasizing cancer cells to bone marrow utilizing ecological principles.

    Science.gov (United States)

    Chen, Kun-Wan; Pienta, Kenneth J

    2011-10-03

    The invasion of a new species into an established ecosystem can be directly compared to the steps involved in cancer metastasis. Cancer must grow in a primary site, intravasate and survive in the circulation, and then extravasate into a target organ (invasive species survival in transport). Cancer cells often lie dormant at their metastatic site for a long period of time (lag period for invasive species) before proliferating (invasive spread). Proliferation in the new site has an impact on the target organ microenvironment (ecological impact) and eventually the human host (biosphere impact). Tilman has described mathematical equations for the competition between invasive species in a structured habitat. These equations were adapted to study the invasion of cancer cells into the bone marrow microenvironment as a structured habitat. A large proportion of solid tumor metastases are bone metastases, known to usurp hematopoietic stem cell (HSC) homing pathways to establish footholds in the bone marrow. This required accounting for the fact that the bone marrow is the natural home of hematopoietic stem cells and that they already occupy this structured space. The adapted Tilman model of invasion dynamics is especially valuable for modeling the lag period or dormancy of cancer cells. The Tilman equations for modeling the invasion of two species into a defined space have been modified to study the invasion of cancer cells into the bone marrow microenvironment. These modified equations allow a more flexible way to model the space competition between the two cell species. They capture initial density, metastatic seeding into the bone marrow, growth once the cells are present, movement of cells out of the bone marrow niche, and apoptosis of cells. These equations are currently being applied to clinical data sets for verification and further refinement of the models.
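The classic two-species Tilman colonization-competition equations that the paper adapts can be sketched directly. Here species 1 is the resident, superior competitor for niche sites and species 2 a small invading seed; the paper's marrow-specific modifications and fitted parameters are not reproduced, and all values below are illustrative.

```python
# Tilman two-species competition for a structured habitat (fractions of
# occupied sites), forward-Euler integrated:
#   dp1/dt = c1*p1*(1 - p1) - m1*p1
#   dp2/dt = c2*p2*(1 - p1 - p2) - m2*p2 - c1*p1*p2
# p_i: fraction of sites occupied, c_i: colonization rate, m_i: mortality.
# Parameter values are illustrative only.

def step(p1, p2, c1, c2, m1, m2, dt):
    dp1 = c1 * p1 * (1 - p1) - m1 * p1
    dp2 = c2 * p2 * (1 - p1 - p2) - m2 * p2 - c1 * p1 * p2
    return p1 + dt * dp1, p2 + dt * dp2

p1, p2 = 0.6, 0.01      # resident species vs. a small invading seed
for _ in range(20000):  # integrate to t = 200
    p1, p2 = step(p1, p2, c1=0.5, c2=3.0, m1=0.1, m2=0.05, dt=0.01)
print(round(p1, 3), round(p2, 3))  # approaches equilibrium p1*=0.8, p2*=0.05
```

With these parameters the invader persists at low density by colonizing the sites the resident leaves free, a slow approach to a small equilibrium fraction that mirrors the lag period/dormancy behavior the abstract highlights.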

  4. Utilizing a rat delayed implantation model to teach integrative endocrinology and reproductive biology.

    Science.gov (United States)

    Geisert, Rodney D; Smith, Michael F; Schmelzle, Amanda L; Green, Jonathan A

    2018-03-01

    In this teaching laboratory, the students are directed in an exercise that involves designing and performing an experiment to determine estrogen's role in regulating delayed implantation (diapause) in female rats. To encourage active participation by the students, a discussion question is provided before the laboratory exercise in which each student is asked to search the literature and provide written answers to questions and to formulate an experiment to test the role of ovarian estrogen in inducing implantation in female rats. One week before the laboratory exercise, students discuss their answers to the questions with the instructor to develop an experiment to test their hypothesis that estrogen is involved with inducing implantation in the rat. A rat delayed implantation model was established that utilizes an estrogen receptor antagonist (ICI 182,780), which inhibits the action of ovarian estrogens. Groups of mated females are treated with either carrier (control) or ICI 182,780 (ICI) every other day, starting on day 2 postcoitus (pc) until day 8 pc. One-half of the females receiving ICI are injected with estradiol-17β on day 8 pc to induce implantation 4 days after the controls. If the ICI-treated females are not administered estradiol, embryo implantation occurs spontaneously ~4 days after the last ICI injection on day 8. This is a very simple protocol that is very effective and provides an excellent experiment for student discussion on hormone action and the use of agonists and antagonists.

  5. Research utilization in the building industry: decision model and preliminary assessment

    Energy Technology Data Exchange (ETDEWEB)

    Watts, R.L.; Johnson, D.R.; Smith, S.A.; Westergard, E.J.

    1985-10-01

    The Research Utilization Program was conceived as a far-reaching means for managing the interactions of the private sector and the federal research sector as they deal with energy conservation in buildings. The program emphasizes a private-public partnership in planning a research agenda and in applying the results of ongoing and completed research. The results of this task support the hypothesis that the transfer of R and D results to the buildings industry can be accomplished more efficiently and quickly by a systematic approach to technology transfer. This systematic approach involves targeting decision makers, assessing research and information needs, properly formating information, and then transmitting the information through trusted channels. The purpose of this report is to introduce elements of a market-oriented knowledge base, which would be useful to the Building Systems Division, the Office of Buildings and Community Systems and their associated laboratories in managing a private-public research partnership on a rational systematic basis. This report presents conceptual models and data bases that can be used in formulating a technology transfer strategy and in planning technology transfer programs.

  6. Towards utilizing GPUs in information visualization: a model and implementation of image-space operations.

    Science.gov (United States)

    McDonnel, Bryan; Elmqvist, Niklas

    2009-01-01

    Modern programmable GPUs represent a vast potential in terms of performance and visual flexibility for information visualization research, but surprisingly few applications even begin to utilize this potential. In this paper, we conjecture that this may be due to the mismatch between the high-level abstract data types commonly visualized in our field, and the low-level floating-point model supported by current GPU shader languages. To help remedy this situation, we present a refinement of the traditional information visualization pipeline that is amenable to implementation using GPU shaders. The refinement consists of a final image-space step in the pipeline where the multivariate data of the visualization is sampled in the resolution of the current view. To concretize the theoretical aspects of this work, we also present a visual programming environment for constructing visualization shaders using a simple drag-and-drop interface. Finally, we give some examples of the use of shaders for well-known visualization techniques.

  7. Quality improvement in healthcare delivery utilizing the patient-centered medical home model.

    Science.gov (United States)

    Akinci, Fevzi; Patel, Poonam M

    2014-01-01

    Despite the fact that the United States dedicates so much of its resources to healthcare, the current healthcare delivery system still faces significant quality challenges. The lack of effective communication and coordination of care services across the continuum of care poses disadvantages for those requiring long-term management of their chronic conditions. This is why the new transformation in healthcare known as the patient-centered medical home (PCMH) can help restore confidence in our population that the healthcare services they receive are of the utmost quality and will effectively enhance their quality of life. Healthcare using the PCMH model is delivered with the patient at the center of the transformation and by reinvigorating primary care. The PCMH model strives to deliver effective quality care while attempting to reduce costs. In order to relieve some of our healthcare system's distresses, organizations can modify their delivery of care to be patient centered. Enhanced coordination of services, better provider access, self-management, and a team-based approach to care represent some of the key principles of the PCMH model. Patients who can most benefit are those who require long-term management of their conditions, such as chronic disease and behavioral health patient populations. The PCMH is a feasible option for delivery reform as pilot studies have documented successful outcomes. Controversy about the lack of a medical neighborhood has created concern about the overall sustainability of the medical home. The medical home can stand independently and continuously provide enhanced care services as a movement toward higher quality care while organizations and government policy assess what types of incentives to put into place for the full collaboration and coordination of care in the healthcare system.

  8. Optimal urban water conservation strategies considering embedded energy: coupling end-use and utility water-energy models.

    Science.gov (United States)

    Escriva-Bou, A.; Lund, J. R.; Pulido-Velazquez, M.; Spang, E. S.; Loge, F. J.

    2014-12-01

    Although most freshwater resources are used in agriculture, more energy is consumed per unit of water supplied to urban areas. The potential for reducing the carbon footprint of water in cities, including the energy embedded within household uses, is therefore an order of magnitude larger than for other water uses. This characteristic of urban water systems creates a promising opportunity to reduce global greenhouse gas emissions, particularly given rapidly growing urbanization worldwide. Building on a previous Water-Energy-CO2 emissions model for household water end uses, this research introduces a probabilistic two-stage optimization model with technical and behavioral decision variables to obtain the most economical strategies for minimizing household water and water-related energy bills under both water and energy price shocks. Results show that adoption of less energy-intensive appliances increases significantly, producing an overall 20% increase in indoor water conservation, when households account for the energy cost of their water use. To analyze the consequences at utility scale, we develop an hourly water-energy model based on data from East Bay Municipal Utility District (EBMUD) in California, including residential consumption, finding that water end uses account for roughly 90% of total water-related energy, while the 10% managed by the utility costs over $12 million annually. With the combined end-use and utility model, several demand-side management conservation strategies were simulated for the city of San Ramon. In this smaller water district, roughly 5% of total EBMUD water use, we found that the optimal household strategies can reduce total GHG emissions by 4% and cut the utility's energy costs by over $70,000/yr. Especially interesting from the utility perspective could be the "smoothing" of water use peaks by avoiding daytime irrigation, which among other benefits might reduce utility energy costs by 0.5% according to our
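    The end-use costing idea in this abstract can be sketched in a few lines: the bill a household minimizes includes both the water price and the energy embedded in heating or pumping each end use. All prices, volumes, and energy intensities below are illustrative assumptions, not values from the study.

```python
WATER_PRICE = 1.5      # $/m3 (assumed)
ENERGY_PRICE = 0.20    # $/kWh (assumed)

# end use -> (annual volume m3, energy intensity kWh/m3) -- illustrative
END_USES = {
    "shower":     (30.0, 35.0),   # hot water: energy intensive
    "toilet":     (25.0, 0.0),    # cold water only
    "irrigation": (40.0, 0.5),    # outdoor: little embedded energy
}

def annual_bill(end_uses, include_energy=True):
    """Water bill, optionally adding the water-related energy cost."""
    total = 0.0
    for volume, intensity in end_uses.values():
        total += volume * WATER_PRICE
        if include_energy:
            total += volume * intensity * ENERGY_PRICE
    return total

water_only = annual_bill(END_USES, include_energy=False)
with_energy = annual_bill(END_USES, include_energy=True)
print(water_only, with_energy)  # energy-intensive uses dominate the second figure
```

    With numbers like these, hot-water end uses dominate the combined bill, which is why accounting for embedded energy shifts conservation choices toward energy-intensive appliances.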

  9. Optimal growth of Lactobacillus casei in a Cheddar cheese ripening model system requires exogenous fatty acids.

    Science.gov (United States)

    Tan, W S; Budinich, M F; Ward, R; Broadbent, J R; Steele, J L

    2012-04-01

    Flavor development in ripening Cheddar cheese depends on complex microbial and biochemical processes that are difficult to study in natural cheese. Thus, our group has developed Cheddar cheese extract (CCE) as a model system to study these processes. In previous work, we found that CCE supported growth of Lactobacillus casei, one of the most prominent nonstarter lactic acid bacteria (NSLAB) species found in ripening Cheddar cheese, to a final cell density of 10^8 cfu/mL at 37°C. However, when similar growth experiments were performed at 8°C in CCE derived from 4-mo-old cheese (4mCCE), the final cell densities obtained were only about 10^6 cfu/mL, which is at the lower end of the range of the NSLAB population expected in ripening Cheddar cheese. Here, we report that addition of Tween 80 to CCE resulted in a significant increase in the final cell density of L. casei during growth at 8°C and produced concomitant changes in cytoplasmic membrane fatty acid (CMFA) composition. Although the effect was not as dramatic, addition of milk fat or a monoacylglycerol (MAG) mixture based on the MAG profile of milk fat to 4mCCE also led to an increased final cell density of L. casei in CCE at 8°C and changes in CMFA composition. These observations suggest that optimal growth of L. casei in CCE at low temperature requires supplementation with a source of fatty acids (FA). We hypothesize that L. casei incorporates environmental FA into its CMFA, thereby reducing its energy requirement for growth. The exogenous FA may then be modified or supplemented with FA from de novo synthesis to arrive at a CMFA composition that yields the functionality (i.e., viscosity) required for growth in specific conditions. Additional studies utilizing the CCE model to investigate microbial contributions to cheese ripening should be conducted in CCE supplemented with 1% milk fat. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict...

  11. A semi-automated approach for generating natural language requirements documents based on business process models

    NARCIS (Netherlands)

    Aysolmaz, Banu; Leopold, Henrik; Reijers, Hajo A.; Demirörs, Onur

    2018-01-01

    Context: The analysis of requirements for business-related software systems is often supported by using business process models. However, the final requirements are typically still specified in natural language. This means that the knowledge captured in process models must be consistently

  12. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  13. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
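    The model sensitivity analyses mentioned above can be illustrated with a one-at-a-time (OAT) perturbation sketch: rank which inputs most affect a model output, and hence which most need test-measurement correspondence. The toy `model` function and its parameter names are hypothetical, not part of the sail program.

```python
def model(params):
    # toy response: deflection grows with load and area, shrinks with stiffness
    return params["load"] * params["area"] / params["stiffness"]

baseline = {"load": 10.0, "area": 40.0, "stiffness": 2000.0}

def sensitivities(model, baseline, step=0.01):
    """Relative change in output per 1% change in each input (OAT)."""
    base_out = model(baseline)
    result = {}
    for name, value in baseline.items():
        perturbed = dict(baseline)
        perturbed[name] = value * (1.0 + step)
        result[name] = (model(perturbed) - base_out) / (base_out * step)
    return result

print(sensitivities(model, baseline))
```

    Inputs with the largest magnitudes in the result are the ones where validation data must be most accurate; in a real program a global (variance-based) method would complement this local screen.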

  14. Dynamic Flight Simulation Utilizing High Fidelity CFD-Based Nonlinear Reduced Order Model, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall technical objective of the Phase I effort is to develop a nonlinear aeroelastic solver utilizing the FUN3D generated nonlinear aerodynamic Reduced Order...

  15. Mathematical model of a utility firm. Final technical report, Part IIA

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

    This volume is part of a project aimed at developing an understanding of the dynamical processes that evolve within an electric utility firm, and without it. The volume covers organizational dynamics and many-person symmetric games. (DLC)

  16. German Utilities and Distributed PV:How to Overcome Barriers to Business Model Innovation

    OpenAIRE

    Richter, Mario

    2012-01-01

    The transformation of the energy industry towards a more sustainable production of electricity increases the importance of distributed generation from renewable sources such as solar photovoltaic (PV). German utilities have largely failed to benefit from this development and lost 97% of the distributed PV generation market to investors from outside the electric power industry. Recent studies indicate that utilities have to react to prevent revenue erosion and loss of profits. This study ident...

  17. Factor Structure and Predictive Utility of the 2 x 2 Achievement Goal Model in a Sample of Taiwan Students

    Science.gov (United States)

    Chiang, Yu-Tzu; Yeh, Yu-Chen; Lin, Sunny S. J.; Hwang, Fang-Ming

    2011-01-01

    This study examined the structure and predictive utility of the 2 x 2 achievement goal model among Taiwan pre-university school students (ages 10 to 16) who learned Chinese language arts. The confirmatory factor analyses of the Achievement Goal Questionnaire-Chinese version provided a good fit between the factorial and dimensional structures with the…

  18. The Utility of the Prototype/Willingness Model in Predicting Alcohol Use among North American Indigenous Adolescents

    Science.gov (United States)

    Armenta, Brian E.; Hautala, Dane S.; Whitbeck, Les B.

    2015-01-01

    In the present study, we considered the utility of the prototype/willingness model in predicting alcohol use among North-American Indigenous adolescents. Specifically, using longitudinal data, we examined the associations among subjective drinking norms, positive drinker prototypes, drinking expectations (as a proxy of drinking willingness), and…

  19. Fire rehabilitation decisions at landscape scales: utilizing state-and-transition models developed through disturbance response grouping of ecological sites

    Science.gov (United States)

    Recognizing the utility of ecological sites and the associated state-and-transition model (STM) for decision support, the Bureau of Land Management in Nevada partnered with Nevada NRCS and the University of Nevada, Reno (UNR) in 2009 with the goal of creating a team that could (1) expedite developme...

  20. Evaluation Capacity Building in the Context of Military Psychological Health: Utilizing Preskill and Boyle's Multidisciplinary Model

    Science.gov (United States)

    Hilton, Lara; Libretto, Salvatore

    2017-01-01

    The need for evaluation capacity building (ECB) in military psychological health is apparent in light of the proliferation of newly developed, yet untested programs coupled with the lack of internal evaluation expertise. This study addresses these deficiencies by utilizing Preskill and Boyle's multidisciplinary ECB model within a post-traumatic…

  1. A novel approach towards fatigue damage prognostics of composite materials utilizing SHM data and stochastic degradation modeling

    NARCIS (Netherlands)

    Loutas, T.; Eleftheroglou, N.

    2016-01-01

    A prognostic framework is proposed in order to estimate the remaining useful life of composite materials under fatigue loading based on acoustic emission data and a sophisticated Non-Homogeneous Hidden Semi-Markov Model. Bayesian neural networks are also utilized as an alternative machine learning

  2. Models and Tabu Search Metaheuristics for Service Network Design with Asset-Balance Requirements

    DEFF Research Database (Denmark)

    Pedersen, Michael Berliner; Crainic, T.G.; Madsen, Oli B.G.

    2009-01-01

    This paper focuses on a generic model for service network design, which includes asset positioning and utilization through constraints on asset availability at terminals. We denote these relations as "design-balance constraints" and focus on the design-balanced capacitated multicommodity network design model, a generalization of the capacitated multicommodity network design model generally used in service network design applications. Both arc- and cycle-based formulations for the new model are presented. The paper also proposes a tabu search metaheuristic framework for the arc-based formulation...
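    The "design-balance constraints" named above have a simple reading: at every terminal, the number of selected design arcs entering must equal the number leaving, so that assets (e.g., vehicles) can cycle through the network. A minimal feasibility check, with an illustrative toy network rather than anything from the paper:

```python
def is_design_balanced(selected_arcs):
    """selected_arcs: iterable of (origin, destination) design arcs.
    Returns True if, at every terminal, arcs in == arcs out."""
    balance = {}
    for origin, dest in selected_arcs:
        balance[origin] = balance.get(origin, 0) - 1   # arc leaves origin
        balance[dest] = balance.get(dest, 0) + 1       # arc enters destination
    return all(v == 0 for v in balance.values())

cycle = [("A", "B"), ("B", "C"), ("C", "A")]   # assets return home: balanced
unbalanced = [("A", "B"), ("B", "C")]          # asset stranded at C
print(is_design_balanced(cycle), is_design_balanced(unbalanced))
```

    In the arc-based formulation these conditions appear as linear equalities on the binary design variables, one per terminal.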

  3. The utility of bathymetric echo sounding data in modelling benthic impacts using NewDEPOMOD driven by an FVCOM model.

    Science.gov (United States)

    Rochford, Meghan; Black, Kenneth; Aleynik, Dmitry; Carpenter, Trevor

    2017-04-01

    The Scottish Environmental Protection Agency (SEPA) is currently implementing new regulations for consenting developments at new and pre-existing fish farms. Currently, a 15-day current record from multiple depths at one location near the site is required to run DEPOMOD, a depositional model used to determine the depositional footprint of waste material from fish farms, developed by Cromey et al. (2002). The present project involves modifying DEPOMOD to accept data from 3D hydrodynamic models to allow a more accurate representation of the currents around the farms. Bathymetric data are key boundary conditions for accurate modelling of current velocity data. The aim of the project is to create a script that will take the outputs from FVCOM, a 3D hydrodynamic model developed by Chen et al. (2003), and input them into NewDEPOMOD (a new version of DEPOMOD with more accurately parameterised sediment transport processes) to determine the effect of a fish farm on the surrounding environment. This study compares current velocity data under two scenarios: the first using interpolated bathymetric data, and the second using bathymetric data collected during a bathymetric echo sounding survey of the site. Theoretically, if the hydrodynamic model is of high enough resolution, the two scenarios should yield relatively similar results. However, the expected result is that the survey data will be of much higher resolution and therefore of better quality, producing more realistic velocity results. The improvement of bathymetric data will also improve sediment transport predictions in NewDEPOMOD. This work will determine the sensitivity of model predictions to bathymetric data accuracy at a range of sites with varying bathymetric complexity and thus give information on the potential costs and benefits of echo sounding survey data inputs. Chen, C., Liu, H. and Beardsley, R.C., 2003. An unstructured grid, finite-volume, three-dimensional, primitive equations ocean model

  4. Investigation of the effects of external current systems on the MAGSAT data utilizing grid cell modeling techniques

    Science.gov (United States)

    Klumpar, D. M. (Principal Investigator)

    1981-01-01

    Efforts devoted to reading MAGSAT data tapes in preparation for further analysis of the MAGSAT data are discussed. A modeling procedure developed to compute the magnetic fields at satellite orbit due to hypothesized current distributions in the ionosphere and magnetosphere is described. This technique utilizes a linear current element representation of the large-scale space-current system. Several examples of the model field perturbations computed along hypothetical satellite orbits are shown.

  5. Investigation of the effects of external current systems on the MAGSAT data utilizing grid cell modeling techniques

    Science.gov (United States)

    Klumpar, D. M. (Principal Investigator)

    1982-01-01

    The feasibility of modeling magnetic fields due to certain electrical currents flowing in the Earth's ionosphere and magnetosphere was investigated. A method was devised to carry out forward modeling of the magnetic perturbations that arise from space currents. The procedure utilizes a linear current element representation of the distributed electrical currents. The finite thickness elements are combined into loops which are in turn combined into cells having their base in the ionosphere. In addition to the extensive field modeling, additional software was developed for the reduction and analysis of the MAGSAT data in terms of the external current effects. Direct comparisons between the models and the MAGSAT data are possible.
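    The "linear current element" representation described in these two MAGSAT records reduces to the analytic Biot-Savart field of a finite straight current segment, summed over all elements of the loop. A hedged sketch of that single-segment building block, with illustrative geometry and current (not MAGSAT parameters):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def segment_field(a, b, p, current):
    """Magnetic field (3-vector, tesla) at point p due to a straight
    segment carrying `current` (amperes) from point a to point b."""
    ab = [b[i] - a[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in ab))
    e = [c / length for c in ab]                 # unit vector along segment
    r1 = [p[i] - a[i] for i in range(3)]
    r2 = [p[i] - b[i] for i in range(3)]
    n1 = math.sqrt(sum(c * c for c in r1))
    n2 = math.sqrt(sum(c * c for c in r2))
    cos1 = sum(e[i] * r1[i] for i in range(3)) / n1
    cos2 = sum(e[i] * r2[i] for i in range(3)) / n2
    # direction from e x r1; its norm is the perpendicular distance d
    cx = [e[1] * r1[2] - e[2] * r1[1],
          e[2] * r1[0] - e[0] * r1[2],
          e[0] * r1[1] - e[1] * r1[0]]
    d = math.sqrt(sum(c * c for c in cx))
    mag = MU0 * current / (4 * math.pi * d) * (cos1 - cos2)
    return [mag * c / d for c in cx]

# sanity check: a very long segment approaches the infinite-wire field mu0*I/(2*pi*d)
field = segment_field([-1e6, 0, 0], [1e6, 0, 0], [0.0, 0.1, 0.0], 100.0)
print(field)  # ~[0, 0, 2e-4] tesla
```

    Finite-thickness elements, as in the record above, are then combined into loops and cells; superposing `segment_field` over every element gives the model perturbation along a satellite orbit.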

  6. Linking nitrogen deposition to nitrate concentrations in groundwater below nature areas : modelling approach and data requirements

    NARCIS (Netherlands)

    Bonten, L.T.C.; Mol-Dijkstra, J.P.; Wieggers, H.J.J.; Vries, de W.; Pul, van W.A.J.; Hoek, van den K.W.

    2009-01-01

    This study determines the most suitable model and required model improvements to link atmospheric deposition of nitrogen and other elements in the Netherlands to measurements of nitrogen and other elements in the upper groundwater. The deterministic model SMARTml was found to be the most suitable

  7. Analysis of the 918th Contracting Battalion and 410th Contracting Support Brigade Utilizing the Contract Management Maturity Model

    Science.gov (United States)

    2015-12-01

    Opportunity, Threat (SWOT) analysis is an example of an assessment that does just that. Regardless of which type of organizational assessment is used, the... (Rendon, 2015a, p. 19). These processes include key activities such as conducting requirements analysis and definition, market research, developing a... Monterey, CA 93943. Analysis of the 918th Contracting Battalion and 410th Contracting Support Brigade Utilizing the Contract Management Maturity

  8. OAM system based on TMN for utility telecommunication network. Proposal of modeling method about managed objects; TMN ni motozuku denryoku tsushinmo no un`yo kanri system. Kanri object no sekkei shuho ni kansuru kento

    Energy Technology Data Exchange (ETDEWEB)

    Hirozawa, T.; Yusa, H.; Otani, T. [Central Research Institute of Electric Power Industry, Tokyo (Japan); Okamura, K. [Tokyo Electric Power Co. Inc., Tokyo (Japan)

    1996-03-01

    To construct an advanced operation and management system for a utility telecommunications management network (TMN), this paper proposes a modeling method for the managed objects (MOs) required by managing and managed systems, such as an asynchronous transfer mode (ATM) exchanger. Flexible line setting and path switching control are required for the advanced TMN, which must cope flexibly with the extension and modification of functions. Roles were first assigned to the managing and managed sides. Then, structures of objects such as facilities and logic data, and their interactions, were modeled. Common management functions and the objects of each function were classified. Based on the TMN standard and the MOs of the existing utility-specific design, new utility-specific MOs were defined according to the models. The existing MOs can be utilized effectively, and the optimal MOs to incorporate can be identified. The utility-specific MOs are added to the common specification of the electric power industry. Since they can be reused when functions are extended or modified, costs can be reduced. MOs applicable to utility path switching control were designed as a trial. 9 refs., 16 figs., 10 tabs.

  9. Adolescent idiopathic scoliosis screening for school, community, and clinical health promotion practice utilizing the PRECEDE-PROCEED model

    Directory of Open Access Journals (Sweden)

    Wyatt Lawrence A

    2005-11-01

    Background: Screening for adolescent idiopathic scoliosis (AIS) is a commonly performed procedure for school children during the high-risk years. The PRECEDE-PROCEED (PP) model is a health promotion planning model that has not been utilized for the clinical diagnosis of AIS. The purpose of this research is to study AIS in the school-age population using the PP model and its relevance for community, school, and clinical health promotion. Methods: MEDLINE was utilized to locate AIS data. Studies were screened for relevance and applicability under the auspices of the PP model. Where data were unavailable, expert opinion was utilized based on consensus. Results: The social assessment of quality of life is limited, with few studies addressing the long-term effects of AIS. Epidemiologically, AIS is the most common form of scoliosis and the leading orthopedic problem in children. Behavioral/environmental studies focus on discovering etiologic relationships, yet these data are confounded because AIS is not behavioral; illness and parenting health behaviors can, however, be appreciated. The educational diagnosis is likewise confounded because AIS is an orthopedic disorder, not a behavioral one. The administration/policy diagnosis is hindered in that scoliosis screening programs are not considered cost-effective. Policies are determined in some schools because 26 states mandate school scoliosis screening. There exists potential error with the Adam's test. The most widely used measure in the PP model, the Health Belief Model, has not been utilized in any AIS research. Conclusion: The PP model is a useful tool for a comprehensive study of a particular health concern. This research showed where gaps in AIS research exist, suggesting that there may be problems with the implementation of school screening. Until these research disparities are filled, implementation of AIS screening by school, community, and clinical health promotion will be compromised. Lack of data and perceived importance by

  10. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, the model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) and model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turn to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based reward expectations into the value signal is further supported by efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.
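    The model-free signal at the heart of this account is the temporal-difference (TD) reward prediction error, the quantity the abstract describes as converging on the VS. A minimal TD(0) sketch with illustrative states and parameters (not from the paper):

```python
ALPHA, GAMMA = 0.1, 0.9   # learning rate, discount factor (assumed)

def td_update(values, state, reward, next_state):
    """One TD(0) step: move V(state) toward reward + gamma * V(next_state).
    Returns the reward prediction error driving the update."""
    prediction_error = reward + GAMMA * values[next_state] - values[state]
    values[state] += ALPHA * prediction_error
    return prediction_error

values = {"cue": 0.0, "outcome": 0.0}
# repeated cue -> outcome transitions, each delivering reward 1.0
errors = [td_update(values, "cue", 1.0, "outcome") for _ in range(50)]
print(errors[0], errors[-1], values["cue"])
```

    The prediction error starts large and decays as the cue's value converges on the expected reward, matching the "learning takes place if expectations are violated" formulation; a model-based system would instead supply the expectation term from its context frames.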

  11. A Comprehensive Energy Analysis and Related Carbon Footprint of Dairy Farms, Part 2: Investigation and Modeling of Indirect Energy Requirements

    Directory of Open Access Journals (Sweden)

    Giuseppe Todde

    2018-02-01

    Dairy cattle farms are continuously developing more intensive management systems, which require higher utilization of durable and non-durable inputs. These inputs are responsible for significant direct and indirect fossil energy requirements, which are related to considerable emissions of CO2. This study focused on investigating the indirect energy requirements of 285 conventional dairy farms and the related carbon footprint. A detailed analysis of the indirect energy inputs related to farm buildings, machinery and agricultural inputs was carried out. A partial life cycle assessment approach was used to evaluate indirect energy inputs and the carbon footprint of farms over one harvest year. The investigation highlights the weight of agricultural inputs, which represent more than 80% of total indirect energy requirements. The analyses also underline that the assumption of similarity among dairy farms in terms of indirect energy requirements and related carbon emissions is incorrect, especially across different farm sizes and milk production levels. In addition, a mathematical model to estimate the indirect energy requirements of dairy farms has been developed, providing an instrument for researchers to assess the energy incorporated into farm machinery, agricultural inputs and buildings. Combining the results of this two-part series, the total energy demand (expressed in GJ per farm) is mostly due to agricultural inputs and fuel consumption, which account for the largest share of the annual requirements in each milk yield class. Direct and indirect energy requirements increased from 1302 GJ·y−1 for small farms to 5109 GJ·y−1 for larger ones. However, the related carbon dioxide emissions expressed per 100 kg of milk showed a negative trend going from the <5000 to the >9000 kg milk yield class, where larger farms were able to
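    The style of indirect-energy accounting described here can be sketched as annual input quantities multiplied by embodied-energy coefficients, with durable assets (machinery, buildings) amortized over their service life. All coefficients and quantities below are illustrative assumptions, not figures from the study.

```python
# agricultural inputs: (annual quantity, embodied energy MJ per unit) -- illustrative
INPUTS = {
    "fertilizer_kg": (12000, 45.0),
    "feed_kg":       (300000, 5.0),
    "plastics_kg":   (2000, 80.0),
}

# durable assets: (total embodied energy MJ, service life in years) -- illustrative
DURABLES = {
    "tractor": (900000, 15),
    "barn":    (2400000, 40),
}

def annual_indirect_energy(inputs, durables):
    """Total indirect energy, GJ per year: consumables plus amortized durables."""
    mj = sum(qty * coeff for qty, coeff in inputs.values())
    mj += sum(embodied / life for embodied, life in durables.values())
    return mj / 1000.0  # MJ -> GJ

print(annual_indirect_energy(INPUTS, DURABLES))
```

    Even in this toy version the consumable inputs dominate the total, echoing the study's finding that agricultural inputs account for more than 80% of indirect energy.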

  12. Analysis and Characterization of Damage and Failure Utilizing a Generalized Composite Material Model Suitable for Use in Impact Problems

    Science.gov (United States)

    Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Khaled, Bilal; Hoffarth, Canio; Rajan, Subramaniam; Blankenhorn, Gunther

    2016-01-01

    A material model which incorporates several key capabilities that have been identified by the aerospace community as lacking in state-of-the-art composite impact models is under development. In particular, a next generation composite impact material model, jointly developed by the FAA and NASA, is being implemented into the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage, and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters (such as modulus and strength). The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is utilized to allow for the uncoupling of the deformation and damage analyses. In the damage model, a semi-coupled approach is employed where the overall damage in a particular coordinate direction is assumed to be a multiplicative combination of the damage in that direction resulting from the applied loads in the various coordinate directions. Due to the fact that the plasticity and damage models are uncoupled, test procedures and methods to both characterize the damage model and to convert the material stress-strain curves from the true (damaged) stress space to the effective (undamaged) stress space have been developed. A methodology has been developed to input the experimentally determined composite failure surface in a tabulated manner. An analytical approach is then utilized to track how close the current stress state is to the failure surface.
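    Two of the ideas named in this abstract can be written out in scalar form: the strain-equivalent mapping between true (damaged) and effective (undamaged) stress, and the semi-coupled, multiplicative combination of directional damage. The numbers below are made up for illustration.

```python
def effective_stress(true_stress, damage):
    """Strain-equivalent mapping: true (damaged) stress -> effective
    (undamaged) stress space, scalar form."""
    return true_stress / (1.0 - damage)

def combined_damage(directional_damages):
    """Semi-coupled combination: overall damage in one direction is
    1 - product of (1 - d_j) over the contributing load directions."""
    remaining = 1.0
    for d in directional_damages:
        remaining *= (1.0 - d)
    return 1.0 - remaining

sigma_true = 400.0                        # MPa, illustrative
d_total = combined_damage([0.10, 0.05])   # damage from two load directions
print(d_total, effective_stress(sigma_true, d_total))
```

    Because the damage mapping is applied after the plasticity solution, the two analyses stay uncoupled, which is what lets the tabulated curves be characterized in effective stress space.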

  13. Utilizing ARC EMCS Seedling Cassettes as Highly Versatile Miniature Growth Chambers for Model Organism Experiments

    Science.gov (United States)

    Freeman, John L.; Steele, Marianne K.; Sun, Gwo-Shing; Heathcote, David; Reinsch, S.; DeSimone, Julia C.; Myers, Zachary A.

    2014-01-01

    The aim of our ground testing was to demonstrate the capability of safely putting specific model organisms into dehydrated stasis, and of later rehydrating and successfully growing them inside flight-proven ARC EMCS seedling cassettes. The ARC EMCS seedling cassettes were originally developed to support seedling growth during space flight. The seeds are attached to a solid substrate, launched dry, and then rehydrated in a small volume of media on orbit to initiate the experiment. We hypothesized that the same seedling cassettes should be capable of acting as culture chambers for a wide range of organisms with minimal or no modification. The ability to safely preserve live organisms in a dehydrated state allows on-orbit experiments to be conducted at the best time for crew operations and, more importantly, provides a tightly controlled, physiologically relevant growth experiment with specific environmental parameters. Thus, we performed a series of ground tests that involved growing the organisms, preparing them for dehydration on gridded polyether sulfone (PES) membranes, dry storage at ambient temperatures for varying periods of time, followed by rehydration. Inside the culture cassettes, the PES membranes were mounted above blotters containing dehydrated growth media. These were mounted on stainless steel bases and sealed with plastic covers that have permeable-membrane-covered ports for gas exchange. The results demonstrated acceptable growth of C. elegans (nematodes), E. coli (bacteria), S. cerevisiae (yeast), Polytrichum (moss) spores and protonemata, C. thalictroides (fern), D. discoideum (amoeba), and H. dujardini (tardigrades). All organisms showed acceptable growth and rehydration in both petri dishes and culture cassettes, initially and after various lengths of dehydration. At the end of on-orbit ISS European Modular Cultivation System experiments the cassettes could be frozen at ultra-low temperatures, refrigerated, or chemically

  14. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    Science.gov (United States)

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of the product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture in which we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found that engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
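    A Plackett-Burman screen for 11 two-level factors, as used above, needs only 12 runs. The standard 12-run construction (Plackett & Burman, 1946) cycles a generator row and appends an all-low row; the sketch below builds that design matrix, with +1/-1 coding for the high/low level of each process variable.

```python
# standard generator row for the 12-run Plackett-Burman design
GENERATOR = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    """12 x 11 design matrix: 11 cyclic shifts of the generator
    plus a final all-(-1) row."""
    rows = [GENERATOR[i:] + GENERATOR[:i] for i in range(11)]
    rows.append([-1] * 11)
    return rows

design = plackett_burman_12()
# each factor is run at each level six times, and any two columns are
# orthogonal (their elementwise products sum to zero), so main effects
# can be estimated independently (assuming interactions are negligible)
print(len(design), sum(row[0] for row in design))
```

    Each row is one parallel culture condition; regressing the measured response (e.g., a glycan fraction) on the columns gives the main effect of each of the 11 variables.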

  15. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method, using linear regression and time series analysis to build a predictive model that forecasts future-year man-hour and funding requirements for unscheduled maintenance...

  16. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  17. Towards a framework for improving goal-oriented requirement models quality

    OpenAIRE

    Cares, Carlos; Franch Gutiérrez, Javier

    2009-01-01

    Goal-orientation is a widespread and useful approach to Requirements Engineering. However, quality assessment frameworks focused on goal-oriented processes are either limited or remain on the theoretical side. Requirements quality initiatives range from simple metrics applicable to requirements documents, to general-purpose quality frameworks that include syntactic, semantic and pragmatic concerns. In some recent works, we have proposed a metrics framework for goal-oriented models, b...

  18. A manpower training requirements model for new weapons systems, with applications to the infantry fighting vehicle

    OpenAIRE

    Kenehan, Douglas J.

    1981-01-01

    Approved for public release; distribution is unlimited. This thesis documents the methodology and parameters used in designing a manpower training requirements model for new weapons systems. This model provides manpower planners with the capability of testing alternative fielding policies and adjusting model parameters to improve the use of limited personnel resources. Use of the model is illustrated in a detailed analysis of the planned introduction of the Infantry Fighting Vehicle into t...

  19. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, lead to higher density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  20. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...

  1. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-01-01

    Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  2. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of modeling in software engineering can help alleviate those problems. Taking inspiration…

  3. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve their performance is to execute and reason about the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A data dependency analysis technique is first applied to the logic-based specification to find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can therefore reduce the total number of nodes searched in the tree, the total processes that need to be generated, and the total communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  4. Utility values associated with advanced or metastatic non-small cell lung cancer: data needs for economic modeling.

    Science.gov (United States)

    Brown, Jacqueline; Cook, Keziah; Adamski, Kelly; Lau, Jocelyn; Bargo, Danielle; Breen, Sarah; Chawla, Anita

    2017-04-01

    Cost-effectiveness analyses often inform healthcare reimbursement decisions. The preferred measure of effectiveness is the quality adjusted life year (QALY) gained, where the quality of life adjustment is measured in terms of utility. Areas covered: We assessed the availability and variation of utility values for health states associated with advanced or metastatic non-small cell lung cancer (NSCLC) to identify values appropriate for cost-effectiveness models assessing alternative treatments. Our systematic search of six electronic databases (January 2000 to August 2015) found the current literature to be sparse in terms of utility values associated with NSCLC, identifying 27 studies. Utility values were most frequently reported over time and by treatment type, and less frequently by disease response, stage of disease, adverse events or disease comorbidities. Expert commentary: In response to rising healthcare costs, payers increasingly consider the cost-effectiveness of novel treatments in reimbursement decisions, especially in oncology. As the number of therapies available to treat NSCLC increases, cost-effectiveness analyses will play a key role in reimbursement decisions in this area. Quantifying the relationship between health and quality of life for NSCLC patients via utility values is an important component of assessing the cost effectiveness of novel treatments.
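
The QALY arithmetic that such utility values feed into is simple enough to sketch. Below is a minimal illustration in Python; the utility values and state durations are hypothetical placeholders, not figures from the review above.

```python
def qalys(states):
    """Quality-adjusted life years: sum of utility * duration (in years)."""
    return sum(u * t for u, t in states)

# Hypothetical NSCLC pathway (illustrative values only):
# 1.5 y progression-free at utility 0.75, then 0.8 y post-progression at 0.55.
base = qalys([(0.75, 1.5), (0.55, 0.8)])    # ~1.57 QALYs
# A hypothetical treatment extending progression-free survival to 2.1 y:
treat = qalys([(0.75, 2.1), (0.55, 0.8)])   # ~2.02 QALYs
incremental = treat - base                  # ~0.45 QALYs gained
```

The incremental QALY figure is the denominator of the cost-effectiveness ratio that reimbursement decisions compare against a willingness-to-pay threshold.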

  5. An optimization model for carbon capture & storage/utilization vs. carbon trading: A case study of fossil-fired power plants in Turkey.

    Science.gov (United States)

    Ağralı, Semra; Üçtuğ, Fehmi Görkem; Türkmen, Burçin Atılgan

    2018-06-01

    We consider fossil-fired power plants that operate in an environment where a cap and trade system is in operation. These plants need to choose between carbon capture and storage (CCS), carbon capture and utilization (CCU), or carbon trading in order to obey emissions limits enforced by the government. We develop a mixed-integer programming model that decides on the capacities of carbon capture units, if it is optimal to install them; the transportation network that needs to be built for transporting the captured carbon; and the locations of storage sites, if they are to be built. The main restrictions on the system are the minimum and maximum capacities of the different parts of the pipeline network, the amount of carbon that can be sold to companies for utilization, and the capacities of the storage sites. Under these restrictions, the model aims to minimize the net present value of the sum of the costs associated with installation and operation of the carbon capture unit and the transportation of carbon, the storage cost in the case of CCS, the cost (or revenue) that results from the emissions trading system, and finally the negative revenue from selling the carbon to other entities for utilization. We implement the model in the General Algebraic Modeling System (GAMS) using data associated with two coal-fired power plants located in different regions of Turkey. We choose enhanced oil recovery (EOR) as the process in which the carbon would be utilized. The results show that CCU is preferable to CCS as long as there is sufficient demand in the EOR market. The distance between the location of emission and the location of utilization/storage, and the capacity limits on the pipes, are important factors in deciding between carbon capture and carbon trading. At carbon prices over $15/ton, carbon capture becomes preferable to carbon trading. These results show that as far as Turkey is concerned, CCU should be prioritized as a means of reducing nation-wide carbon emissions in an…
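
The capture-versus-trading trade-off described in this record can be sketched as a simple NPV comparison. This is an illustrative toy, not the paper's mixed-integer model; all figures (capex, opex, EOR price, discount rate, horizon) are hypothetical placeholders.

```python
def cheapest_option(tons_per_year, carbon_price, years=20, rate=0.08,
                    capture_capex=60e6, capture_opex_per_ton=25.0,
                    eor_price_per_ton=20.0):
    """Compare the NPV cost of buying allowances ("trade") against
    installing capture and selling the CO2 for EOR ("capture").
    All cost figures are hypothetical placeholders, not the study's data."""
    # Present value of $1/year over the horizon at the given discount rate.
    annuity = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    trade_npv = carbon_price * tons_per_year * annuity
    capture_npv = (capture_capex
                   + (capture_opex_per_ton - eor_price_per_ton)
                   * tons_per_year * annuity)
    if capture_npv < trade_npv:
        return ("capture", capture_npv)
    return ("trade", trade_npv)

cheapest_option(1_000_000, 5.0)[0]    # "trade" at a low carbon price
cheapest_option(1_000_000, 30.0)[0]   # "capture" once the price is high enough
```

Sweeping `carbon_price` upward reproduces the qualitative threshold behavior the study reports, with the crossover point depending on the assumed capture costs and EOR revenue.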

  6. Impact of physiologically based pharmacokinetic models on regulatory reviews and product labels: Frequent utilization in the field of oncology.

    Science.gov (United States)

    Yoshida, K; Budha, N; Jin, J Y

    2017-05-01

    Physiologically based pharmacokinetic (PBPK) modeling can be used to predict drug pharmacokinetics in virtual populations using models that integrate understanding of physiological systems. PBPK models have been widely utilized for predicting pharmacokinetics in clinically untested scenarios during drug applications and regulatory reviews in recent years. Here, we provide a comprehensive review of the application of PBPK in new drug application (NDA) review documents from the US Food and Drug Administration (FDA) in the past 4 years. © 2017 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  7. Mathematical model of a utility firm. Final technical report, Part III

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-21

    This project is aimed at understanding the economic and behavioral processes that take place within a utility firm, and without it. This volume covers dynamics of economic systems (Phase II of the project): economic equilibrium theory, discrete economics, exchange economics, production economics, approach to equilibrium.

  8. Physician Requirements-1990. For Cardiology.

    Science.gov (United States)

    Tracy, Octavious; Birchette-Pierce, Cheryl

    Professional requirements for physicians specializing in cardiology were estimated to assist policymakers in developing guidelines for graduate medical education. The determination of physician requirements was based on an adjusted-needs model rather than a demand or utilization model. For each illness, manpower requirements were modified by the…

  9. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test Bed (X-56A) aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the X-56A. A ground-vibration-test-validated structural dynamic finite element model of the X-56A is created in this study. The structural dynamic finite element model of the X-56A is improved using a model tuning tool. In this study, two different weight configurations of the X-56A have been improved in a single optimization run.

  10. Elastic Model Transitions: a Hybrid Approach Utilizing Quadratic Inequality Constrained Least Squares (LSQI) and Direct Shape Mapping (DSM)

    Science.gov (United States)

    Jurenko, Robert J.; Bush, T. Jason; Ottander, John A.

    2014-01-01

    A method for transitioning linear time invariant (LTI) models in time varying simulation is proposed that utilizes both quadratically constrained least squares (LSQI) and Direct Shape Mapping (DSM) algorithms to determine physical displacements. This approach is applicable to the simulation of the elastic behavior of launch vehicles and other structures that utilize multiple LTI finite element model (FEM) derived mode sets that are propagated throughout time. The time invariant nature of the elastic data for discrete segments of the launch vehicle trajectory presents a problem of how to properly transition between models while preserving motion across the transition. In addition, energy may vary between flex models when using a truncated mode set. The LSQI-DSM algorithm can accommodate significant changes in energy between FEM models and carries elastic motion across FEM model transitions. Compared with previous approaches, the LSQI-DSM algorithm shows improvements ranging from a significant reduction to a complete removal of transients across FEM model transitions as well as maintaining elastic motion from the prior state.

  11. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper

    2009-01-01

    BACKGROUND: There is increasing awareness that meta-analyses require a sufficiently large information size to detect or reject an anticipated intervention effect. The required information size in a meta-analysis may be calculated from an anticipated a priori intervention effect or from an intervention effect suggested by trials with low risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta-analysis … between-trial variability and a sampling error estimate considering the required information size. D2 is different from the intuitively obvious adjusting factor based on the common quantification of heterogeneity, the inconsistency (I2), which may underestimate the required information size. Thus, D2 and I2 are compared…
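
As a rough sketch of the diversity adjustment this record describes (stated here as an assumption from the abstract, not a quotation of the paper's derivation): D² can be read as the relative reduction in the variance of the pooled estimate when moving from the random-effects to the fixed-effect model, and the required information size is then inflated by 1/(1 − D²). The variance values below are hypothetical.

```python
def diversity_adjustment(var_fixed, var_random):
    """D^2 = relative variance reduction from random- to fixed-effect pooling;
    the required information size scales by the factor 1 / (1 - D^2).
    Inputs are the variances of the pooled estimate under each model."""
    d2 = (var_random - var_fixed) / var_random
    return d2, 1.0 / (1.0 - d2)

d2, factor = diversity_adjustment(var_fixed=0.04, var_random=0.10)
# d2 ~ 0.6, factor ~ 2.5: under these (made-up) variances the meta-analysis
# needs about 2.5x the fixed-effect information size.
```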

  12. Driving forces behind the increasing cardiovascular treatment intensity. A dynamic epidemiologic model of trends in Danish cardiovascular drug utilization.

    DEFF Research Database (Denmark)

    Kildemoes, Helle Wallach; Andersen, Morten

    Background: In many Western countries cardiovascular treatment intensity (DDD/1000 inhabitants/day, DDD/TID) has grown substantially during the last decades. Changed drug utilization patterns, rather than population ageing, were hypothesized to be the main driving force behind the growth. Objectives: To investigate the driving forces behind the increasing treatment prevalence of cardiovascular drugs, in particular statins, by means of a dynamic epidemiologic drug utilization model. Methods: Material: All Danish residents older than 20 years by January 1, 1996 (4.0 million inhabitants), were followed with respect to out-of-hospital redemptions of cardiovascular prescription drugs in the period 1996-2005. The impact of population ageing on cardiovascular treatment intensity was investigated by comparing crude and age/gender standardised intensities. Epidemiologic model: We developed a three…

  13. Model project to promote cultivation and utilization of renewable resources. Modellvorhaben zur Foerderung des Anbaus und der Verwertung nachwachsender Rohstoffe

    Energy Technology Data Exchange (ETDEWEB)

    1991-09-01

    This revised report on the model projects presents individual projects and measures complementary to each other, documenting, in their totality, an advanced state of development. Moreover, it shows that the basic challenge of a model project, especially in the field of the energetic use of biomass, can be met by marrying agriculture to power utilities. Thus, projects are under way in which the cultivation of China reed and its utilization in power-and-heat cogeneration plants will, in the future, complement each other. Further questions not represented in the research programme of Lower Saxony are dealt with at the federal level, so that the field of renewable resources may currently be considered comprehensively covered. (orig./EF).

  14. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  15. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  17. Knowledge Based Characterization of Cross-Models Constraints to Check Design and Modeling Requirements

    Science.gov (United States)

    Simonn Zayas, David; Monceaux, Anne; Ait-Ameur, Yamine

    2011-08-01

    Nowadays, the complexity of systems frequently implies different engineering teams handling various descriptive models. With each team having a variety of expertise backgrounds, domain knowledge and modeling practices, the heterogeneity of the models themselves is a logical consequence. Therefore, even when models are individually well managed, their diversity becomes a problem when engineers need to share their models to perform overall validations. One way of reducing this heterogeneity is to take into consideration the implicit knowledge which is not contained in the models but is essential to understanding them. In the first stage of our research, we defined and implemented an approach recommending the formalization of implicit knowledge to enrich models in order to ease cross-model checks. Nevertheless, to fill the gap between the specification of the system and the validation of a cross-model constraint, in this paper we suggest giving values to some relevant characteristics to reinforce the approach.

  18. Numerical approach to optimal portfolio in a power utility regime-switching model

    Science.gov (United States)

    Gyulov, Tihomir B.; Koleva, Miglena N.; Vulkov, Lubin G.

    2017-12-01

    We consider a system of weakly coupled degenerate semi-linear parabolic equations of optimal portfolio in a regime-switching with power utility function, derived by A.R. Valdez and T. Vargiolu [14]. First, we discuss some basic properties of the solution of this system. Then, we develop and analyze implicit-explicit, flux limited finite difference schemes for the differential problem. Numerical experiments are discussed.

  19. Promoting remyelination: utilizing a viral model of demyelination to assess cell-based therapies

    OpenAIRE

    Marro, Brett S; Blanc, Caroline A; Loring, Jeanne F; Cahalan, Michael D; Lane, Thomas E

    2014-01-01

    Multiple sclerosis (MS) is a chronic inflammatory disease of the CNS. While a broad range of therapeutics effectively reduce the incidence of focal white matter inflammation and plaque formation for patients with relapse-remitting forms of MS, a challenge within the field is to develop therapies that allow for axonal protection and remyelination. In the last decade, growing interest has focused on utilizing neural precursor cells (NPCs) to promote remyelination. To understand how NPCs functio...

  20. Utilization of MRI for Cerebral White Matter Injury in a Hypobaric Swine Model-Validation of Technique

    Science.gov (United States)

    2017-05-23

    …disruption of axonal integrity as measured by DTI. The traditional neuropathological model posits gaseous embolic damage to the brain as the main … embolic cerebral damage and change in permeability of the blood brain barrier (BBB). DSC utilizes an exogenous contrast imaging agent that … spatial (1.6 × 1.6 × 2.0 mm) resolution, acquiring 23 slices for full brain coverage with no gaps. Imaging slices were prescribed axially. Image…

  1. Utilization of design data on conventional system to building information modeling (BIM)

    Science.gov (United States)

    Akbar, Boyke M.; Z. R., Dewi Larasati

    2017-11-01

    Nowadays, infrastructure development has become one of the main priorities in developing countries such as Indonesia. The conventional design system is considered no longer effective in supporting infrastructure projects, especially for highly complex building designs, due to its fragmented-system issues. BIM comes as one of the solutions for managing projects in an integrated manner. Despite all the known BIM benefits, there are some obstacles in the migration process to BIM. The two main obstacles are the unpreparedness of some project parties for BIM implementation, and a concern about leaving behind the existing database and creating a new one in the BIM system. This paper discusses the possibilities of utilizing existing CAD data from the conventional design system in a BIM system. The existing conventional CAD data and the BIM design system output were studied to examine compatibility issues between the two, followed by possible utilization scheme strategies. The goal of this study is to increase project parties' eagerness to migrate to BIM by maximizing the utilization of existing data, which could hopefully also increase BIM-based project workflow quality.

  2. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings due to its low number of parameters and high accuracy. The work aims to identify the minimum amount of input data required for parameterizing an accurate model of the PV plant. The analysis was carried out for both amorphous silicon (a-Si) and cadmium telluride (CdTe), using crystalline silicon (c-Si) as a base for comparison. In the studied cases…
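
For reference, the core of the PVWatts DC model mentioned above scales the array's nameplate rating by plane-of-array irradiance and applies a linear cell-temperature correction. A minimal sketch (the parameter values below are illustrative, not from this study):

```python
def pvwatts_dc(g_poa, t_cell, p_dc0, gamma=-0.0035, t_ref=25.0):
    """PVWatts DC power: nameplate p_dc0 (W) scaled by plane-of-array
    irradiance g_poa (W/m^2, relative to 1000 W/m^2 reference) and a
    linear temperature coefficient gamma (1/degC) about t_ref."""
    return (g_poa / 1000.0) * p_dc0 * (1.0 + gamma * (t_cell - t_ref))

# A 5 kW string at 800 W/m^2 and 45 degC cell temperature:
p = pvwatts_dc(800, 45, 5000)   # 4000 W * (1 - 0.0035 * 20) ~ 3720 W
```

The small parameter count (nameplate power, temperature coefficient, losses) is what makes the model attractive for on-line monitoring with limited input data.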

  3. Clinical Utility and Safety of a Model-Based Patient-Tailored Dose of Vancomycin in Neonates.

    Science.gov (United States)

    Leroux, Stéphanie; Jacqz-Aigrain, Evelyne; Biran, Valérie; Lopez, Emmanuel; Madeleneau, Doriane; Wallon, Camille; Zana-Taïeb, Elodie; Virlouvet, Anne-Laure; Rioualen, Stéphane; Zhao, Wei

    2016-04-01

    Pharmacokinetic modeling has often been applied to evaluate vancomycin pharmacokinetics in neonates. However, clinical application of the model-based personalized vancomycin therapy is still limited. The objective of the present study was to evaluate the clinical utility and safety of a model-based patient-tailored dose of vancomycin in neonates. A model-based vancomycin dosing calculator, developed from a population pharmacokinetic study, has been integrated into the routine clinical care in 3 neonatal intensive care units (Robert Debré, Cochin Port Royal, and Clocheville hospitals) between 2012 and 2014. The target attainment rate, defined as the percentage of patients with a first therapeutic drug monitoring serum vancomycin concentration achieving the target window of 15 to 25 mg/liter, was selected as an endpoint for evaluating the clinical utility. The safety evaluation was focused on nephrotoxicity. The clinical application of the model-based patient-tailored dose of vancomycin has been demonstrated in 190 neonates. The mean (standard deviation) gestational and postnatal ages of the study population were 31.1 (4.9) weeks and 16.7 (21.7) days, respectively. The target attainment rate increased from 41% to 72% without any case of vancomycin-related nephrotoxicity. This proof-of-concept study provides evidence for integrating model-based antimicrobial therapy in neonatal routine care. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
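
The endpoint used in this study, the target attainment rate, is straightforward to compute; a minimal sketch with made-up concentrations:

```python
def target_attainment(concs, lo=15.0, hi=25.0):
    """Fraction of first-TDM vancomycin serum concentrations (mg/liter)
    falling inside the 15-25 mg/liter target window."""
    hits = sum(lo <= c <= hi for c in concs)
    return hits / len(concs)

# Five hypothetical first-TDM concentrations; three fall in the window.
target_attainment([12.0, 18.5, 24.0, 27.1, 16.2])  # 0.6
```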

  4. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Full Text Available Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  5. The Nuremberg Code subverts human health and safety by requiring animal modeling.

    Science.gov (United States)

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-07-08

    The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  6. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.

  7. 4M Overturned Pyramid (MOP) Model Utilization: Case Studies on Collision in Indonesian and Japanese Maritime Traffic Systems (MTS)

    Directory of Open Access Journals (Sweden)

    Wanginingastuti Mutmainnah

    2016-07-01

    Full Text Available The 4M Overturned Pyramid (MOP) model is a new model, proposed by the authors, to characterize maritime traffic systems (MTS); it adopts an epidemiological model that determines the causes of accidents, including not only active failures but also latent failures and barriers. This model is still being developed. One utilization of the MOP model is characterizing accidents in MTS, i.e., collisions in Indonesia and Japan, as described in this paper. The aim of this paper is to show the characteristics of ship collision accidents that occur in both Indonesian and Japanese maritime traffic systems. There were 22 collision cases in 2008-2012 (8 cases in Indonesia and 14 cases in Japan). The characteristics presented in this paper show failure events at every stage of the three accident development stages: the beginning of the accident, the accident itself, and the evacuation process.

  8. Twenty-four hour metabolic rate measurements utilized as a reference to evaluate several prediction equations for calculating energy requirements in healthy infants

    Directory of Open Access Journals (Sweden)

    Rising Russell

    2011-02-01

    Full Text Available Abstract Background To date, only short-duration metabolic rate measurements of less than four hours have been used to evaluate prediction equations for calculating energy requirements in healthy infants. Therefore, the objective of this analysis was to utilize direct 24-hour metabolic rate measurements from a prior study to evaluate the accuracy of several currently used prediction equations for calculating energy expenditure (EE) in healthy infants. Methods Data from 24-hour EE, resting (RMR) and sleeping (SMR) metabolic rates obtained from 10 healthy infants served as a reference to evaluate 11 length-weight (LWT) and weight (WT) based prediction equations. Six prediction equations had previously been derived from 50 short-term EE measurements in the Enhanced Metabolic Testing Activity Chamber (EMTAC) for assessing 24-hour EE (EMTACEE-LWT and EMTACEE-WT), RMR (EMTACRMR-LWT and EMTACRMR-WT) and SMR (EMTACSMR-LWT and EMTACSMR-WT). The five additional prediction equations for calculating RMR consisted of the World Health Organization (WHO), the Schofield (SCH-LWT and SCH-WT) and the Oxford (OXFORD-LWT and OXFORD-WT) equations. Paired t-tests and the Bland & Altman limit analysis were both applied to evaluate the performance of each equation in comparison to the reference data. Results 24-hour EE, RMR and SMR calculated with the EMTACEE-WT, EMTACRMR-WT and both the EMTACSMR-LWT and EMTACSMR-WT prediction equations were similar (p = NS) to those obtained from the reference measurements. However, RMR calculated using the WHO, SCH-LWT, SCH-WT, OXFORD-LWT and OXFORD-WT prediction equations was not comparable to the direct 24-hour metabolic measurements (p … Conclusions Weight-based prediction equations, derived from short-duration EE measurements in the EMTAC, were accurate for calculating EE, RMR and SMR in healthy infants.
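
As an illustration of the weight-based equations compared above, the WHO and Schofield weight-only RMR equations for boys aged 0-3 years are commonly tabulated as below; treat the coefficients as assumptions from secondary tabulations, to be verified against the original sources before any use.

```python
def rmr_who_boys_0_3(weight_kg):
    """WHO (1985) weight-based RMR, boys 0-3 y, kcal/day (coefficients as
    commonly tabulated; verify against the original source)."""
    return 60.9 * weight_kg - 54.0

def rmr_schofield_w_boys_0_3(weight_kg):
    """Schofield weight-only RMR, boys 0-3 y, kcal/day (same caveat)."""
    return 59.512 * weight_kg - 30.4

# A hypothetical 6 kg infant:
rmr_who_boys_0_3(6.0)          # ~311 kcal/day
rmr_schofield_w_boys_0_3(6.0)  # ~327 kcal/day
```

The study's point is that such short-derivation equations should be checked against direct 24-hour measurements rather than other short-duration data.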

  9. Non-monotonic modelling from initial requirements: a proposal and comparison with monotonic modelling methods

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wupper, H.; Wieringa, Roelf J.

    2008-01-01

    Researchers make a significant effort to develop new modelling languages and tools. However, they spend less effort developing methods for constructing models using these languages and tools. We are developing a method for building an embedded system model for formal verification. Our method

  10. Parity-specific and two-sex utility models of reproductive intentions.

    Science.gov (United States)

    Fried, E S; Hofferth, S L; Udry, J R

    1980-02-01

    This paper uses married couples' anticipated consequences of having a (another) child to predict their reproductive intentions. Parity-specific models identify different variables as predictors of reproductive behavior at different parities but do not yield interpretable patterns of difference by parity. Parity-specific models are not significantly stronger predictors of reproductive behavior. Generally, wife-only models are distinctly superior to husband-only models. Two-sex models are usually better predictors than one-sex models but not enough better to justify the additional cost.

  11. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as “requirements engineering”. These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML, and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method in order to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called “Electrical Multiple Units” (EMU. Important requirements were abstracted from a gear system

  12. Perceived Life Changes in Adults with Acquired Immunodeficiency Syndrome and Kaposi’s Sarcoma Utilizing a Behavioral Systems Model.

    Science.gov (United States)

    1987-01-01

    instrument and added by the subject. Genitourinary will be the eliminative processes and excretory products of the urinary and genital systems. … Perceived Life Changes in Adults with Acquired Immunodeficiency Syndrome and Kaposi's Sarcoma Utilizing a Behavioral Systems Model, Deborah Ann Schobel. … ABSTRACT: The lack of documentation regarding the true impact of an AIDS diagnosis leaves care providers

  13. Requirements for psychological models to support design: Towards ecological task analysis

    Science.gov (United States)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions are proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they can usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to be able to predict the cognitive activity that will be required given various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in general terms and illustrated with an example from previous research on modeling skilled human-environment interaction.

  14. The experimental autoimmune encephalomyelitis (EAE) model of MS: utility for understanding disease pathophysiology and treatment

    Science.gov (United States)

    ROBINSON, ANDREW P.; HARP, CHRISTOPHER T.; NORONHA, AVERTANO; MILLER, STEPHEN D.

    2014-01-01

    While no single model can exactly recapitulate all aspects of multiple sclerosis (MS), animal models are essential in understanding the induction and pathogenesis of the disease and to develop therapeutic strategies that limit disease progression and eventually lead to effective treatments for the human disease. Several different models of MS exist, but by far the best understood and most commonly used is the rodent model of experimental autoimmune encephalomyelitis (EAE). This model is typically induced by either active immunization with myelin-derived proteins or peptides in adjuvant or by passive transfer of activated myelin-specific CD4+ T lymphocytes. Mouse models are most frequently used because of the inbred genotype of laboratory mice, their rapid breeding capacity, the ease of genetic manipulation, and availability of transgenic and knockout mice to facilitate mechanistic studies. Although not all therapeutic strategies for MS have been developed in EAE, all of the current US Food and Drug Administration (FDA)-approved immunomodulatory drugs are effective to some degree in treating EAE, a strong indicator that EAE is an extremely useful model to study potential treatments for MS. Several therapies, such as glatiramer acetate (GA: Copaxone), and natalizumab (Tysabri), were tested first in the mouse model of EAE and then went on to clinical trials. Here we discuss the usefulness of the EAE model in understanding basic disease pathophysiology and developing treatments for MS as well as the potential drawbacks of this model. PMID:24507518

  15. Exploring the Utility of Logistic Mixed Modeling Approaches to Simultaneously Investigate Item and Testlet DIF on Testlet-based Data.

    Science.gov (United States)

    Fukuhara, Hirotaka; Paek, Insu

    2016-01-01

    This study explored the utility of logistic mixed models for the analysis of differential item functioning when item response data were testlet-based. Decomposition of differential item functioning (DIF) into item level and testlet level for the testlet-based data was introduced to separate possible sources of DIF: (1) an item, (2) a testlet, and (3) both the item and the testlet. A simulation study was conducted to investigate the performance of several logistic mixed models, as well as the Mantel-Haenszel method, under conditions in which item-related DIF and testlet-related DIF were present simultaneously. The results revealed that a new DIF model based on a logistic mixed model with random item effects and item covariates could capture the item-related DIF and testlet-related DIF well under certain conditions.
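
    The Mantel-Haenszel method used as a baseline in this record reduces, per item, to a common odds ratio pooled across matched ability strata. A minimal sketch with hypothetical counts (not the study's simulated data):

    ```python
    import math

    def mantel_haenszel_or(strata):
        """Mantel-Haenszel common odds ratio across score strata.
        Each stratum: (ref_correct, ref_wrong, focal_correct, focal_wrong)."""
        num = den = 0.0
        for a, b, c, d in strata:   # a, b: reference group; c, d: focal group
            n = a + b + c + d
            num += a * d / n
            den += b * c / n
        return num / den

    # Hypothetical counts per matched ability stratum, for illustration only
    strata = [(30, 10, 20, 20), (40, 5, 30, 15), (25, 15, 15, 25)]
    alpha = mantel_haenszel_or(strata)
    delta = -2.35 * math.log(alpha)   # ETS delta scale; |delta| >= 1.5 flags large DIF
    print(f"MH odds ratio={alpha:.3f}, delta={delta:.2f}")
    ```

    An odds ratio above 1 (negative delta) indicates the item favors the reference group after conditioning on ability; the logistic mixed models in the study additionally decompose this effect into item-level and testlet-level components.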

  16. Numerical Study on Guided Wave Propagation in Wood Utility Poles: Finite Element Modelling and Parametric Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-10-01

    Full Text Available Recently, guided wave (GW-based non-destructive evaluation (NDE techniques have been developed and considered as a potential candidate for integrity assessment of wood structures, such as wood utility poles. However, due to the lack of understanding of wave propagation in such structures, especially under the effect of surroundings such as soil, current GW-based NDE methods fail to properly account for the propagation of GWs and therefore cannot deliver reliable and correct results. To address this critical issue, this work investigates the behaviour of wave propagation in a wood utility pole with consideration of the influence of soil. The commercial finite element (FE analysis software ANSYS is used to simulate GW propagation in a wood utility pole. Laboratory testing is conducted in parallel, and the numerical results are compared against the experiments to verify the effectiveness of the developed FE models. Finally, a sensitivity analysis is carried out based on FE models of the wood pole under different material properties, boundary conditions and excitation types.

  17. Do Stochastic Traffic Assignment Models Consider Differences in Road Users Utility Functions?

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker

    1996-01-01

    The early stochastic traffic assignment models (e.g. Dial, 1971) build on the assumption that different routes are independent (the logit-model concept). Thus, the models gave severe problems in networks with overlapping routes. Daganzo & Sheffi (1977) suggested using probit-based models to overcome this problem. Sheffi & Powell (1981) presented a practically operational solution algorithm in which the travel resistance for each road segment is adjusted according to a Monte Carlo simulation following the Normal distribution. By this, the road users' 'perceived travel resistances' are simulated. A similar concept is used as a part of the Stochastic User Equilibrium model (SUE) suggested by Daganzo & Sheffi (1977) and operationalized by Sheffi & Powell (1982). In the paper it is discussed whether this way of modelling the 'perceived travel resistance' is sufficient to describe the road users…
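
    The Monte Carlo simulation of 'perceived travel resistances' described above can be sketched on a toy two-route network (costs hypothetical, not from the paper): each link cost is drawn from a Normal distribution around its measured value, each draw is assigned to the cheapest route, and route shares are averaged over many repetitions. Because each link is sampled once per draw, overlapping routes would be correlated naturally, which is the probit model's advantage over the logit concept.

    ```python
    import random

    # Toy network: (mean cost, cost SD) per link; two routes from A to D
    links = {"AB": (10, 2), "BD": (10, 2), "AC": (12, 2), "CD": (9, 2)}
    routes = {"via B": ["AB", "BD"], "via C": ["AC", "CD"]}

    def simulate_shares(draws=20000, seed=1):
        rng = random.Random(seed)
        counts = {r: 0 for r in routes}
        for _ in range(draws):
            # One 'perceived' draw per link, shared by all routes using it
            perceived = {lk: rng.gauss(mu, sd) for lk, (mu, sd) in links.items()}
            cost = {r: sum(perceived[lk] for lk in path)
                    for r, path in routes.items()}
            counts[min(cost, key=cost.get)] += 1
        return {r: c / draws for r, c in counts.items()}

    shares = simulate_shares()
    print(shares)  # the cheaper route (mean 20 vs 21) attracts the larger share
    ```

    With these means and SDs the expected share of the cheaper route is about 0.60; a full SUE algorithm would additionally feed the resulting flows back into congested link costs and iterate.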

  18. Variable speed hydrodynamic model of an AUV utilizing cross tunnel thrusters

    Science.gov (United States)

    2017-09-01

    controllability at low speeds to safely and effectively dock the vehicle in a subsea environment. Kongsberg, REMUS's production company, has released a REMUS… actual REMUS controllers' design and parameters being proprietary information. However, even with a slightly different controller implemented in the model… performed on the model until the behavior of the modeled REMUS closely approximated the characteristic dive and thrust behaviors of the actual vehicle

  19. A Stochastic Traffic Assignment Model Considering Differences in Passengers Utility Functions

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker

    1997-01-01

    The paper presents a framework for public transport assignment that builds on the probit-based model of Sheffi & Powell (1981, 1982). In this way, the problems with overlapping routes that occur in many public transport models can be avoided. In the paper, the probit-based model in its pure form… This is both due to the probit model's ability to describe overlapping routes and to the many different weights and distributions that make it possible to calibrate the model. In practice, the many parameters might also be the method's main weakness, since this complicates the calibration.

  20. Army Business Transformation: The Utility of Using Corporate Business Models within the Institutional Army

    National Research Council Canada - National Science Library

    Bailer, Jr., John J

    2007-01-01

    … Through a survey of the literature on published corporate business plans and models, military reports, Army depot case studies, and comparative analysis of emerging computer software technology…

  1. Implications of Model Structure and Detail for Utility Planning. Scenario Case Studies using the Resource Planning Model

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Barrows, Clayton [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Eurek, Kelly [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-04-23

    We examine how model investment decisions change under different model configurations and assumptions related to renewable capacity credit, the inclusion or exclusion of operating reserves, dispatch period sampling, transmission power flow modeling, renewable spur line costs, and the ability of a planning region to import and export power. For all modeled scenarios, we find that under market conditions where new renewable deployment is predominantly driven by renewable portfolio standards, model representations of wind and solar capacity credit and interactions between balancing areas are most influential in avoiding model investments in excess thermal capacity. We also compare computation time between configurations to evaluate tradeoffs between computational burden and model accuracy. From this analysis, we find that certain advanced dispatch representations (e.g., DC optimal power flow) can have dramatic adverse effects on computation time but can be largely inconsequential to model investment outcomes, at least at the renewable penetration levels modeled. Finally, we find that certain underappreciated aspects of new capacity investment decisions and model representations thereof, such as spur lines for new renewable capacity, can influence model outcomes particularly in the renewable technology and location chosen by the model. Though this analysis is not comprehensive and results are specific to the model region, input assumptions, and optimization-modeling framework employed, the findings are intended to provide a guide for model improvement opportunities.

  2. Utilizing Mars Global Reference Atmospheric Model (Mars-GRAM 2005) to Evaluate Entry Probe Mission Sites

    Science.gov (United States)

    Justh, Hilary L.; Justus, Carl G.

    2008-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM 2005) is an engineering-level atmospheric model widely used for diverse mission applications. An overview is presented of Mars-GRAM 2005 and its new features. The "auxiliary profile" option is one new feature of Mars-GRAM 2005. This option uses an input file of temperature and density versus altitude to replace the mean atmospheric values from Mars-GRAM's conventional (General Circulation Model) climatology. Any source of data or alternate model output can be used to generate an auxiliary profile. Auxiliary profiles for this study were produced from mesoscale model output (Southwest Research Institute's Mars Regional Atmospheric Modeling System (MRAMS) model and Oregon State University's Mars mesoscale model (MMM5) model) and a global Thermal Emission Spectrometer (TES) database. The global TES database has been specifically generated for purposes of making Mars-GRAM auxiliary profiles. This database contains averages and standard deviations of temperature, density, and thermal wind components, averaged over 5-by-5 degree latitude-longitude bins and 15 degree Ls bins, for each of three Mars years of TES nadir data. The Mars Science Laboratory (MSL) sites are used as an example of how Mars-GRAM could be a valuable tool for planning of future Mars entry probe missions. Results are presented using auxiliary profiles produced from the mesoscale model output and TES observed data for candidate MSL landing sites. Input parameters rpscale (for density perturbations) and rwscale (for wind perturbations) can be used to "recalibrate" Mars-GRAM perturbation magnitudes to better replicate observed or mesoscale model variability.
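
    At its core, the "auxiliary profile" mechanism is table interpolation: a file of temperature and density versus altitude replaces the climatology means. A minimal sketch, using made-up values rather than TES or mesoscale data, with log-linear interpolation for density (which falls roughly exponentially with altitude):

    ```python
    import numpy as np

    # Illustrative auxiliary profile: altitude (km), temperature (K), density (kg/m^3)
    alt_km = np.array([0.0, 10.0, 20.0, 40.0, 60.0])
    temp_k = np.array([210.0, 205.0, 195.0, 170.0, 150.0])
    dens   = np.array([1.6e-2, 5.0e-3, 1.5e-3, 1.2e-4, 8.0e-6])

    def profile(z_km):
        """Linear interpolation for temperature; log-linear for density."""
        t = np.interp(z_km, alt_km, temp_k)
        rho = np.exp(np.interp(z_km, alt_km, np.log(dens)))
        return t, rho

    t, rho = profile(15.0)
    print(f"T={t:.1f} K, rho={rho:.2e} kg/m^3")
    ```

    Mars-GRAM itself would then layer its rpscale/rwscale perturbations on top of these interpolated means; the function names and profile values here are assumptions for illustration.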

  3. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    Science.gov (United States)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  4. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  5. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  6. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  7. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  8. New efficient utility upper bounds for the fully adaptive model of attack trees

    NARCIS (Netherlands)

    Buldas, Ahto; Lenin, Aleksandr

    2013-01-01

    We present a new fully adaptive computational model for attack trees that allows attackers to repeat atomic attacks if they fail and to play on if they are caught and have to pay penalties. The new model allows safer conclusions about the security of real-life systems and is somewhat

  9. A Brand Loyalty Model Utilizing Team Identification and Customer Satisfaction in the Licensed Sports Product Industry

    Science.gov (United States)

    Lee, Soonhwan; Shin, Hongbum; Park, Jung-Jun; Kwon, Oh-Ryun

    2010-01-01

    The purpose of this study was to investigate the relationship among the attitudinal brand loyalty variables (i.e., cognitive, affective, and conative components), team identification, and customer satisfaction by developing a structural equation model, based on Oliver's (1997) attitudinal brand loyalty model. The results of this study confirmed…

  10. Analytic model utilizing the complex ABCD method for range dependency of a monostatic coherent lidar

    DEFF Research Database (Denmark)

    Olesen, Anders Sig; Pedersen, Anders Tegtmeier; Hanson, Steen Grüner

    2014-01-01

    In this work, we present an analytic model for analyzing the range and frequency dependency of a monostatic coherent lidar measuring velocities of a diffuse target. The model of the signal power spectrum includes both the contribution from the optical system as well as the contribution from the t...

  11. GIS-based suitability modeling and multi-criteria decision analysis for utility scale solar plants in four states in the Southeast U.S

    Science.gov (United States)

    Tisza, Kata

    Photovoltaic (PV) development shows significantly smaller growth in the Southeast U.S. than in the Southwest, mainly due to the low cost of fossil-fuel based energy production in the region and the lack of solar incentives. However, the Southeast has appropriate insolation conditions (4.0-6.0 kWh/m2/day) for photovoltaic deployment, and in the past decade the region has experienced the highest population growth in the entire country. These factors, combined with new renewable energy portfolio policies, could create an opportunity for PV to provide some of the energy that will be required to sustain this growth. The goal of the study was to investigate the potential for PV generation in the Southeast region by identifying suitable areas for utility-scale solar power plant deployment. Four states with currently low solar penetration were studied: Georgia, North Carolina, South Carolina and Tennessee. Feasible areas were assessed with Geographic Information Systems (GIS) software using solar, land use and population growth criteria combined with proximity to transmission lines and roads. After the GIS-based assessment of the areas, technological potential was calculated for each state. A multi-criteria decision analysis (MCDA) model was used to simulate the decision making method for a strategic PV installation. The model accounted for all criteria necessary to consider in the case of a PV development and also included economic and policy criteria, which are thought to be a strong influence on the PV market. Three different scenarios were established, representing decision makers' theoretical preferences. Map layers created in the first part were used as the basis for the MCDA, and additional technical, economic and political/market criteria were added. A sensitivity analysis was conducted to test the model's robustness. Finally, weighted criteria were assigned to the GIS map layers, so that the different preference systems could be visualized.
As a result, lands suitable for
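
    The weighted-overlay step of such a GIS-based MCDA can be sketched in a few lines. The layers, weights, and constraint mask below are hypothetical stand-ins for the study's raster criteria (insolation, transmission proximity, road proximity, excluded land), with each layer already normalized to [0, 1]:

    ```python
    import numpy as np

    # Hypothetical 2x2 raster layers, each normalized to [0, 1]
    solar     = np.array([[0.9, 0.7], [0.5, 0.8]])  # insolation score
    grid_prox = np.array([[0.6, 0.9], [0.4, 0.7]])  # proximity to transmission
    road_prox = np.array([[0.8, 0.6], [0.9, 0.5]])  # proximity to roads
    excluded  = np.array([[False, False], [True, False]])  # protected land, etc.

    # One scenario's preference weights (sum to 1); other scenarios reweight
    weights = {"solar": 0.5, "grid": 0.3, "road": 0.2}

    suitability = (weights["solar"] * solar
                   + weights["grid"] * grid_prox
                   + weights["road"] * road_prox)
    suitability[excluded] = 0.0   # hard constraints override weighted scores
    print(suitability)
    ```

    Sensitivity analysis then amounts to perturbing the weight vector and checking whether the ranking of candidate cells is robust.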

  12. Universal Quantum Computing:. Third Gen Prototyping Utilizing Relativistic `Trivector' R-Qubit Modeling Surmounting Uncertainty

    Science.gov (United States)

    Amoroso, Richard L.; Kauffman, Louis H.; Giandinoto, Salvatore

    2013-09-01

    We postulate bulk universal quantum computing (QC) cannot be achieved without surmounting the quantum uncertainty principle, an inherent barrier by empirical definition in the regime described by the Copenhagen interpretation of quantum theory - the last remaining hurdle to bulk QC. To surmount uncertainty with probability 1, we redefine the basis for the qubit utilizing a unique form of M-Theoretic Calabi-Yau mirror symmetry cast in an LSXD Dirac covariant polarized vacuum with an inherent `Feynman synchronization backbone'. This also incorporates a relativistic qubit (r-qubit) providing additional degrees of freedom beyond the traditional Bloch 2-sphere qubit, bringing the r-qubit into correspondence with our version of Relativistic Topological Quantum Field Theory (RTQFT). We present a 3rd generation prototype design for simplifying bulk QC implementation.

  13. Renewable Resources: a national catalog of model projects. Volume 4. Western Solar Utilization Network Region

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    This compilation of diverse conservation and renewable energy projects across the United States was prepared through the enthusiastic participation of solar and alternate energy groups from every state and region. Compiled and edited by the Center for Renewable Resources, these projects reflect many levels of innovation and technical expertise. In many cases, a critical analysis is presented of how projects performed and of the institutional conditions associated with their success or failure. Some 2000 projects are included in this compilation; most have worked, some have not. Information about all is presented to aid learning from these experiences. The four volumes in this set are arranged in state sections by geographic region, coinciding with the four Regional Solar Energy Centers. The table of contents is organized by project category so that maximum cross-referencing may be obtained. This volume includes information on the Western Solar Utilization Network Region. (WHK)

  14. Modeling Multi-Reservoir Hydropower Systems in the Sierra Nevada with Environmental Requirements and Climate Warming

    Science.gov (United States)

    Rheinheimer, David Emmanuel

    Hydropower systems and other river regulation often harm instream ecosystems, partly by altering the natural flow and temperature regimes that ecosystems have historically depended on. These effects are compounded at regional scales. As hydropower and ecosystems are increasingly valued globally, due to growing values for clean energy and native species as well as new threats from climate warming, it is important to understand how climate warming might affect these systems, to identify tradeoffs between different water uses for different climate conditions, and to identify promising water management solutions. This research uses traditional simulation and optimization to explore these issues in California's upper west slope Sierra Nevada mountains. The Sierra Nevada provides most of the water for California's vast water supply system, supporting high-elevation hydropower generation, ecosystems, recreation, and some local municipal and agricultural water supply along the way. However, regional climate warming is expected to reduce snowmelt and shift runoff to earlier in the year, affecting all water uses. This dissertation begins by reviewing important literature related to the broader motivations of this study, including river regulation, freshwater conservation, and climate change. It then describes three substantial studies. First, a weekly time step water resources management model spanning the Feather River watershed in the north to the Kern River watershed in the south is developed. The model, which uses the Water Evaluation And Planning System (WEAP), includes reservoirs, run-of-river hydropower, variable head hydropower, water supply demand, and instream flow requirements. The model is applied with a runoff dataset that considers regional air temperature increases of 0, 2, 4 and 6 °C to represent historical, near-term, mid-term and far-term (end-of-century) warming. Most major hydropower turbine flows are simulated well. Reservoir storage is also

  15. Utility usage forecasting

    Science.gov (United States)

    Hosking, Jonathan R. M.; Natarajan, Ramesh

    2017-08-22

    The computer creates a utility demand forecast model for weather parameters by receiving a plurality of utility parameter values, wherein each received utility parameter value corresponds to a weather parameter value; determining that a range of weather parameter values lacks a sufficient amount of corresponding received utility parameter values; determining one or more utility parameter values that correspond to that range of weather parameter values; and creating a model which correlates the received and the determined utility parameter values with the corresponding weather parameter values.
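
    The steps of the patent abstract can be approximated with a simple curve fit: observed demand exists only for a moderate temperature range, a fitted curve supplies estimated demand for the under-sampled range, and the combined data form the forecast model. All values and the choice of a quadratic demand curve are illustrative assumptions, not the patent's method:

    ```python
    import numpy as np

    # Received utility parameter values (load, MW) per weather parameter (temp, F)
    temp_f  = np.array([40, 50, 55, 60, 65, 70, 75])
    load_mw = np.array([820, 700, 660, 640, 650, 700, 780])

    # U-shaped demand vs temperature (heating + cooling) -> quadratic fit
    model = np.poly1d(np.polyfit(temp_f, load_mw, deg=2))

    # A weather range lacking sufficient received values: determine values for it
    sparse_range = np.array([85, 90, 95])
    estimated = model(sparse_range)   # 'determined' utility parameter values
    print(np.round(estimated, 0))
    ```

    The final forecast model would then be rebuilt from the received values plus these determined values, so it covers the full weather range.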

  16. Mars Colony in situ resource utilization: An integrated architecture and economics model

    Science.gov (United States)

    Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff

    2017-09-01

    This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of 'downstream' analytic models. In particular, we integrated an extraction process (i.e., 'mining') model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address 'what-if' questions, including colony location.
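
    The Monte Carlo profitability idea behind such a risk-based economics model can be sketched as follows. Every figure here (capex, price range, output distribution, operating cost, discount rate) is an invented placeholder, not a parameter from the paper's 50-plus inputs:

    ```python
    import random

    def npv(cash_flows, rate):
        """Net present value of a list of yearly cash flows (year 0 first)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    def prob_profitable(trials=10000, seed=7):
        """Fraction of Monte Carlo trials in which the venture breaks even."""
        rng = random.Random(seed)
        wins = 0
        for _ in range(trials):
            capex = -500.0                                  # $M, year 0 (assumed)
            price = rng.uniform(2.0, 6.0)                   # $M per tonne (assumed)
            output = rng.gauss(30.0, 5.0)                   # tonnes/year (assumed)
            flows = [capex] + [price * output - 40.0] * 10  # 10 operating years
            if npv(flows, 0.10) > 0:
                wins += 1
        return wins / trials

    p = prob_profitable()
    print(f"P(NPV > 0) = {p:.2f}")
    ```

    Sweeping the uncertain inputs instead of fixing their distributions turns the same loop into the 'what-if' analysis the paper describes.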

  17. Animal models and therapeutic molecular targets of cancer: utility and limitations.

    Science.gov (United States)

    Cekanova, Maria; Rathore, Kusum

    2014-01-01

    Cancer is the term used to describe over 100 diseases that share several common hallmarks. Despite prevention, early detection, and novel therapies, cancer is still the second leading cause of death in the USA. Successful bench-to-bedside translation of basic scientific findings about cancer into therapeutic interventions for patients depends on the selection of appropriate animal experimental models. Cancer research uses animal and human cancer cell lines in vitro to study biochemical pathways in these cancer cells. In this review, we summarize the important animal models of cancer with focus on their advantages and limitations. Mouse cancer models are well known, and are frequently used for cancer research. Rodent models have revolutionized our ability to study gene and protein functions in vivo and to better understand their molecular pathways and mechanisms. Xenograft and chemically or genetically induced mouse cancers are the most commonly used rodent cancer models. Companion animals with spontaneous neoplasms are still an underexploited tool for making rapid advances in human and veterinary cancer therapies by testing new drugs and delivery systems that have shown promise in vitro and in vivo in mouse models. Companion animals have a relatively high incidence of cancers, with biological behavior, response to therapy, and response to cytotoxic agents similar to those in humans. Shorter overall lifespan and more rapid disease progression are factors contributing to the advantages of a companion animal model. In addition, the current focus is on discovering molecular targets for new therapeutic drugs to improve survival and quality of life in cancer patients.

  18. Animal models and therapeutic molecular targets of cancer: utility and limitations

    Directory of Open Access Journals (Sweden)

    Cekanova M

    2014-10-01

    Full Text Available Maria Cekanova, Kusum Rathore Department of Small Animal Clinical Sciences, College of Veterinary Medicine, The University of Tennessee, Knoxville, TN, USA Abstract: Cancer is the term used to describe over 100 diseases that share several common hallmarks. Despite prevention, early detection, and novel therapies, cancer is still the second leading cause of death in the USA. Successful bench-to-bedside translation of basic scientific findings about cancer into therapeutic interventions for patients depends on the selection of appropriate animal experimental models. Cancer research uses animal and human cancer cell lines in vitro to study biochemical pathways in these cancer cells. In this review, we summarize the important animal models of cancer with focus on their advantages and limitations. Mouse cancer models are well known, and are frequently used for cancer research. Rodent models have revolutionized our ability to study gene and protein functions in vivo and to better understand their molecular pathways and mechanisms. Xenograft and chemically or genetically induced mouse cancers are the most commonly used rodent cancer models. Companion animals with spontaneous neoplasms are still an underexploited tool for making rapid advances in human and veterinary cancer therapies by testing new drugs and delivery systems that have shown promise in vitro and in vivo in mouse models. Companion animals have a relatively high incidence of cancers, with biological behavior, response to therapy, and response to cytotoxic agents similar to those in humans. Shorter overall lifespan and more rapid disease progression are factors contributing to the advantages of a companion animal model. In addition, the current focus is on discovering molecular targets for new therapeutic drugs to improve survival and quality of life in cancer patients. Keywords: mouse cancer model, companion animal cancer model, dogs, cats, molecular targets

  19. Utilizing past and present mouse systems to engineer more relevant pancreatic cancer models.

    Science.gov (United States)

    DeCant, Brian T; Principe, Daniel R; Guerra, Carmen; Pasca di Magliano, Marina; Grippo, Paul J

    2014-01-01

    The study of pancreatic cancer has prompted the development of numerous mouse models that aim to recapitulate the phenotypic and mechanistic features of this deadly malignancy. This review accomplishes two tasks. First, it provides an overview of the models that have been used as representations of both the neoplastic and carcinoma phenotypes. Second, it presents new modeling schemes that ultimately will serve to more faithfully capture the temporal and spatial progression of the human disease, providing platforms for improved understanding of the role of non-epithelial compartments in disease etiology as well as evaluating therapeutic approaches.

  20. Utilizing Past and Present Mouse Systems to Engineer More Relevant Pancreatic Cancer Models

    Directory of Open Access Journals (Sweden)

    Brian T DeCant

    2014-12-01

    Full Text Available The study of pancreatic cancer has prompted the development of numerous mouse models that aim to recapitulate the phenotypic and mechanistic features of this deadly malignancy. This review accomplishes two tasks. First, it provides an overview of the models that have been used as representations of both the neoplastic and carcinoma phenotypes. Second, it presents new modeling schemes that ultimately will serve to more faithfully capture the temporal and spatial progression of the human disease, providing platforms for improved understanding of the role of non-epithelial compartments in disease etiology as well as evaluating therapeutic approaches.

  1. Not just a theory--the utility of mathematical models in evolutionary biology.

    Science.gov (United States)

    Servedio, Maria R; Brandvain, Yaniv; Dhole, Sumit; Fitzpatrick, Courtney L; Goldberg, Emma E; Stern, Caitlin A; Van Cleve, Jeremy; Yeh, D Justin

    2014-12-01

    Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

  2. Not just a theory--the utility of mathematical models in evolutionary biology.

    Directory of Open Access Journals (Sweden)

    Maria R Servedio

    2014-12-01

    Full Text Available Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

  3. Conclusions of the workshop on the ATLAS requirements on shower models

    CERN Document Server

    Bosman, M; Efthymiopoulos, I; Froidevaux, D; Gianotti, F; Kiryunin, A E; Knobloch, J; Loch, P; Osculati, B; Perini, L; Sala, P R; Seman, M

    1999-01-01

    The workshop addressed the question of shower models/packages and related issues needed for the simulation of ATLAS physics and test beam data. Part of the material discussed during the workshop is reviewed in this note. Results on the comparison between ATLAS test beam data and the Monte Carlo predictions of various shower models are briefly summarized. The requirements put forward by the various detector communities, and first attempts to quantify them, are reviewed.

  4. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  5. Risk of Cyberterrorism to Naval Ships Inport Naval Station Everett: A Model Based Project Utilizing SIAM

    National Research Council Canada - National Science Library

    Tester, Rodrick A

    2007-01-01

    .... In doing so, an influence net model was designed to discover the likelihood of a successful cyber attack. However, first it was necessary to establish what the best mitigation tools are in defense...

  6. Utility and Applicability of the Sharable Content Object Reference Model (SCORM) Within Navy Higher Education

    National Research Council Canada - National Science Library

    Kohistany, Mohammad

    2004-01-01

    This thesis critically analyzes the Sharable Content Object Reference Model (SCORM) within higher education and examines SCORM's limitations within a realistic application environment versus within a theoretical/conceptual platform...

  7. Army Business Transformation: The Utility of Using Corporate Business Models within the Institutional Army

    National Research Council Canada - National Science Library

    Bailer, Jr., John J

    2007-01-01

    .... This study finds that working corporate models, such as Lean Six Sigma (LSS), are available which are already enabling the transformation of a very specific aspect within the institutional Army...

  8. Characterizing biogenous sediments using multibeam echosounder backscatter data - Estimating power law parameter utilizing various models

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Kodagali, V.N.

    In this paper, the Helmholtz-Kirchhoff (H-K) roughness model is employed to characterize seafloor sediment and roughness parameters from the eastern sector of the Southern Ocean. The multibeam Hydrosweep system's angular-backscatter data, which...

  9. A Steam Utility Network Model for the Evaluation of Heat Integration Retrofits – A Case Study of an Oil Refinery

    Directory of Open Access Journals (Sweden)

    Sofie Marton

    2017-12-01

    Full Text Available This paper presents a real industrial example in which the steam utility network of a refinery is modelled in order to evaluate potential Heat Integration retrofits proposed for the site. A refinery, typically, has flexibility to optimize the operating strategy for the steam system depending on the operation of the main processes. This paper presents a few examples of Heat Integration retrofit measures from a case study of a large oil refinery. In order to evaluate expected changes in fuel and electricity imports to the refinery after implementation of the proposed retrofits, a steam system model has been developed. The steam system model has been tested and validated with steady state data from three different operating scenarios and can be used to evaluate how changes to steam balances at different pressure levels would affect overall steam balances, generation of shaft power in turbines, and the consumption of fuel gas.
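    The core bookkeeping such a steam-system model performs can be sketched as a header balance plus a turbine shaft-power estimate. The two functions below are a minimal illustration under assumed numbers (one high-pressure header feeding one back-pressure turbine); they are not the refinery model from the case study, and the enthalpy drop and efficiency are placeholder values.

```python
# Minimal sketch of a steam-header balance and turbine power estimate.
# All numbers (flows, enthalpy drop, efficiency) are illustrative assumptions.

def header_balance(produced_t_h, consumed_t_h):
    """Net steam surplus (t/h) available at a pressure header."""
    return produced_t_h - consumed_t_h

def turbine_power_mw(steam_t_h, dh_kj_per_kg, eta=0.7):
    """Shaft power from expanding surplus steam through a turbine.
    dh_kj_per_kg: isentropic enthalpy drop; eta: isentropic efficiency."""
    kg_per_s = steam_t_h * 1000.0 / 3600.0
    return kg_per_s * dh_kj_per_kg * eta / 1000.0  # kW -> MW

hp_surplus = header_balance(produced_t_h=120.0, consumed_t_h=90.0)  # 30 t/h
power = turbine_power_mw(hp_surplus, dh_kj_per_kg=400.0)
```

    A retrofit that changes steam demand at one header would change `hp_surplus` and hence the shaft power and fuel-gas balance, which is the trade-off the paper's model evaluates.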

  10. Using Deep Learning for Targeted Data Selection, Improving Satellite Observation Utilization for Model Initialization

    Science.gov (United States)

    Lee, Y. J.; Bonfanti, C. E.; Trailovic, L.; Etherton, B.; Govett, M.; Stewart, J.

    2017-12-01

    At present, only a fraction of all satellite observations is ultimately used for model assimilation. The satellite data assimilation process is computationally expensive, and data are often reduced in resolution to allow timely incorporation into the forecast. This problem is only exacerbated by the recent launch of the Geostationary Operational Environmental Satellite (GOES)-16 and by future satellites providing several orders of magnitude more data. At the NOAA Earth System Research Laboratory (ESRL) we are researching the use of machine learning to improve the initial selection of satellite data to be used in the model assimilation process. In particular, we are investigating the use of deep learning, which is being applied to many image processing and computer vision problems with great success. In our research, we use convolutional neural networks to find and mark regions of interest (ROIs), leading to intelligent extraction of observations from satellite observation systems. These targeted observations will be used to improve the quality of data selected for model assimilation and ultimately improve the impact of satellite data on weather forecasts. Our preliminary efforts to identify ROIs are focused in two areas: applying and comparing state-of-the-art convolutional neural network models using analysis data from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) weather model, and using these results as a starting point to optimize a convolutional neural network model for pattern recognition on the higher-resolution water vapor data from GOES-West and other satellites. This presentation will provide an introduction to our convolutional neural network model for identifying and processing these ROIs, along with the challenges of data preparation, model training, and parameter optimization.
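    The basic operation behind such ROI detection can be illustrated with a toy example: convolve a 2-D field with a small averaging kernel (the elementary operation inside a convolutional layer) and flag cells whose smoothed response exceeds a threshold. This is a hand-rolled sketch, not the NOAA/ESRL model; the field values and threshold are made up.

```python
# Toy ROI marking: 3x3 mean filter ("valid" convolution) plus a threshold.
# Illustrative only -- a trained CNN learns its kernels instead of using a
# fixed averaging kernel.

def smooth3x3(field):
    """Valid-mode 3x3 mean filter over a 2-D list of lists."""
    h, w = len(field), len(field[0])
    out = []
    for i in range(1, h - 1):
        row = []
        for j in range(1, w - 1):
            s = sum(field[i + di][j + dj]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1))
            row.append(s / 9.0)
        out.append(row)
    return out

def mark_roi(field, threshold):
    """Return (row, col) indices of smoothed cells above threshold."""
    sm = smooth3x3(field)
    return [(i, j) for i, r in enumerate(sm)
            for j, v in enumerate(r) if v > threshold]

# a 4x4 "water vapor" field with one bright patch
field = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 0]]
rois = mark_roi(field, threshold=3.0)
```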

  11. Sensitivity Analysis of Corrosion Rate Prediction Models Utilized for Reinforced Concrete Affected by Chloride

    Science.gov (United States)

    Siamphukdee, Kanjana; Collins, Frank; Zou, Roger

    2013-06-01

    Chloride-induced reinforcement corrosion is one of the major causes of premature deterioration in reinforced concrete (RC) structures. Given the high maintenance and replacement costs, accurate modeling of RC deterioration is indispensable for ensuring the optimal allocation of limited economic resources. Since corrosion rate is one of the major factors influencing the rate of deterioration, many predictive models exist. However, because the existing models use very different sets of input parameters, the choice of model for RC deterioration is made difficult. Although the factors affecting corrosion rate are frequently reported in the literature, there is no published quantitative study on the sensitivity of predicted corrosion rate to the various input parameters. This paper presents the results of the sensitivity analysis of the input parameters for nine selected corrosion rate prediction models. Three different methods of analysis are used to determine and compare the sensitivity of corrosion rate to various input parameters: (i) univariate regression analysis, (ii) multivariate regression analysis, and (iii) sensitivity index. The results from the analysis have quantitatively verified that the corrosion rate of steel reinforcement bars in RC structures is highly sensitive to corrosion duration time, concrete resistivity, and concrete chloride content. These important findings establish that future empirical models for predicting corrosion rate of RC should carefully consider and incorporate these input parameters.
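    The third method the authors list, a sensitivity index, is commonly computed as SI = (y_max - y_min) / y_max while one input is varied over its range and the others are held at base values. The sketch below applies that form to a made-up stand-in corrosion-rate model; it is not one of the nine models reviewed in the paper, and all parameter names and ranges are illustrative assumptions.

```python
# One-at-a-time sensitivity index, SI = (y_max - y_min) / y_max.
# The corrosion-rate model here is a hypothetical placeholder.

def corrosion_rate(resistivity_kohm_cm, chloride_pct, years):
    """Hypothetical corrosion-rate model (arbitrary units)."""
    return 10.0 * chloride_pct / (resistivity_kohm_cm * years ** 0.5)

def sensitivity_index(model, base, param, lo, hi):
    """Vary one parameter over [lo, hi] with the others at base values."""
    ys = []
    for v in (lo, hi):
        args = dict(base)
        args[param] = v
        ys.append(model(**args))
    y_max, y_min = max(ys), min(ys)
    return (y_max - y_min) / y_max

base = {"resistivity_kohm_cm": 20.0, "chloride_pct": 0.5, "years": 10.0}
si_chloride = sensitivity_index(corrosion_rate, base, "chloride_pct", 0.1, 1.0)
si_res = sensitivity_index(corrosion_rate, base, "resistivity_kohm_cm", 10.0, 50.0)
```

    Ranking the SI values of each input across the nine models is what lets the paper single out duration, resistivity, and chloride content as the dominant parameters.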

  12. Studies and comparison of currently utilized models for ablation in Electrothermal-chemical guns

    Science.gov (United States)

    Jia, Shenli; Li, Rui; Li, Xingwen

    2009-10-01

    Wall ablation is a key process taking place in the capillary plasma generator in Electrothermal-Chemical (ETC) guns, whose characteristic directly decides the generator's performance. In the present article, this ablation process is theoretically studied. Currently widely used mathematical models designed to describe such process are analyzed and compared, including a recently developed kinetic model which takes into account the unsteady state in plasma-wall transition region by dividing it into two sub-layers, a Knudsen layer and a collision dominated non-equilibrium Hydrodynamic layer, a model based on Langmuir Law, as well as a simplified model widely used in arc-wall interaction process in circuit breakers, which assumes a proportional factor and an ablation enthalpy obtained empirically. Bulk plasma state and parameters are assumed to be consistent while analyzing and comparing each model, in order to take into consideration only the difference caused by model itself. Finally ablation rate is calculated in each method respectively and differences are discussed.

  13. System-Level Modeling of an ICE-powered Vehicle with Thermoelectric Waste-Heat-Utilization

    OpenAIRE

    Braig, Thomas; Ungethüm, Jörg

    2009-01-01

    There is a huge demand for heat storage for evaporation applications. Thermal storage systems are used to increase the efficiency of thermal systems by better matching energy availability to energy demand. In this paper a possible solution for modular storage systems from 200-600 °C and pressures up to 100 bar is presented. The use of steam as a working medium requires isothermal storage to be available if charging/discharging should take place...

  14. Pareto utility

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  15. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  16. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long-range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and of natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of program and schedule constraints, a single modal test of the SLS will be performed while it is bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ, UP, and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to establish more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating from the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
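    The limit-state idea can be illustrated with a tiny Monte Carlo estimate: sample a dispersed modal frequency and count how often it falls outside a requirement band. The normal distribution, dispersion, and band limits below are illustrative assumptions, not SLS values.

```python
# Monte Carlo limit-state sketch: probability that a dispersed natural
# frequency violates a requirement band. All numbers are illustrative.
import random

def violation_probability(mean_hz, sigma_hz, lo_hz, hi_hz, n=100_000, seed=42):
    """Estimate P(frequency outside [lo_hz, hi_hz]) by sampling."""
    rng = random.Random(seed)
    bad = sum(1 for _ in range(n)
              if not lo_hz <= rng.gauss(mean_hz, sigma_hz) <= hi_hz)
    return bad / n

# a 2-sigma requirement band around a 2 Hz mode
p_fail = violation_probability(mean_hz=2.0, sigma_hz=0.1, lo_hz=1.8, hi_hz=2.2)
```

    Comparing `p_fail` against an allowed exceedance probability is the requirements-based acceptance check, in place of a rule-of-thumb frequency tolerance.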

  17. A Conceptual Model and Process for Client-driven Agile Requirements Prioritization

    NARCIS (Netherlands)

    Racheva, Z.; Daneva, Maia; Herrmann, Andrea; Wieringa, Roelf J.

    Continuous customer-centric requirements reprioritization is essential in successfully performing agile software development. Yet, in the agile RE literature, very little is known about how agile reprioritization happens in practice. Generic conceptual models about this process are missing, which in

  18. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

    In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  19. Handling non-functional requirements in model-driven development: an ongoing industrial survey

    NARCIS (Netherlands)

    Ameller, David; Franch, Xavier; Gómez, Cristina; Araújo, João; Berntsson Svensson, Richard; Biffle, Stefan; Cabot, Jordi; Cortelessa, Vittorio; Daneva, Maia; Méndez Fernández, Daniel; Moreira, Ana; Muccini, Henry; Vallecillo, Antonio; Wimmer, Manuel; Amaral, Vasco; Brunelière, Hugo; Burgueño, Loli; Goulão, Miguel; Schätz, Bernard; Teufl, Sabine

    2015-01-01

    Model-Driven Development (MDD) is no longer a novel development paradigm. It has become mature from a research perspective and recent studies show its adoption in industry. Still, some issues remain a challenge. Among them, we are interested in the treatment of non-functional requirements (NFRs) in

  20. Projected irrigation requirements for upland crops using soil moisture model under climate change in South Korea

    Science.gov (United States)

    An increase in abnormal climate change patterns and unsustainable irrigation in uplands cause drought and affect agricultural water security, crop productivity, and price fluctuations. In this study, we developed a soil moisture model to project irrigation requirements (IR) for upland crops under cl...

  1. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    Science.gov (United States)

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  2. Animal models and therapeutic molecular targets of cancer: utility and limitations

    Science.gov (United States)

    Cekanova, Maria; Rathore, Kusum

    2014-01-01

    Cancer is the term used to describe over 100 diseases that share several common hallmarks. Despite prevention, early detection, and novel therapies, cancer is still the second leading cause of death in the USA. Successful bench-to-bedside translation of basic scientific findings about cancer into therapeutic interventions for patients depends on the selection of appropriate animal experimental models. Cancer research uses animal and human cancer cell lines in vitro to study biochemical pathways in these cancer cells. In this review, we summarize the important animal models of cancer with focus on their advantages and limitations. Mouse cancer models are well known, and are frequently used for cancer research. Rodent models have revolutionized our ability to study gene and protein functions in vivo and to better understand their molecular pathways and mechanisms. Xenograft and chemically or genetically induced mouse cancers are the most commonly used rodent cancer models. Companion animals with spontaneous neoplasms are still an underexploited tool for making rapid advances in human and veterinary cancer therapies by testing new drugs and delivery systems that have shown promise in vitro and in vivo in mouse models. Companion animals have a relatively high incidence of cancers, with biological behavior, response to therapy, and response to cytotoxic agents similar to those in humans. Shorter overall lifespan and more rapid disease progression are factors contributing to the advantages of a companion animal model. In addition, the current focus is on discovering molecular targets for new therapeutic drugs to improve survival and quality of life in cancer patients. PMID:25342884

  3. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. Understanding the importance of requirements, as it relates to the satisfaction of users/customers when those requirements are met, is therefore worth examining. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of those requirements. Much work has been carried out on customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and average satisfaction coefficient (ASC), are highly correlated (r = 96 %), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product. The
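    The SI and DI coefficients discussed here are the standard Kano satisfaction coefficients computed from response counts per category (A = attractive, O = one-dimensional, M = must-be, I = indifferent). The sketch below uses those standard formulas; defining ASC as the mean of SI and |DI| is an assumption for illustration, and the counts are made up.

```python
# Standard Kano satisfaction coefficients from category counts.
# SI = (A + O) / (A + O + M + I); DI = -(O + M) / (A + O + M + I).
# ASC here is assumed to be the mean of SI and |DI| (illustrative choice).

def kano_coefficients(A, O, M, I):
    total = A + O + M + I
    si = (A + O) / total        # satisfaction index: gain if implemented
    di = -(O + M) / total       # dissatisfaction index: loss if absent
    asc = (si + abs(di)) / 2.0  # average satisfaction coefficient (assumed)
    return si, di, asc

# hypothetical survey: 100 respondents classifying one requirement
si, di, asc = kano_coefficients(A=30, O=40, M=20, I=10)
```

    Correlating such per-requirement (SI, DI, ASC) triples with self-stated importance ratings is the analysis the study performs.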

  4. A Fuzzy Utility-Based Multi-Criteria Model for Evaluating Households’ Energy Conservation Performance: A Taiwanese Case Study

    Directory of Open Access Journals (Sweden)

    Sung-Lin Hsueh

    2012-08-01

    Full Text Available Industry and the economy are developed to satisfy people's needs and material desires. In addition to making high greenhouse gas emissions the responsibility of industry, individuals and families should also be held responsible for the production of greenhouse gas emissions. In this study, we applied the Delphi method, the analytical hierarchy process, utility theory, and fuzzy logic theory to establish an energy conservation assessment model for households. We also emphasize that subsidy policy makers should consider the social responsibility of households and individuals, as well as the sustainability of energy conservation.
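    One step of such a model, deriving criteria weights from an analytical-hierarchy-process pairwise comparison matrix, can be sketched with the common geometric-mean approximation of the principal eigenvector. The 3x3 judgment matrix below (energy saving vs. cost vs. comfort) is hypothetical, not from the Taiwanese case study.

```python
# AHP criteria weights via the geometric-mean (row) approximation.
# The pairwise judgments are illustrative placeholders.

def ahp_weights(matrix):
    """Approximate the principal eigenvector by normalized row geometric means."""
    n = len(matrix)
    gms = []
    for row in matrix:
        g = 1.0
        for v in row:
            g *= v
        gms.append(g ** (1.0 / n))
    s = sum(gms)
    return [g / s for g in gms]

# hypothetical judgments: energy saving vs cost vs comfort
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 3.0],
     [1/5.0, 1/3.0, 1.0]]
w = ahp_weights(M)
```

    The resulting weights would then feed the utility and fuzzy-scoring stages of the household assessment.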

  5. Utilization of mesoscale atmospheric dynamic model PHYSIC as a meteorological forecast model in nuclear emergency response system

    International Nuclear Information System (INIS)

    Nagai, Haruyasu; Yamazawa, Hiromi

    1997-01-01

    It is advantageous for an emergency response system to have a forecast function to provide a time margin for countermeasures in case of a nuclear accident. We propose to apply an atmospheric dynamic model PHYSIC (Prognostic HYdroStatic model Including turbulence Closure model) as a meteorological forecast model in the emergency system. The model uses GPV data which are the output of the numerical weather forecast model of Japan Meteorological Agency as the initial and boundary conditions. The roles of PHYSIC are the interface between GPV data and the emergency response system and the forecast of local atmospheric phenomena within the model domain. This paper presents a scheme to use PHYSIC to forecast local wind and its performance. Horizontal grid number of PHYSIC is fixed to 50 x 50, whereas the mesh and domain sizes are determined in consideration of topography causing local winds at an objective area. The model performance was examined for the introduction of GPV data through initial and boundary conditions and the predictability of local wind field and atmospheric stability. The model performance was on an acceptable level as the forecast model. It was also recognized that improvement of cloud calculation was necessary in simulating atmospheric stability. (author)

  6. Utilizing high throughput screening data for predictive toxicology models: protocols and application to MLSCN assays

    Science.gov (United States)

    Guha, Rajarshi; Schürer, Stephan C.

    2008-06-01

    Computational toxicology is emerging as an encouraging alternative to experimental testing. The Molecular Libraries Screening Center Network (MLSCN) as part of the NIH Molecular Libraries Roadmap has recently started generating large and diverse screening datasets, which are publicly available in PubChem. In this report, we investigate various aspects of developing computational models to predict cell toxicity based on cell proliferation screening data generated in the MLSCN. By capturing feature-based information in those datasets, such predictive models would be useful in evaluating cell-based screening results in general (for example from reporter assays) and could be used as an aid to identify and eliminate potentially undesired compounds. Specifically we present the results of random forest ensemble models developed using different cell proliferation datasets and highlight protocols to take into account their extremely imbalanced nature. Depending on the nature of the datasets and the descriptors employed we were able to achieve percentage correct classification rates between 70% and 85% on the prediction set, though the accuracy rate dropped significantly when the models were applied to in vivo data. In this context we also compare the MLSCN cell proliferation results with animal acute toxicity data to investigate to what extent animal toxicity can be correlated and potentially predicted by proliferation results. Finally, we present a visualization technique that allows one to compare a new dataset to the training set of the models to decide whether the new dataset may be reliably predicted.
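    One common protocol for the "extremely imbalanced" datasets the authors mention is to randomly undersample the majority (inactive) class before training the ensemble. The sketch below shows only that balancing step under made-up data; it is an assumed illustration, not the paper's exact protocol.

```python
# Random undersampling of the majority class for an imbalanced
# screening dataset. Data and field names are illustrative.
import random

def undersample(rows, label_key="active", seed=0):
    """Balance classes by sampling the majority down to the minority size."""
    pos = [r for r in rows if r[label_key]]
    neg = [r for r in rows if not r[label_key]]
    major, minor = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    rng = random.Random(seed)
    return minor + rng.sample(major, len(minor))

# hypothetical screen: 200 compounds, 5% actives
data = [{"id": i, "active": i % 20 == 0} for i in range(200)]
balanced = undersample(data)
```

    A classifier such as a random forest trained on `balanced` no longer trivially predicts "inactive" for everything, at the cost of discarding majority-class examples.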

  7. Information support model and its impact on utility, satisfaction and loyalty of users

    Directory of Open Access Journals (Sweden)

    Sead Šadić

    2016-11-01

In today’s modern age, information systems are of vital importance for the successful performance of any organization. The most important role of any information system is its information support. This paper develops an information support model and presents the results of a survey examining the effects of such a model. The survey was performed among the employees of the Brčko District Government and comprised three phases. The first phase assesses the influence of the quality of information support and of information on information support in decision making. The second phase examines the impact of information support in decision making on the perceived usefulness of and user satisfaction with information support. The third phase examines the effects of perceived usefulness as well as information support satisfaction on user loyalty. The model is presented using six hypotheses, which were tested by means of a multivariate regression analysis. The demonstrated model shows that the quality of information support and information is of vital importance in the decision-making process. Perceived usefulness and user satisfaction are of vital importance for continuous usage of information support. The model is universal and, if slightly modified, can be used in any sphere of life where the satisfaction of clients and users of some service is measured.

  8. [Utility of a statistical model of cognitive styles in attention deficit hyperactivity disorder].

    Science.gov (United States)

    López Villalobos, José Antonio; Serrano Pintado, Isabel; Sánchez-Mateos, Juan Delgado; de Llano, Jesús María Andrés; Sánchez Azón, María Isabel; Alberola López, Susana

    2011-11-01

The purpose of this study was to determine the best statistical model of cognitive styles, based on the MFFT-20, CEFT and Stroop tests, to predict attention deficit hyperactivity disorder (ADHD), analyzing the validity of the model for the diagnosis of the disorder. We studied 100 ADHD cases (DSM-IV criteria) and 100 controls, age ranging between 7 and 11 years. Controls were randomly recruited and matched in age, gender and sociodemographic area with ADHD cases. On average, ADHD cases showed more impulsiveness (d: 1.28), less cognitive flexibility (d: 0.91) and more field dependence (d: 1.62) than controls. The logistic regression model that predicts ADHD best is made up of age, CEFT, MFFT-20 and Stroop tests, and the formula derived from the model shows 85% sensitivity and 85% specificity for ADHD, regarding the DSM-IV criteria as the standard. The statistical model of cognitive styles presents valid indicators to diagnose ADHD, contributing to an increase in the objectivity of its analysis.
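The reported 85% sensitivity and 85% specificity can be checked with a small sketch of the two metrics (the prediction vectors below are hypothetical, constructed only to mirror the reported figures):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).

    Labels: 1 = ADHD case, 0 = control.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical outcome: 85 of 100 cases flagged, 85 of 100 controls cleared.
y_true = [1] * 100 + [0] * 100
y_pred = [1] * 85 + [0] * 15 + [0] * 85 + [1] * 15
sens, spec = sensitivity_specificity(y_true, y_pred)
```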

  9. Therapeutic utility of aspirin in the ApcMin/+ murine model of colon carcinogenesis

    International Nuclear Information System (INIS)

    Reuter, Brian K; Zhang, Xiao-Jing; Miller, Mark JS

    2002-01-01

In recent years it has become evident that nonsteroidal anti-inflammatory drugs, in particular aspirin, represent a potential class of cancer chemotherapeutic agents. Despite the wealth of knowledge gained from epidemiological, clinical and animal studies, the effectiveness of aspirin to treat established gastrointestinal cancer has not been determined. The present study examines the ability of aspirin to treat established polyposis in Min/+ mice. Min/+ mice with established polyposis were treated orally once daily from 12–16 weeks of age with either drug vehicle or aspirin (25 mg/kg). Upon completion of treatment, the number, location and size of intestinal tumours were determined. Additional variables examined were the number of apoptotic cells within tumours and COX activity. Administration of aspirin for 4 weeks to Min/+ mice produced no effect on tumour number compared to vehicle-treated Min/+ mice (65 ± 8 vs. 63 ± 9, respectively). In addition, aspirin had no effect on tumour size or location. However, aspirin treatment produced a greater than 2-fold (p < 0.05) increase in the number of apoptotic positive cells within tumours and significantly decreased hepatic PGE2 content. Aspirin was found to have no effect on tumour number and size when administered to Min/+ mice with established polyposis. The findings in the present study call into question the utility of aspirin as a stand-alone treatment for established GI cancer. However, aspirin's ability to significantly promote apoptosis may render it suitable for use in combinatorial chemotherapy.

  10. Quality Requirements Put On The Inconel 625 Austenite Layer Used On The Sheet Pile Walls Of The Boiler’s Evaporator To Utilize Waste Thermally

    Directory of Open Access Journals (Sweden)

    Słania J.

    2015-06-01

Quality requirements for and tests performed on the Inconel 625 surfacing layer are presented in the article. The reasons for using the Inconel 625 surfacing layer and the technologies of its production, with particular emphasis on the CMT method, are described. Quality requirements for the Inconel 625 surfacing weld are provided. Basic requirements included in Merkblatt 1166, as well as additional requirements reflected in the technical specifications of boiler producers, are specified.

  11. Using direct normal irradiance models and utility electrical loading to assess benefit of a concentrating solar power plant

    Science.gov (United States)

    Direct normal irradiance (DNI) is required to evaluate performance of concentrating solar energy systems. The objective of this paper is to analyze the effect of time interval (e.g. year, month, hour) on the accuracy of three different DNI models. The DNI data were measured at three different labora...

  12. Least cost, utility scale abatement from Australia's NEM (National Electricity Market). Part 1: Problem formulation and modelling

    International Nuclear Information System (INIS)

    Jeppesen, M.; Brear, M.J.; Chattopadhyay, D.; Manzie, C.; Dargaville, R.; Alpcan, T.

    2016-01-01

    This paper is the first of a two part study that considers long term, least cost, GHG (greenhouse gas) abatement pathways for an electricity system. Part 1 formulates a planning model to optimise these pathways and presents results for a single reference scenario. Part 2 applies this model to different scenarios and considers the policy implications. The planning model formulated has several constraints which are important when considering GHG abatement and widespread uptake of intermittent renewable generation. These constraints do not appear to have been integrated into a single planning model previously, and include constraints on annual GHG emissions, unit commitment, storage, plant dynamics and intermittent renewable generation. The model prioritises overall abatement, and therefore does not include a price on carbon or support for any particular technology. The model is applied to Australia's NEM (National Electricity Market) as an example. All model inputs – for technologies, demand, and meteorological data – are from the most current and authoritative public sources. As such, the results are transparently derived and both policy and technology neutral. For the reference scenario presented here, key technologies are wind from 2015, gas generation from 2030, and solar generation from 2040. - Highlights: • An electricity system planning model for least-cost long term abatement is presented. • The model has inter-temporal constraints (linearised commitment and utility storage). • The model is applied to Australia's NEM (National Electricity Market) as a case study. • Key technologies: wind from 2015, CCGT from 2030, and solar (PV & thermal) from 2040.
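The least-cost-with-emissions-cap idea can be illustrated with a toy two-plant dispatch (this is not the paper's unit-commitment formulation; plant names, costs and the cap are invented): the cheap, emissions-intensive plant is used only up to the level the GHG cap allows, and the clean plant supplies the rest.

```python
def two_plant_dispatch(demand, cost, emis, cap):
    """Least-cost dispatch of a cheap/dirty plant ('coal') and an
    expensive/clean plant ('wind') subject to an emissions cap.

    The binding constraint g_coal*e_coal + (demand - g_coal)*e_wind <= cap
    gives the maximum dirty generation in closed form.
    """
    g_dirty_max = (cap - demand * emis["wind"]) / (emis["coal"] - emis["wind"])
    g_coal = max(0.0, min(demand, g_dirty_max))
    g_wind = demand - g_coal
    total_cost = g_coal * cost["coal"] + g_wind * cost["wind"]
    return {"coal": g_coal, "wind": g_wind, "cost": total_cost}

plan = two_plant_dispatch(
    demand=100.0,                        # MWh (illustrative)
    cost={"coal": 30.0, "wind": 80.0},   # $/MWh (illustrative)
    emis={"coal": 0.9, "wind": 0.0},     # tCO2/MWh
    cap=45.0,                            # tCO2
)
```

With these numbers the cap permits at most 50 MWh of coal, so the dispatch splits 50/50; a real planning model adds commitment, storage and dynamic constraints on top of this basic trade-off.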

  13. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds-averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull-distributed wind speed and von Mises-distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against...
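The few-evaluation idea behind such AEP estimates can be sketched with a toy numerical-integration example (the power curve, Weibull parameters and node counts are invented for illustration, and simple midpoint quadrature stands in for the paper's polynomial chaos techniques):

```python
import math

def weibull_pdf(v, k=2.0, lam=8.0):
    """Weibull wind-speed density (illustrative shape and scale)."""
    return (k / lam) * (v / lam) ** (k - 1) * math.exp(-((v / lam) ** k))

def power_curve(v):
    """Toy turbine: cubic ramp from cut-in 3 m/s to rated at 12 m/s,
    cut-out at 25 m/s; output normalized to rated power."""
    if v < 3.0 or v > 25.0:
        return 0.0
    if v >= 12.0:
        return 1.0
    return ((v - 3.0) / 9.0) ** 3

def expected_power(n_eval=64, v_max=30.0):
    """Approximate E[P] = integral of P(v) f(v) dv with n_eval model
    evaluations (midpoint rule); AEP = E[P] * rated power * 8760 h."""
    dv = v_max / n_eval
    return sum(power_curve((i + 0.5) * dv) * weibull_pdf((i + 0.5) * dv) * dv
               for i in range(n_eval))

aep_factor = expected_power()
```

The point mirrored here is that a well-chosen, small set of evaluation nodes already pins down the expectation; for an expensive CFD flow model, every saved evaluation matters.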

  14. Utility of the CIPP Model for Evaluating an Established Career Program in a Community College.

    Science.gov (United States)

    Hecht, Alfred R.

    How useful is Stufflebeam's Context, Input, Process, Product (CIPP) model for evaluating an established career program in a community college? On the basis of a case study, advantages of using CIPP include: comprehensiveness, flexibility, integration and decision-orientation. Implementation problems include: establishing procedures for delineating…

  15. Pig Models of Neurodegenerative Disorders: Utilization in Cell Replacement-Based Preclinical Safety and Efficacy Studies

    Czech Academy of Sciences Publication Activity Database

    Doležalová, D.; Hruška-Plocháň, M.; Bjarkam, C. R.; Sorensen, J. C. H.; Cunningham, M.; Weingarten, D.; Ciacci, J. D.; Juhás, Štefan; Juhásová, Jana; Motlík, Jan; Hefferan, M. P.; Hazel, T.; Johe, K.; Carromeu, C.; Muotri, A.; Bui, J. D.; Strnádel, J.; Marsala, M.

    2014-01-01

    Roč. 522, č. 12 (2014), s. 2784-2801 ISSN 0021-9967 R&D Projects: GA TA ČR(CZ) TA01011466; GA MŠk ED2.1.00/03.0124 Institutional support: RVO:67985904 Keywords : pig * neurodegenerative models * stem cells Subject RIV: FH - Neurology Impact factor: 3.225, year: 2014

  16. Utilizing the Active and Collaborative Learning Model in the Introductory Physics Course

    Science.gov (United States)

    Nam, Nguyen Hoai

    2014-01-01

The model of active and collaborative learning (ACLM), applied to the teaching of a specific subject, offers clear advantages with respect to the knowledge and skills students need to develop for a successful future job. The author exploits the learning management system (LMS) of Hanoi National University of Education (HNUE) to establish a learning environment in the…

  17. The Fixed-Effects Zero-Inflated Poisson Model with an Application to Health Care Utilization

    NARCIS (Netherlands)

    Majo, M.C.; van Soest, A.H.O.

    2011-01-01

Response variables that are scored as counts and that present a large number of zeros often arise in quantitative health care analysis. We define a zero-inflated Poisson model with fixed effects in both of its equations to identify respondent and health-related characteristics associated with
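The zero-inflated Poisson distribution the abstract refers to can be sketched in a few lines (the parameter values in the example are arbitrary; the fixed-effects estimation itself is beyond a short snippet):

```python
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under a zero-inflated Poisson.

    With probability pi the count is a structural zero (e.g. a person who
    never uses health care); otherwise the count is Poisson(lam).
    """
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    if y == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

# Excess-zero effect: P(Y=0) is inflated well above the plain Poisson value.
p_zero = zip_pmf(0, lam=2.0, pi=0.3)
```

The two-equation structure arises because `pi` and `lam` each get their own linear predictor (with fixed effects) in the full model.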

  18. Utilizing Natural Structure of the Research Literature in Psychology as a Model for Bibliographic Instruction.

    Science.gov (United States)

    Olivetti, L. James

    1979-01-01

    Offered as an alternative to the search strategy model for bibliographic instruction, the approach to library instruction in psychology which is described involves analysis of the natural structure of the research literature. An example using Festinger's theory of cognitive dissonance is presented. Twelve references are cited. (EJS)

  19. Real-time resource model updating in continuous mining environment utilizing online sensor data

    NARCIS (Netherlands)

    Yüksel, C.

    2017-01-01

In mining, modelling of the deposit geology is the basis for many actions to be taken in the future, such as predictions of quality attributes, mineral resources and ore reserves, as well as mine design and long-term production planning. The essential knowledge about the raw material product is based

  20. The utility of behavioral economics in expanding the free-feed model of obesity.

    Science.gov (United States)

    Rasmussen, Erin B; Robertson, Stephen H; Rodriguez, Luis R

    2016-06-01

Animal models of obesity are numerous and diverse in terms of identifying specific neural and peripheral mechanisms related to obesity; however, they are limited when it comes to behavior. The standard behavioral measure of food intake in most animal models occurs in a free-feeding environment. While easy and cost-effective for the researcher, the free-feeding environment omits some of the most important features of obesity-related food consumption, namely properties of food availability such as effort and delay to obtaining food. Behavioral economics expands behavioral measures of obesity animal models by identifying such behavioral mechanisms. First, economic demand analysis allows researchers to understand the role of effort in food procurement, and how physiological and neural mechanisms are related. Second, studies on delay discounting contribute to a growing literature that shows that sensitivity to delayed food- and food-related outcomes is likely a fundamental process of obesity. Together, these data expand the animal model in a manner that better characterizes how environmental factors influence food consumption. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Utilizing uncoded consultation notes from electronic medical records for predictive modeling of colorectal cancer

    NARCIS (Netherlands)

    Hoogendoorn, Mark; Szolovits, Peter; Moons, Leon M G; Numans, ME

    2016-01-01

    OBJECTIVE: Machine learning techniques can be used to extract predictive models for diseases from electronic medical records (EMRs). However, the nature of EMRs makes it difficult to apply off-the-shelf machine learning techniques while still exploiting the rich content of the EMRs. In this paper,

  2. Draft Genome Sequence of the Model Naphthalene-Utilizing Organism Pseudomonas putida OUS82

    DEFF Research Database (Denmark)

    Tay, Martin; Roizman, Dan; Cohen, Yehuda

    2014-01-01

    Pseudomonas putida OUS82 was isolated from petrol- and oil-contaminated soil in 1992, and ever since, it has been used as a model organism to study the microbial assimilation of naphthalene and phenanthrene. Here, we report the 6.7-Mb draft genome sequence of P. putida OUS82 and analyze its...

  3. A utility-based suitability framework for integrated local-scale land-use modelling

    NARCIS (Netherlands)

    Koomen, E.; Pinto Nunes Nogueira Diogo, V.; Dekkers, J.E.C.; Rietveld, P.

    2015-01-01

    Models that simulate land-use patterns often use either inductive, data-driven approaches or deductive, theory-based methods to describe the relative strength of the social, economic and biophysical forces that drive the various sectors in the land system. An integrated framework is proposed here

  4. Prevention of radiation-induced salivary gland dysfunction utilizing a CDK inhibitor in a mouse model.

    Directory of Open Access Journals (Sweden)

    Katie L Martin

Treatment of head and neck cancer with radiation often results in damage to surrounding normal tissues such as salivary glands. Permanent loss of function in the salivary glands often leads patients to discontinue treatment due to incapacitating side effects. It has previously been shown that IGF-1 suppresses radiation-induced apoptosis and enhances G2/M arrest, leading to preservation of salivary gland function. In an effort to recapitulate the effects of IGF-1, as well as increase the likelihood of translating these findings to the clinic, the small-molecule therapeutic Roscovitine is being tested. Roscovitine is a cyclin-dependent kinase inhibitor that acts to transiently inhibit cell cycle progression and allow for DNA repair in damaged tissues. Treatment with Roscovitine prior to irradiation induced a significant increase in the percentage of cells in the G2/M phase, as demonstrated by flow cytometry. In contrast, mice treated with radiation exhibit no differences in the percentage of cells in G2/M when compared to unirradiated controls. Similar to previous studies utilizing IGF-1, pretreatment with Roscovitine leads to a significant up-regulation of p21 expression and a significant decrease in the number of PCNA-positive cells. Radiation treatment leads to a significant increase in activated caspase-3-positive salivary acinar cells, which is suppressed by pretreatment with Roscovitine. Administration of Roscovitine prior to targeted head and neck irradiation preserves normal tissue function in mouse parotid salivary glands, both acutely and chronically, as measured by salivary output. These studies suggest that induction of transient G2/M cell cycle arrest by Roscovitine allows for suppression of apoptosis, thus preserving normal salivary function following targeted head and neck irradiation. This could have an important clinical impact by preventing the negative side effects of radiation therapy in surrounding normal tissues.

  5. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report. The discussions pertaining to the different models are contained in separate sections

  6. The realisation of legal protection requirements with the aid of models of nuclear facilities

    International Nuclear Information System (INIS)

    Wildberg, D.W.; Herrmann, H.J.

    1978-08-01

In the Federal Republic of Germany, the model-based planning, construction and operation of nuclear facilities is still in its initial stages. Based on a few examples, the authors show that, with the atomic energy legislation and with the laws in the conventional sector, the legislator enacted requirements at a relatively early stage for the protection of the individual person in the facility and for the population at large in the vicinity of the facility. However, in the realization of these protection requirements there are still problems, and these are often very basic in nature. The best solution here seems to be to tackle the problems with the help of models. This would permit subjects like serviceability, testability, use of external personnel, spatial distribution of redundancies, rescue of injured persons, fire protection measures, physical protection and the dismantling of facilities, which are multifarious in nature and have overlapping requirements, to be presented and discussed in greater depth and detail. The positive aspects of the use of models are presented, and the advantages and disadvantages of models are discussed in detail. Finally, the variety of models which can be used during the different phases of a nuclear facility is discussed, and some remarks are made regarding the costs of models. One section of the report deals with examples of the practical use of models: models have proved themselves in the past in the construction of refineries and chemical plants, and have successfully demonstrated their suitability in the field of nuclear technology. The examples of these need not be limited to those in the Federal Republic of Germany. (orig.)

  7. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  8. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e. loan default. The theoretical model is analyzed by applying numerical procedures, in order to derive valuable insights from a financial perspective. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees banks of meeting the capital requirements of Basel III with a likelihood of 95 % irrespective of changes in the future market value of assets.
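The deterministic counterpart of a 95% chance constraint can be illustrated under a simplifying normal approximation (a generic sketch, not the paper's CreditMetrics-based formulation; the threshold and numbers are invented):

```python
Z_95 = 1.645  # standard-normal quantile for a 95% guarantee

def meets_capital_requirement(mu_ratio, sigma_ratio, theta=0.08, z=Z_95):
    """Deterministic counterpart of P(ratio >= theta) >= 0.95 when the
    future capital-to-risk-asset ratio is approximately normal:

        mu - z * sigma >= theta

    i.e. the chance constraint becomes a single convex inequality on the
    mean and standard deviation of the ratio.
    """
    return mu_ratio - z * sigma_ratio >= theta

# Illustrative bank: expected ratio 12%, std. dev. 2%, Basel floor 8%.
ok = meets_capital_requirement(0.12, 0.02)  # 0.12 - 1.645*0.02 = 0.0871
```

The same bank with a mean ratio of 10% would fail the 95% guarantee even though its expected ratio exceeds the 8% floor, which is exactly the margin-for-uncertainty that the chance constraint buys.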

  9. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling... An important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  10. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    Science.gov (United States)

    Matsumura, Koki; Kawamoto, Masaru

This paper proposed a new technique which builds strategy trees for the derivative (option) trading investment decision based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses a technical analysis based on a statistical, experience-based technique for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), which is one of the evolutionary computations. Moreover, this paper proposed a method using the prospect theory of behavioral finance to set psychological bias for profit and deficit, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, this technique produced a good result and demonstrated the effectiveness of the trading model through the optimized dealing strategy.

  11. Utilization of poplar wood sawdust for heavy metals removal from model solutions

    Directory of Open Access Journals (Sweden)

    Demcak Stefan

    2017-06-01

Some kinds of natural organic materials have a potential for the removal of heavy metal ions from wastewater. It is well known that cellulosic waste materials or by-products can be used as cheap adsorbents in the chemical treatment process. In this paper, poplar wood sawdust was used for the removal of Cu(II), Zn(II) and Fe(II) ions from model solutions using static and dynamic adsorption experiments. Infrared spectrometry of the poplar wood sawdust confirmed the presence of the functional groups which correspond with hemicelluloses, cellulose and lignin. Static adsorption achieved approximately 80 % efficiency for all treated model solutions. Similar efficiency of the adsorption process was reached after 5 min under dynamic conditions. The highest efficiency of Cu(II) removal (98 %) was observed after 30 min of dynamic adsorption. Changes of pH values confirmed a mechanism of ion exchange at the beginning of the adsorption process.
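Removal efficiencies like those quoted are computed from inlet and outlet concentrations with the standard formula (the concentration values below are hypothetical, chosen only to mirror the ~98 % figure):

```python
def removal_efficiency(c0, ct):
    """Percent of metal ions removed from solution: (C0 - Ct) / C0 * 100,
    where C0 is the initial and Ct the residual concentration (same units,
    e.g. mg/L)."""
    return (c0 - ct) / c0 * 100.0

# Hypothetical Cu(II) concentrations before and after 30 min of dynamic
# adsorption on the sawdust column.
eff = removal_efficiency(10.0, 0.2)  # -> 98.0 %
```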

  12. Utilization of ICU Data to Improve 30 and 60 Day HENRE Mortality Models, Revision 1

    Science.gov (United States)

    2017-05-12

gender and available medical treatment (Stricklin, 2016). However, when burn is included, this model does not consider individual demographic... indicating that the NLR only weakly discriminates between surviving and non-surviving patients in this study. Kaplan-Meier curves of patients grouped by... for demographics (age and gender) in the context of 60-day radiation-induced mortality. In the future, we would like to account for demographics in the

  13. Modelling, Simulation and Optimisation of Utility – Service Provision for Households: Case Studies

    OpenAIRE

    Strzelecka, A.; Skworcow, P.; Ulanicki, B.

    2014-01-01

In the research presented in this paper, household case studies were considered. The main objective of this research is to investigate models and algorithms for alternative approaches to current utility–service provision. This paper is focused on case studies that consider standard solutions to utility–service provision problems and proposes improvements to these solutions. Results are obtained using a simulation system developed in C#. The simulation system evaluates the feasibility of proposed candid...

  14. Studying the personality profile of drug addicts by utilizing two models of Cloninger and Eysneck

    Directory of Open Access Journals (Sweden)

    2008-11-01

Objectives: Identifying the personality factors behind a tendency to drugs can be helpful in the better recognition and treatment of drug dependency, and providing counselling and psychological services can partly prevent vulnerable people from becoming addicted. This research therefore aims to investigate the personality profile of substance-dependent men using the personality models of Cloninger and Eysenck. Methods: 100 substance-dependent and 100 normal men were selected by convenience sampling and completed the Temperament and Character Inventory of Cloninger (TCI) and the Eysenck Personality Questionnaire-Revised (EPQ-R). Demographic information regarding the substance-dependent participants was also collected. Research data were analyzed by inferential and descriptive statistics. Findings: This research showed that there are significant differences between substance-dependent men and normal men in the temperament dimensions of novelty seeking and harm avoidance and the character dimensions of self-direction and cooperativeness of the Cloninger model, and in the neuroticism and psychoticism dimensions of the Eysenck model. Results: Compared with normal men, substance-dependent men gained higher scores on novelty seeking, neuroticism and psychoticism, and lower scores on self-direction and cooperativeness.

  15. PRAGMATICS DRIVEN LAND COVER SERVICE COMPOSITION UTILIZING BEHAVIOR-INTENTION MODEL

    Directory of Open Access Journals (Sweden)

    H. Wu

    2016-06-01

Web service composition is one of the key issues in developing a global land cover (GLC) information service portal. Addressing the defect that traditional syntactic and semantic service composition finds it difficult to take pragmatic information into account, the paper first analyses the three tiers of web service language and their succession relations, discusses the conceptual model of the pragmatic web service, and proposes a pragmatics-oriented adaptive composition approach based on the analysis of some examples. On this basis, it puts forward a pragmatic web service model based on Behavior-Intention through the presetting and expression of service usability, users' intention and other pragmatic information; develops an on-demand assembly method based on agent theory together with matching and reconstruction methods for heterogeneous messages; solves the key technological issues of algorithm applicability and heterogeneous message transformation in the process of composing land cover web services; applies these methods to service combination; puts forward the pragmatics-driven service composition method based on the Behavior-Intention model; and effectively settles the issue of coordination and interaction in composite service invocation.

  16. Zebrafish erythropoiesis and the utility of fish as models of anemia.

    Science.gov (United States)

    Kulkeaw, Kasem; Sugiyama, Daisuke

    2012-12-20

Erythrocytes deliver oxygen-carrying hemoglobin to all body cells. Impairments in the generation of erythrocytes, a process known as erythropoiesis, or in hemoglobin synthesis alter cell function because of decreased oxygen supply and lead to anemic diseases. Thus, understanding how erythropoiesis is regulated during embryogenesis and adulthood is important to develop novel therapies for anemia. The zebrafish, Danio rerio, provides a powerful model for such study. Their small size and the ability to generate a large number of embryos enable large-scale analysis, and their transparency facilitates the visualization of erythroid cell migration. Importantly, the high conservation of hematopoietic genes among vertebrates and the ability to successfully transplant hematopoietic cells into fish have enabled the establishment of models of human anemic diseases in fish. In this review, we summarize the current progress in our understanding of erythropoiesis on the basis of zebrafish studies and highlight fish models of human anemias. These analyses could enable the discovery of novel drugs as future therapies.

  17. Predicting adherence to antiretroviral therapy among pregnant women in Guyana: Utility of the Health Belief Model.

    Science.gov (United States)

    Vitalis, Deborah

    2017-07-01

    Barriers to antiretroviral therapy (ART) adherence among pregnant women are varied and complex. This study explored the constructs of a theoretical model, the Health Belief Model (HBM), to understand and predict ART adherence among pregnant women in Guyana. A cross-sectional study surveyed 108 pregnant women attending 11 primary care clinics. ART adherence over recall periods ranging from the past weekend to the past three months was assessed through self-reports, and health beliefs were assessed with the Adherence Determinants Questionnaire (ADQ). Constructs with sufficient variation in responses were tested for association with the level of adherence using Spearman's rank correlation coefficient and its associated significance test. Sixty-seven per cent (n = 72) of the women reported being always adherent. Although there was positive endorsement of ART treatment and adherence, the HBM did not help in understanding or predicting ART adherence in this population. Only one item from the perceived susceptibility construct was significantly associated (p = 0.009) with adherence. Interventions are warranted to address ART adherence in this population, as 19% of the women were recently non-adherent. Although the ADQ did not contribute to a deeper understanding or provide insight into pathways that can be targeted for intervention, theoretical models can play a key role in identifying these pathways.
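    The association test used in this study, Spearman's rank correlation, can be sketched in pure Python. The `spearman_rho` helper and the sample data below are illustrative only, not taken from the study; Spearman's coefficient is simply Pearson's correlation computed on the (tie-averaged) ranks of the two variables, which suits ordinal measures such as adherence levels.

    ```python
    def average_ranks(values):
        # Rank observations (1-based), assigning tied values the mean of their ranks.
        order = sorted(range(len(values)), key=lambda i: values[i])
        ranks = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman_rho(x, y):
        # Pearson correlation of the rank vectors of x and y.
        rx, ry = average_ranks(x), average_ranks(y)
        n = len(rx)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)

    # Hypothetical data: an ordinal belief score vs. an ordinal adherence level.
    belief = [1, 2, 3, 4, 5]
    adherence = [2, 1, 3, 5, 4]
    rho = spearman_rho(belief, adherence)
    ```

    A rho near +1 indicates that higher belief scores consistently go with higher adherence; values near 0, as the study largely found, indicate no monotone association.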

  18. Aging influence on grey matter structural associations within the default mode network utilizing Bayesian network modeling

    Directory of Open Access Journals (Sweden)

    Yan eWang

    2014-05-01

    Recent neuroimaging studies have revealed normal aging-related alterations in functional and structural brain networks such as the default mode network (DMN). However, less is understood about specific brain structural dependencies or interactions between brain regions within the DMN in the normal aging process. In this study, using Bayesian network (BN) modeling, we analyzed grey matter volume data from 109 young and 82 old subjects to characterize the influence of aging on associations between core brain regions within the DMN. Furthermore, we investigated the discriminability of the aging-associated BN models for the young and old groups. Compared to their young counterparts, the old subjects showed significant reductions in connections from right inferior temporal cortex (ITC) to medial prefrontal cortex (mPFC), right hippocampus (HP) to right ITC, and mPFC to posterior cingulate cortex (PCC), and increases in connections from left HP to mPFC and right inferior parietal cortex (IPC) to right ITC. Moreover, the classification results showed that the aging-related BN models could predict group membership with 88.48% accuracy, 88.07% sensitivity and 89.02% specificity. Our findings suggest that structural associations within the DMN may be affected by normal aging and provide crucial information about aging effects on brain structural networks.

  19. Discussing Landscape Compositional Scenarios Generated with Maximization of Non-Expected Utility Decision Models Based on Weighted Entropies

    Directory of Open Access Journals (Sweden)

    José Pinto Casquilho

    2017-02-01

    The search for hypothetical optimal solutions of landscape composition is a major issue in landscape planning, and it can be outlined in a two-dimensional decision space involving economic value and landscape diversity, the latter being considered a potential safeguard for the provision of services and externalities not accounted for in the economic value. In this paper, we use decision models with different utility valuations combined with weighted entropies that respectively incorporate rarity factors associated with the Gini-Simpson and Shannon measures. A small example of this framework is provided and discussed for landscape compositional scenarios in the region of Nisa, Portugal. The optimal solutions for the different cases considered are assessed in the two-dimensional decision space using a benchmark indicator. The results indicate that the likely best combination is achieved by the solution using Shannon weighted entropy and a square-root utility function, corresponding to risk-averse behavior associated with the precautionary principle of safeguarding landscape diversity, anchoring the provision of ecosystem services and other externalities. Further developments are suggested, mainly regarding the hypothesis that the decision models outlined here could be used to revisit the stability-complexity debate in the field of ecological studies.
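    The weighted-entropy diversity measures discussed in this abstract can be written down concretely. A minimal sketch, assuming the common forms H_w = -Σᵢ wᵢ pᵢ ln pᵢ for the weighted Shannon entropy and Σᵢ wᵢ pᵢ (1 - pᵢ) for the weighted Gini-Simpson index; the land-cover proportions and rarity weights below are hypothetical, not the Nisa data:

    ```python
    import math

    def shannon_weighted(p, w):
        # H_w = -sum_i w_i * p_i * ln(p_i); classes with p_i == 0 contribute 0.
        return -sum(wi * pi * math.log(pi) for pi, wi in zip(p, w) if pi > 0)

    def gini_simpson_weighted(p, w):
        # GS_w = sum_i w_i * p_i * (1 - p_i)
        return sum(wi * pi * (1 - pi) for pi, wi in zip(p, w))

    # Hypothetical landscape composition: proportions of four cover classes,
    # each with a rarity weight (rarer classes weighted more heavily).
    proportions = [0.5, 0.3, 0.15, 0.05]
    rarity = [1.0, 1.2, 1.5, 2.0]
    h = shannon_weighted(proportions, rarity)
    gs = gini_simpson_weighted(proportions, rarity)
    ```

    With all weights equal to 1 these reduce to the ordinary Shannon entropy and Gini-Simpson index, so the weights act purely as a rarity adjustment on top of the classical diversity measures.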

  20. Improving sand and gravel utilization and land-use planning. - 3D-modelling gravel resources with geospatial data.

    Science.gov (United States)

    Rolstad Libach, Lars; Wolden, Knut; Dagestad, Atle; Eskil Larsen, Bjørn

    2017-04-01

    The Norwegian aggregate industry produces approximately 14 million tons of sand and gravel aggregates annually, to a value of approximately 100 million Euros. Utilization of aggregates is often linked to land-use conflicts and complex environmental impacts at the extraction site. These topics are managed at the local municipal level in Norway. The Geological Survey of Norway maintains a database and a web map service with information about sand and gravel deposits of considerable volume, including an importance evaluation. Some of the deposits cover large areas where land-use conflicts are high. To ease and improve land-use planning, safeguard other important resources such as groundwater, and support sustainable utilization of sand and gravel resources, there is a need for more detailed information on already mapped important resources. Detailed 3D models of gravel deposits are a tool for better land-use and resource management. By combining seismic, GPR and resistivity geophysical profile data, borehole data, Quaternary maps and lidar surface data, it has been possible to build 3D models of deposits and to further investigate the possibilities for distinguishing different qualities and volumes. Good datasets and a detailed resource map are prerequisites for assessing geological resources on behalf of planners, extractors and neighbours. Future challenges lie in the use of often old geophysical data, and in combining these datasets. What kind of information can be extracted from depth data that actually supports a more detailed delineation of resources?
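    One elementary building block of such 3D resource models is turning an interpolated deposit-thickness grid (e.g., lidar surface minus a bedrock surface from geophysics and boreholes) into a volume estimate. A minimal sketch, with a hypothetical thickness grid and cell size not drawn from the survey's data:

    ```python
    def deposit_volume(thickness_grid, cell_area_m2):
        # Sum cell thicknesses (m) over a regular grid and multiply by the
        # cell footprint (m^2) to get an approximate deposit volume in m^3.
        return sum(t * cell_area_m2 for row in thickness_grid for t in row)

    # Hypothetical 3 x 3 grid of interpolated gravel thicknesses (metres)
    # on a 10 m x 10 m raster (cell footprint 100 m^2).
    thickness = [
        [2.0, 3.5, 1.0],
        [4.0, 5.5, 2.5],
        [1.5, 2.0, 0.5],
    ]
    volume_m3 = deposit_volume(thickness, 100.0)
    ```

    In practice the thickness grid would come from gridding the geophysical and borehole depth data, and cells could be further split by quality class before summing, which is essentially how a volume-by-quality breakdown of a deposit is obtained.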