WorldWideScience

Sample records for based depletion methodology

  1. Development of a practical fuel management system for PSBR based on advanced three-dimensional Monte Carlo coupled depletion methodology

    Science.gov (United States)

    Tippayakul, Chanatip

    The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: development of models and methods, and analyses and validation of the developed models and methods. The starting point of this research was the use of the previously developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (end of the core loading). The normalized power results of the Monte Carlo model agreed reasonably well with those of the current fuel management system (using HELIOS/ADMARC-H), with differences of 2-3% on average. Moreover, the reactivity of some fuel elements was calculated with the Monte Carlo model and compared with measured data; the fuel element reactivity results of the Monte Carlo model were in good agreement with the measurements. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in individual control rod worths between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities in order to validate the Monte Carlo model against the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capabilities, its accuracy could be further improved by adopting more advanced algorithms; therefore, an upgrade of TRIGSIM was planned. The first task of upgrading TRIGSIM involved improving the temperature modeling capability. The new TRIGSIM was
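
    A minimal toy sketch of the flux-depletion coupling loop described above is given below; it is not the TRIGSIM/MCNP workflow itself, and the one-nuclide physics, uniform "transport solve", cross section and flux values are all placeholder assumptions.

    ```python
    # Toy flux-depletion coupling loop (illustrative only; not TRIGSIM or MCNP).
    # Each burn step: a stand-in "transport solve" returns a one-group flux per node,
    # then each node's U-235 inventory is depleted over the step (simple Bateman decay).
    import numpy as np

    SIGMA_F_U235 = 585e-24  # approximate thermal fission cross section, cm^2

    def toy_transport(n_nodes, total_flux=1.0e13):
        # Placeholder for a Monte Carlo transport solve: uniform flux over all nodes.
        return np.full(n_nodes, total_flux)  # n/cm^2/s

    def deplete_step(n_u235, flux, dt_s):
        # dN/dt = -sigma_f * phi * N  ->  exponential depletion over the step
        return n_u235 * np.exp(-SIGMA_F_U235 * flux * dt_s)

    def run_coupled_depletion(n_u235_per_node, burn_days, steps):
        dt_s = burn_days * 86400.0 / steps
        for _ in range(steps):
            flux = toy_transport(len(n_u235_per_node))  # re-solve transport each step
            n_u235_per_node = deplete_step(n_u235_per_node, flux, dt_s)
        return n_u235_per_node

    if __name__ == "__main__":
        nodes = np.full(4, 1.0e21)  # U-235 atoms/cm^3 in four core nodes
        print(run_coupled_depletion(nodes, burn_days=300.0, steps=10))
    ```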

  2. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kiwhan [Los Alamos National Laboratory]; Beddingfield, David H. [Los Alamos National Laboratory]; Geist, William H. [Los Alamos National Laboratory]; Lee, Sang-Yoon [unaffiliated]

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. Conclusions of this presentation are: (1) A fuel depletion methodology was established and its safeguards application demonstrated; (2) The fuel is proliferation resistant at high discharge burnup (~80 GWd/MTHM) - unfavorable isotopics, a high number of pebbles needed, and pebbles that are harder to reprocess; (3) Spent fuel should remain under safeguards comparable to those for LWR fuel; and (4) Diversion scenarios were not considered, but such analyses can be performed.

  3. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.; Petrovic, B.

    2003-04-01

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full-core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full-core calculation of the node power distribution with a 1D Wigner-Seitz equivalent-cell transport method for independent depletion calculation of each of the nodes. This approach reduces by more than an order of magnitude the time required to obtain results comparable to those of the MOCUP code system. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  4. Adding trend data to Depletion-Based Stock Reduction Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A Bayesian model of Depletion-Based Stock Reduction Analysis (DB-SRA), informed by a time series of abundance indexes, was developed, using the Sampling Importance...

  5. Evaluation of acute tryptophan depletion and sham depletion with a gelatin-based collagen peptide protein mixture

    DEFF Research Database (Denmark)

    Stenbæk, Dea Siggaard; Einarsdottir, H S; Goregliad-Fjaellingsdal, T

    2016-01-01

    Acute Tryptophan Depletion (ATD) is a dietary method used to modulate central 5-HT to study the effects of temporarily reduced 5-HT synthesis. The aim of this study is to evaluate a novel method of ATD using a gelatin-based collagen peptide (CP) mixture. We administered CP-Trp or CP+Trp mixtures...

  6. Electrofishing mark-recapture and depletion methodologies evoke behavioral and physiological changes in cutthroat trout

    Science.gov (United States)

    Mesa, M. G.; Schreck, C.B.

    1989-01-01

    We examined the behavioral and physiological responses of wild and hatchery-reared cutthroat trout Oncorhynchus clarki subjected to a single electroshock, electroshock plus marking, and multiple electroshocks in natural and artificial streams. In a natural stream, cutthroat trout released after capture by electrofishing and marking showed distinct behavioral changes: fish immediately sought cover, remained relatively inactive, did not feed, and were easily approached by a diver. An average of 3–4 h was required for 50% of the fish to return to a seemingly normal mode of behavior, although responses varied widely among collection sites. Using the depletion method, we observed little change in normal behavior of fish remaining in the stream section (i.e., uncaptured fish) after successive passes with electrofishing gear. In an artificial stream, hatchery-reared and wild cutthroat trout immediately decreased their rates of feeding and aggression after they were electroshocked and marked. Hatchery fish generally recovered in 2–3 h; wild fish required at least 24 h to recover. Analysis of feeding and aggression data by hierarchical rank revealed no distinct recovery trends among hatchery fish of different ranks; among wild cutthroat trout, however, socially dominant fish seemed to recover faster than intermediate and subordinate fish. Physiological indicators of stress (plasma cortisol and blood lactic acid) increased significantly in cutthroat trout subjected to electroshock plus marking or single or multiple electroshocks. As judged by the magnitude of the greatest change in cortisol and lactate, multiple electroshocks elicited the most severe stress response; however, plasma concentrations of both substances had returned to unstressed control levels by 6 h after treatment. It was evident that electrofishing and the procedures involved with estimating fish population size elicited a general stress response that was manifested not only physiologically but also

  7. Development of a micro-depletion model to use WIMS properties in history-based local-parameter calculations in RFSP

    International Nuclear Information System (INIS)

    Shen, W.

    2004-01-01

    A micro-depletion model has been developed and implemented in the *SIMULATE module of RFSP to use WIMS-calculated lattice properties in history-based local-parameter calculations. A comparison between the micro-depletion and WIMS results for each type of lattice cross section and for the infinite-lattice multiplication factor was also performed for a fuel similar to that which may be used in the ACR. The comparison shows that the micro-depletion calculation agrees well with the WIMS-IST calculation. The relative differences in k-infinity are within ±0.5 mk and ±0.9 mk for perturbation and depletion calculations, respectively. The micro-depletion model gives the *SIMULATE module of RFSP the capability to use WIMS-calculated lattice properties in history-based local-parameter calculations without resorting to the Simple-Cell-Methodology (SCM) surrogate for CANDU core-tracking simulations. (author)

  8. Mechanism-based biomarker gene sets for glutathione depletion-related hepatotoxicity in rats

    International Nuclear Information System (INIS)

    Gao Weihua; Mizukawa, Yumiko; Nakatsu, Noriyuki; Minowa, Yosuke; Yamada, Hiroshi; Ohno, Yasuo; Urushidani, Tetsuro

    2010-01-01

    Chemical-induced glutathione depletion is thought to be caused by two types of toxicological mechanisms: PHO-type glutathione depletion [glutathione conjugated with chemicals such as phorone (PHO) or diethyl maleate (DEM)], and BSO-type glutathione depletion [i.e., glutathione synthesis inhibited by chemicals such as L-buthionine-sulfoximine (BSO)]. In order to identify mechanism-based biomarker gene sets for glutathione depletion in rat liver, male SD rats were treated with various chemicals including PHO (40, 120 and 400 mg/kg), DEM (80, 240 and 800 mg/kg), BSO (150, 450 and 1500 mg/kg), and bromobenzene (BBZ, 10, 100 and 300 mg/kg). Liver samples were taken 3, 6, 9 and 24 h after administration and examined for hepatic glutathione content, physiological and pathological changes, and gene expression changes using Affymetrix GeneChip Arrays. To identify differentially expressed probe sets in response to glutathione depletion, we focused on the following two courses of events for the two types of mechanisms of glutathione depletion: a) gene expression changes occurring simultaneously in response to glutathione depletion, and b) gene expression changes after glutathione was depleted. The gene expression profiles of the identified probe sets for the two types of glutathione depletion differed markedly at times during and after glutathione depletion, whereas Srxn1 was markedly increased for both types as glutathione was depleted, suggesting that Srxn1 is a key molecule in oxidative stress related to glutathione. The extracted probe sets were refined and verified using various compounds including 13 additional positive or negative compounds, and they established two useful marker sets. One contained three probe sets (Akr7a3, Trib3 and Gstp1) that could detect conjugation-type glutathione depletors any time within 24 h after dosing, and the other contained 14 probe sets that could detect glutathione depletors by any mechanism. These two sets, with appropriate scoring

  9. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed

  10. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  11. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    This paper presents a goal based methodology for HAZOP studies in which a functional model of the plant is used to assist in a functional decomposition of the plant starting from the purpose of the plant and continuing down to the function of a single node, e.g. a pipe section. This approach lead...

  12. EXPERIMENTAL ACIDIFICATION CAUSES SOIL BASE-CATION DEPLETION AT THE BEAR BROOK WATERSHED IN MAINE

    Science.gov (United States)

    There is concern that changes in atmospheric deposition, climate, or land use have altered the biogeochemistry of forests causing soil base-cation depletion, particularly Ca. The Bear Brook Watershed in Maine (BBWM) is a paired watershed experiment with one watershed subjected to...

  13. Experimental Acidification Causes Soil Base-Cation Depletion at the Bear Brook Watershed in Maine

    Science.gov (United States)

    Ivan J. Fernandez; Lindsey E. Rustad; Stephen A. Norton; Jeffrey S. Kahl; Bernard J. Cosby

    2003-01-01

    There is concern that changes in atmospheric deposition, climate, or land use have altered the biogeochemistry of forests causing soil base-cation depletion, particularly Ca. The Bear Brook Watershed in Maine (BBWM) is a paired watershed experiment with one watershed subjected to elevated N and S deposition through bimonthly additions of (NH4)2SO4. Quantitative soil...

  14. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for the assessment of a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time, automated assessment techniques in the form of questionnaires emerged in order to determine TRLs. Originally TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology with the aim of assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which allows for a user-friendly and easily accessible way to monitor the progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to direct the remaining development of the equipment in the right direction.

  15. Methodology base and problems of information technologies

    Science.gov (United States)

    Sovetov, Boris Y.

    1993-04-01

    The aim of any information technology is the formation and effective use of a high-quality information product. Information technology as a system provides both computer-aided problem solving for the user and automation of the information processes that in turn support the problem-solving process. The methods of information technology are therefore methods for data transmission, processing, and storage; its tools are those of methodology, mathematics, algorithms, hardware, software, and information. We propose to distinguish between global, basic, and applied information technologies, depending on the significance of the information product and the characteristics of the models, methods, and tools used. Global technology is aimed at using information resources in the social sphere as a whole. Basic technology is oriented toward a particular application sphere (industry, scientific research, design, training). The transition towards new information technology should combine a business-area model with a formal model of problem solving: computing organization based on the data concept, and development of an intelligent user interface.

  16. Depleted uranium

    International Nuclear Information System (INIS)

    Huffer, E.; Nifenecker, H.

    2001-02-01

    This document deals with the physical, chemical and radiological properties of depleted uranium. What is depleted uranium? Why does the military use depleted uranium, and what are the risks to health? (A.L.B.)

  17. Soil nutrients, aboveground productivity and vegetative diversity after 10 years of experimental acidification and base cation depletion

    Science.gov (United States)

    Mary Beth Adams; James A. Burger

    2010-01-01

    Soil acidification and base cation depletion are concerns for those wishing to manage central Appalachian hardwood forests sustainably. In this research, 2 experiments were established in 1996 and 1997 in two forest types common in the central Appalachian hardwood forests, to examine how these important forests respond to depletion of nutrients such as calcium and...

  18. Comparison of two lung clearance models based on the dissolution rates of oxidized depleted uranium

    International Nuclear Information System (INIS)

    Crist, K.C.

    1984-10-01

    An in-vitro dissolution study was conducted on two respirable oxidized depleted uranium samples. The dissolution rates generated from this study were then utilized in the International Commission on Radiological Protection Task Group lung clearance model and a lung clearance model proposed by Cuddihy. Predictions from both models based on the dissolution rates of the amount of oxidized depleted uranium that would be cleared to blood from the pulmonary region following an inhalation exposure were compared. It was found that the predictions made by both models differed considerably. The difference between the predictions was attributed to the differences in the way each model perceives the clearance from the pulmonary region. 33 references, 11 figures, 9 tables

  19. Comparison of two lung clearance models based on the dissolution rates of oxidized depleted uranium

    Energy Technology Data Exchange (ETDEWEB)

    Crist, K.C.

    1984-10-01

    An in-vitro dissolution study was conducted on two respirable oxidized depleted uranium samples. The dissolution rates generated from this study were then utilized in the International Commission on Radiological Protection Task Group lung clearance model and a lung clearance model proposed by Cuddihy. Predictions from both models based on the dissolution rates of the amount of oxidized depleted uranium that would be cleared to blood from the pulmonary region following an inhalation exposure were compared. It was found that the predictions made by both models differed considerably. The difference between the predictions was attributed to the differences in the way each model perceives the clearance from the pulmonary region. 33 references, 11 figures, 9 tables.

  20. Methodologies for Crawler Based Web Surveys.

    Science.gov (United States)

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  1. Highly linear silicon traveling wave Mach-Zehnder carrier depletion modulator based on differential drive.

    Science.gov (United States)

    Streshinsky, Matthew; Ayazi, Ali; Xuan, Zhe; Lim, Andy Eu-Jin; Lo, Guo-Qiang; Baehr-Jones, Tom; Hochberg, Michael

    2013-02-11

    We present measurements of the nonlinear distortions of a traveling-wave silicon Mach-Zehnder modulator based on the carrier depletion effect. Spurious-free dynamic range for second-harmonic distortion of 82 dB·Hz^(1/2) is seen, and 97 dB·Hz^(2/3) is measured for intermodulation distortion. This measurement represents an improvement of 20 dB over the previous best result in silicon. We also show that the linearity of a silicon traveling wave Mach-Zehnder modulator can be improved by differentially driving it. These results suggest silicon may be a suitable platform for analog optical applications.

  2. Ultra Depleted Mantle at the Gakkel Ridge Based on Hafnium and Neodymium Isotopes

    Science.gov (United States)

    Salters, V. J.; Dick, H. J.

    2011-12-01

    basalts. The relatively depleted nature of the peridotites requires that a relatively large amount of peridotite contribute to the aggregated basaltic melt. Apart from documenting the heterogeneous nature of the MORB mantle, this also indicates that, in addition to MORB-like mantle, a far more depleted mantle exists. Based on abyssal peridotite trace element compositions and on melting calculations, these extreme peridotites could have a complicated history, and this might not be the first time they have passed through a ridge melting regime. They likely represent ancient residual lithosphere. Because these peridotites are already depleted, they will contribute little in terms of major elements or incompatible trace elements to the melts. The Hf-Nd isotope variations in MORB, whereby MORB from individual ridge segments form parallel arrays offset in Hf-isotopic composition, can also be explained by the existence of a highly depleted component such as ancient residual lithosphere, called ReLish (Salters et al., 2011, G3). Liu et al. (2008) Nature 452, 311-315; Stracke et al. (2011) Earth Planet. Sci. Lett. 308, 359-368.

  3. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period

  4. A Risk-Based Sensor Placement Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ronald W [ORNL]; Kulesz, James J [ORNL]

    2006-08-01

    A sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs with the percentage of occurrence of the pairs over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm where threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats.
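
    A simplified, greedy reading of the stagewise placement idea described above is sketched below; the coverage matrix, risk weights and stopping threshold are made-up placeholders, and the published dynamic-programming formulation and atmospheric dispersion modeling are not reproduced.

    ```python
    # Greedy staged sensor placement over a threat-coverage matrix (illustrative only).
    # coverage[i, j] = True if a sensor at candidate location j detects threat scenario i.
    import numpy as np

    def place_sensors(coverage, threat_risk, max_sensors, min_marginal_utility=0.0):
        remaining = np.ones(coverage.shape[0], dtype=bool)  # threats not yet detected
        chosen, utilities = [], []
        for _ in range(max_sensors):
            # Marginal utility of each location = risk of still-uncovered threats it detects.
            gains = coverage[remaining].T @ threat_risk[remaining]
            best = int(np.argmax(gains))
            if gains[best] <= min_marginal_utility:
                break  # stopping criterion: marginal utility threshold reached
            chosen.append(best)
            utilities.append(float(gains[best]))
            remaining &= ~coverage[:, best]  # remove threats captured at this stage
        return chosen, utilities

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        cov = rng.random((50, 8)) < 0.3       # 50 threat scenarios, 8 candidate locations
        risk = rng.random(50)                 # population-at-risk weight per scenario
        print(place_sensors(cov, risk, max_sensors=4))
    ```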

  5. An object-based methodology for knowledge representation

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, R.L. [Los Alamos National Lab., NM (United States); New Mexico State Univ., Las Cruces, NM (United States)]; Hartley, R.T. [New Mexico State Univ., Las Cruces, NM (United States)]; Webster, R.B. [Los Alamos National Lab., NM (United States)]

    1997-11-01

    An object-based methodology for knowledge representation is presented. The constructs and notation of the methodology are described and illustrated with examples. The "blocks world," a classic artificial intelligence problem, is used to illustrate some of the features of the methodology, including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.

  6. A Systematic Methodology for Design of Emulsion Based Chemical Products

    DEFF Research Database (Denmark)

    Mattei, Michele; Kontogeorgis, Georgios; Gani, Rafiqul

    2012-01-01

    A systematic methodology for emulsion-based chemical product design is presented. The methodology employs a model-based product synthesis/design stage and a model-experiment based further refinement and/or validation stage. In this paper only the first stage is presented. The methodology employs a hierarchical approach starting with the identification of the needs to be satisfied by the emulsified product and then building up the formulation by adding one-by-one the different classes of chemicals. A structured database together with dedicated property prediction models and evaluation criteria…

  7. Validation Methodology for Agent-based Simulations

    Science.gov (United States)

    2007-06-01

    [Slide excerpt] In addition to ISAAC, Pythagoras, and MANA, the approach considers decision rules, knowledge-based systems, cellular automata, and population dynamics; includes a discussion of VV&A; directly applicable to the IW problem set.

  8. Value and Vision-based Methodology in Integrated Design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    The research theme of this thesis is based on a paradigm shift in design which outlines two new premises for design. First, the design must be based on a clearly defined set of values due to the socio-cultural development in our rich modern western society. A massive overload of products offer… …of this thesis is the value transformation from an explicit set of values to a product concept using a vision-based concept development methodology based on the Pyramid Model (Lerdahl, 2001) in a design team context. The aim of this thesis is to examine how the process of value transformation is occurring within… …is divided into three: the systemic unfolding of the Value and Vision-based methodology, the structured presentation of the practical implementation of the methodology, and finally the analysis and conclusion regarding the value transformation, phenomena and learning aspects of the methodology.

  9. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.

  10. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  11. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately; it identifies trends reliably and does not misinterpret a steady-state signal as a transient one
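
    As an illustration of the trend-categorization idea only (not the PROTREN feature set or rule base), the sketch below fuzzifies a single slope-based feature into increasing/steady/decreasing memberships; the membership breakpoints are arbitrary assumptions.

    ```python
    # Toy fuzzy classification of a signal trend (illustrative only; not PROTREN).
    import numpy as np

    def trapezoid(x, a, b, c, d):
        # Trapezoidal membership function with support [a, d] and plateau [b, c].
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def trend_memberships(signal, dt=1.0, band=0.05):
        # Feature: net least-squares change over the window, in units of the signal's std.
        t = np.arange(len(signal)) * dt
        slope = np.polyfit(t, signal, 1)[0]
        change = slope * (t[-1] - t[0]) / (np.std(signal) + 1e-12)
        return {
            "decreasing": trapezoid(change, -1e6, -1e6, -2 * band, -band),
            "steady":     trapezoid(change, -2 * band, -band, band, 2 * band),
            "increasing": trapezoid(change, band, 2 * band, 1e6, 1e6),
        }

    if __name__ == "__main__":
        x = np.linspace(0, 1, 50) + 0.02 * np.random.default_rng(1).standard_normal(50)
        print(max(trend_memberships(x).items(), key=lambda kv: kv[1]))  # ('increasing', 1.0)
    ```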

  12. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    OpenAIRE

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is able to overcome most of the difficulties associated with the solution of mixture design problems. The new methodology has been illustrated with the help of a case study involving the design of solve...

  13. R and D of a pixel sensor based on 0.15μm fully depleted SOI technology

    International Nuclear Information System (INIS)

    Tsuboyama, Toru; Arai, Yasuo; Fukuda, Koichi; Hara, Kazuhiko; Hayashi, Hirokazu; Hazumi, Masashi; Ida, Jiro; Ikeda, Hirokazu; Ikegami, Yoichi; Ishino, Hirokazu; Kawasaki, Takeo; Kohriki, Takashi; Komatsubara, Hirotaka; Martin, Elena; Miyake, Hideki; Mochizuki, Ai; Ohno, Morifumi; Saegusa, Yuuji; Tajima, Hiro; Tajima, Osamu

    2007-01-01

    Development of a monolithic pixel detector based on SOI (silicon on insulator) technology was started at KEK in 2005. The substrate of the SOI wafer is used as a radiation sensor. At the end of 2005, we submitted several test-structure group (TEG) chips for the 150 nm fully depleted CMOS process. The TEG designs and preliminary results are presented

  14. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    Directory of Open Access Journals (Sweden)

    Boris Alengordovich Korobitsyn

    2015-09-01

    A key indicator of the System of National Accounts of Russia at the regional scale is Gross Regional Product (GRP), which characterizes the value of goods and services produced in all sectors of the economy and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the greatest weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potential social costs such as poorer health due to exposure to occupational hazards. Several types of alternative approaches to measuring socio-economic progress are considered for six administrative units of the Ural Federal District for the period 2006-2014. The proposed alternatives to GRP as a measure of social progress focus on natural resource depletion, environmental externalities and some human development aspects. The most promising is the use of corrected macroeconomic indicators similar to the "genuine savings" compiled by the World Bank. Genuine savings are defined in this paper as net savings (gross savings minus consumption of fixed capital) minus the consumption of natural non-renewable resources and the monetary valuation of damages resulting from air pollution, water pollution and waste disposal. Two main groups of non-renewable resources are considered: energy resources (uranium ore, oil and natural gas) and mineral resources (iron ore, copper, and aluminum). In spite of various shortcomings, this indicator represents a considerable improvement over GRP. For example, while GRP demonstrates steady growth between 2006 and 2014 for the main Russian oil- and gas-producing regions - the Hanty-Mansi and Yamalo-Nenets Autonomous Okrugs - genuine savings for these regions decreased over the whole period. This means that their resource-based economy could not be considered as being on a sustainable path even in the framework of
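
    The genuine-savings definition quoted above reduces to a simple subtraction; the sketch below restates it with purely hypothetical numbers (no real regional data are used).

    ```python
    # Genuine savings as defined above, with hypothetical figures (billions of rubles).
    def genuine_savings(gross_savings, fixed_capital_consumption,
                        energy_depletion, mineral_depletion,
                        air_damage, water_damage, waste_damage):
        net_savings = gross_savings - fixed_capital_consumption
        resource_depletion = energy_depletion + mineral_depletion
        pollution_damage = air_damage + water_damage + waste_damage
        return net_savings - resource_depletion - pollution_damage

    print(genuine_savings(gross_savings=120.0, fixed_capital_consumption=45.0,
                          energy_depletion=60.0, mineral_depletion=10.0,
                          air_damage=4.0, water_damage=2.5, waste_damage=1.5))
    # -3.0: a region can show GRP growth while genuine savings signal an unsustainable path.
    ```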

  15. A Java-Web-Based-Learning Methodology, Case Study ...

    African Journals Online (AJOL)

    A Java-Web-Based-Learning Methodology, Case Study : Waterborne diseases. The recent advances in web technologies have opened new opportunities for computer-based-education. One can learn independently of time and place constraints, and have instantaneous access to relevant updated material at minimal cost.

  16. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  17. Improvements in EBR-2 core depletion calculations

    International Nuclear Information System (INIS)

    Finck, P.J.; Hill, R.N.; Sakamoto, S.

    1991-01-01

    The need for accurate core depletion calculations in Experimental Breeder Reactor No. 2 (EBR-2) is discussed. Because of the unique physics characteristics of EBR-2, it is difficult to obtain accurate and computationally efficient multigroup flux predictions. This paper describes the effect of various conventional and higher order schemes for group constant generation and for flux computations; results indicate that higher-order methods are required, particularly in the outer regions (i.e. the radial blanket). A methodology based on Nodal Equivalence Theory (N.E.T.) is developed which allows retention of the accuracy of a higher order solution with the computational efficiency of a few group nodal diffusion solution. The application of this methodology to three-dimensional EBR-2 flux predictions is demonstrated; this improved methodology allows accurate core depletion calculations at reasonable cost. 13 refs., 4 figs., 3 tabs

  18. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  19. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  20. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    Reyes-Jimenez, J.; Tsoukalas, L.H.

    1991-01-01

    This paper presents a knowledge-based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection (ICRP) Report No. 37 is employed within a knowledge-based framework for the purpose of planning maintenance activities while optimizing radiation protection. The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs

  1. Potential of decaying wood to restore root-available base cations in depleted forest soils

    Science.gov (United States)

    Walter C. Shortle; Kevin T. Smith; Jody Jellison; Jonathan S. Schilling

    2012-01-01

    The depletion of root-available Ca in northern forest soils exposed to decades of increased acid deposition adversely affects forest health and productivity. Laboratory studies indicated the potential of wood-decay fungi to restore lost Ca. This study presents changes in concentration of Ca, Mg, and K in sapwood of red spruce (Picea rubens Sarg.),...

  2. Estimation of residue depletion of cyadox and its marker residue in edible tissues of pigs using physiologically based pharmacokinetic modelling.

    Science.gov (United States)

    Huang, Lingli; Lin, Zhoumeng; Zhou, Xuan; Zhu, Meiling; Gehring, Ronette; Riviere, Jim E; Yuan, Zonghui

    2015-01-01

    Physiologically based pharmacokinetic (PBPK) models are powerful tools to predict tissue distribution and depletion of veterinary drugs in food animals. However, most models only simulate the pharmacokinetics of the parent drug without considering their metabolites. In this study, a PBPK model was developed to simultaneously describe the depletion in pigs of the food animal antimicrobial agent cyadox (CYA), and its marker residue 1,4-bisdesoxycyadox (BDCYA). The CYA and BDCYA sub-models included blood, liver, kidney, gastrointestinal tract, muscle, fat and other organ compartments. Extent of plasma-protein binding, renal clearance and tissue-plasma partition coefficients of BDCYA were measured experimentally. The model was calibrated with the reported pharmacokinetic and residue depletion data from pigs dosed by oral gavage with CYA for five consecutive days, and then extrapolated to exposure in feed for two months. The model was validated with 14 consecutive day feed administration data. This PBPK model accurately simulated CYA and BDCYA in four edible tissues at 24-120 h after both oral exposure and 2-month feed administration. There was only slight overestimation of CYA in muscle and BDCYA in kidney at earlier time points (6-12 h) when dosed in feed. Monte Carlo analysis revealed excellent agreement between the estimated concentration distributions and observed data. The present model could be used for tissue residue monitoring of CYA and BDCYA in food animals, and provides a foundation for developing PBPK models to predict residue depletion of both parent drugs and their metabolites in food animals.
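
    To make the compartmental structure concrete, the sketch below shows a generic flow-limited parent/metabolite PBPK fragment; the compartments, flows, partition coefficients and rate constants are illustrative assumptions, not the published cyadox/BDCYA model or its parameters.

    ```python
    # Generic flow-limited PBPK-style sketch (illustrative only; hypothetical parameters).
    import numpy as np
    from scipy.integrate import solve_ivp

    # Blood flows Q (L/h), volumes V (L), tissue:plasma partition coefficients P,
    # hepatic conversion of parent to metabolite k_met (1/h), renal clearance CL (L/h).
    p = dict(Q_liv=20.0, Q_kid=15.0, V_bld=3.0, V_liv=1.5, V_kid=0.3,
             P_liv=2.0, P_kid=1.5, k_met=0.5, CL_renal=2.0)

    def rhs(t, y):
        c_bld, c_liv, c_kid, m_bld = y  # parent in blood/liver/kidney; metabolite in blood
        venous_return = p["Q_liv"] * c_liv / p["P_liv"] + p["Q_kid"] * c_kid / p["P_kid"]
        dc_bld = (venous_return - (p["Q_liv"] + p["Q_kid"]) * c_bld) / p["V_bld"]
        dc_liv = (p["Q_liv"] * (c_bld - c_liv / p["P_liv"]) - p["k_met"] * c_liv * p["V_liv"]) / p["V_liv"]
        dc_kid = p["Q_kid"] * (c_bld - c_kid / p["P_kid"]) / p["V_kid"]
        dm_bld = (p["k_met"] * c_liv * p["V_liv"] - p["CL_renal"] * m_bld) / p["V_bld"]
        return [dc_bld, dc_liv, dc_kid, dm_bld]

    # Single bolus of parent into blood (arbitrary units); simulate 120 h of depletion.
    sol = solve_ivp(rhs, (0.0, 120.0), y0=[10.0, 0.0, 0.0, 0.0])
    print(sol.y[:, -1])  # parent and metabolite concentrations at 120 h
    ```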

  3. Introduction Rudolf P. Botha In "Methodological bases of a progress ...

    African Journals Online (AJOL)

    In "Methodological bases of a progressive mentalism" (Botha 1980; hence- ..... structures. But, as shown above, this choice of terminology is entirely irrelevant to an assessment of the force of the relevant arguments in MB. In conjunction with the .... to be real elements of processing models, grammatical rules cannot be.

  4. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    Science.gov (United States)

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  5. Modernising educational programmes in ICT based on the Tuning methodology

    Directory of Open Access Journals (Sweden)

    Alexander Bedny

    2014-07-01

    An analysis is presented of the experience of modernising undergraduate educational programmes using the TUNING methodology, based on the example of the area of studies "Fundamental computer science and information technology" (FCSIT) implemented at Lobachevsky State University of Nizhni Novgorod (Russia). The algorithm for reforming curricula in the subject area of information technology in accordance with the TUNING methodology is explained. A comparison is drawn between the existing Russian and European standards in the area of ICT education, including the European e-Competence Framework, with the focus on relevant competences. Some guidelines for the preparation of educational programmes are also provided.

  6. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition-based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is able to overcome most of the difficulties associated with the solution of mixture design problems. The new methodology has been illustrated with the help of a case study involving the design of solvent-antisolvent binary mixtures for crystallization of Ibuprofen.

  7. Deep-depletion physics-based analytical model for scanning capacitance microscopy carrier profile extraction

    International Nuclear Information System (INIS)

    Wong, K. M.; Chim, W. K.

    2007-01-01

    An approach for fast and accurate carrier profiling using deep-depletion analytical modeling of scanning capacitance microscopy (SCM) measurements is shown for an ultrashallow p-n junction with a junction depth of less than 30 nm and a profile steepness of about 3 nm per decade change in carrier concentration. In addition, the analytical model is also used to extract the SCM dopant profiles of three other p-n junction samples with different junction depths and profile steepnesses. The deep-depletion effect arises from rapid changes in the bias applied between the sample and probe tip during SCM measurements. The extracted carrier profile from the model agrees reasonably well with the more accurate carrier profile from inverse modeling and the dopant profile from secondary ion mass spectroscopy measurements

  8. Sensitivity theory for reactor burnup analysis based on depletion perturbation theory

    International Nuclear Information System (INIS)

    Yang, Wonsik.

    1989-01-01

    The large computational effort involved in the design and analysis of advanced reactor configurations motivated the development of Depletion Perturbation Theory (DPT) for general fuel cycle analysis. The work here focused on two important advances over current methods. First, the adjoint equations were developed using the efficient linear flux approximation to decouple the neutron/nuclide field equations. Second, DPT was extended to the constrained equilibrium cycle, which is important for the consistent comparison and evaluation of alternative reactor designs. Practical strategies were formulated for solving the resulting adjoint equations, and a computer code was developed for practical applications. In all cases analyzed, the sensitivity coefficients generated by DPT were in excellent agreement with the results of exact calculations. The work here indicates that, for a given core response, the sensitivity coefficients to all input parameters can be computed by DPT with a computational effort similar to that of a single forward depletion calculation
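
    For orientation, the generic first-order sensitivity coefficient that depletion perturbation theory evaluates through adjoint flux and nuclide fields can be written as below; this is the standard schematic form, not the specific equations derived in this work.

    ```latex
    % Relative sensitivity of a core response R to an input parameter \alpha,
    % evaluated with adjoint nuclide (\Gamma^{*}) and flux (\phi^{*}) fields:
    S_{\alpha} = \frac{\alpha}{R}\frac{\mathrm{d}R}{\mathrm{d}\alpha}
               \approx \frac{\alpha}{R}\left[ \frac{\partial R}{\partial \alpha}
               + \Big\langle \Gamma^{*}, \frac{\partial \mathcal{M}}{\partial \alpha} N \Big\rangle
               + \Big\langle \phi^{*}, \frac{\partial \mathcal{L}}{\partial \alpha} \phi \Big\rangle \right],
    % where N and \phi are the nuclide-field and flux solutions and \mathcal{M}, \mathcal{L}
    % are the depletion and transport operators; one adjoint solution per response R
    % yields S_{\alpha} for all input parameters \alpha.
    ```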

  9. Stimulated emission depletion-based raster image correlation spectroscopy reveals biomolecular dynamics in live cells.

    Science.gov (United States)

    Hedde, Per Niklas; Dörlich, René M; Blomley, Rosmarie; Gradl, Dietmar; Oppong, Emmanuel; Cato, Andrew C B; Nienhaus, G Ulrich

    2013-01-01

    Raster image correlation spectroscopy is a powerful tool to study fast molecular dynamics such as protein diffusion or receptor-ligand interactions inside living cells and tissues. By analysing spatio-temporal correlations of fluorescence intensity fluctuations from raster-scanned microscopy images, molecular motions can be revealed in a spatially resolved manner. Because of the diffraction-limited optical resolution, however, conventional raster image correlation spectroscopy can only distinguish larger regions of interest and requires low fluorophore concentrations in the nanomolar range. Here, to overcome these limitations, we combine raster image correlation spectroscopy with stimulated emission depletion microscopy. With imaging experiments on model membranes and live cells, we show that stimulated emission depletion-raster image correlation spectroscopy offers an enhanced multiplexing capability because of the enhanced spatial resolution as well as access to 10-100 times higher fluorophore concentrations.
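
    The core image-correlation step described above can be sketched as a normalized spatial autocorrelation of intensity fluctuations; the toy frame below is synthetic Poisson noise, and fitting the correlation to a diffusion model (and the STED resolution gain) is omitted.

    ```python
    # Spatial autocorrelation underlying RICS analysis (sketch; synthetic data only).
    import numpy as np

    def rics_autocorrelation(image):
        """G(xi, psi) = <dI(x, y) dI(x + xi, y + psi)> / <I>^2, computed via FFT."""
        img = np.asarray(image, dtype=float)
        d = img - img.mean()                      # intensity fluctuations
        f = np.fft.fft2(d)
        corr = np.fft.ifft2(f * np.conj(f)).real  # circular autocorrelation of fluctuations
        corr /= img.mean() ** 2 * img.size        # normalize by mean intensity squared
        return np.fft.fftshift(corr)              # zero lag at the array centre

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        frame = rng.poisson(lam=5.0, size=(128, 128)).astype(float)  # toy raster frame
        g = rics_autocorrelation(frame)
        print(g.shape, g[64, 64])  # zero-lag amplitude ~ 1/mean for pure Poisson noise
    ```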

  10. A strategy for selective detection based on interferent depleting and redox cycling using the plane-recessed microdisk array electrodes

    International Nuclear Information System (INIS)

    Zhu Feng; Yan Jiawei; Lu Miao; Zhou Yongliang; Yang Yang; Mao Bingwei

    2011-01-01

    Highlights: → A novel strategy based on a combination of interferent depletion and redox cycling is proposed for the plane-recessed microdisk array electrodes. → The strategy breaks the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction. → The electrodes enhance the current signal by redox cycling. → The electrodes can work regardless of the reversibility of the interfering species. - Abstract: The fabrication, characterization and application of the plane-recessed microdisk array electrodes for selective detection are demonstrated. The electrodes, fabricated by lithographic microfabrication technology, are composed of a planar film electrode and a 32 × 32 recessed microdisk array electrode. Different from the commonly used redox cycling operating mode for array configurations such as interdigitated array electrodes, a novel strategy based on a combination of interferent depletion and redox cycling is proposed for electrodes with an appropriate configuration. The planar film electrode (the plane electrode) is used to deplete the interferent in the diffusion layer. The recessed microdisk array electrode (the microdisk array), located within the diffusion layer of the plane electrode, works to detect the target analyte in the interferent-depleted diffusion layer. In addition, the microdisk array overcomes the disadvantage of the low current signal of a single microelectrode. Moreover, the current signal of a target analyte that undergoes reversible electron transfer can be enhanced by redox cycling between the plane electrode and the microdisk array. Based on the above working principle, the plane-recessed microdisk array electrodes remove the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction, which is a limitation of the single redox cycling operating mode. The

  11. Guidelines for reporting evaluations based on observational methodology.

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  12. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at identifying sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it states a unified model to reveal knowledge-sharing patterns and to compare results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSD) provide documents and tools at their websites. Instead, this proposal provides a guide to modelling the inferences gathered from data processing, revealing links between the sources and recipients of the knowledge being transferred that the recipient identifies as its main source for new knowledge creation. Some national statistics departments set as the objective of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments, and from this characterization scholars conduct different studies. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiring into the structure that emerges from taking ideas from other organizations to incept innovations. These two sets of data are the actors of a two-mode network: a link between two actors (network nodes) connects a source of ideas - an organization, or an event organized by an organization, that "provides" ideas to a group of firms - with the firm acting as the destination. The resulting design satisfies the objective of being a methodological model to identify sources of knowledge transfer effectively used in innovation.
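
    A minimal sketch of a two-mode (firm-source) network and its projection onto firms is shown below, using the networkx library with invented data; the survey variables and semantic metrics discussed above are not reproduced.

    ```python
    # Toy two-mode firm-source network and its one-mode projection (hypothetical data).
    import networkx as nx
    from networkx.algorithms import bipartite

    firms = ["firm_A", "firm_B", "firm_C"]
    sources = ["university_X", "supplier_Y", "trade_fair_Z"]

    B = nx.Graph()
    B.add_nodes_from(firms, bipartite=0)
    B.add_nodes_from(sources, bipartite=1)
    # An edge means the firm reported the source as a main origin of ideas for innovation.
    B.add_edges_from([("firm_A", "university_X"), ("firm_A", "supplier_Y"),
                      ("firm_B", "university_X"), ("firm_B", "supplier_Y"),
                      ("firm_C", "trade_fair_Z")])

    firm_net = bipartite.weighted_projected_graph(B, firms)  # firms linked via shared sources
    print(firm_net.edges(data=True))       # e.g. firm_A-firm_B share two knowledge sources
    print(nx.degree_centrality(B))         # crude measure of how central each source is
    ```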

  13. New event-based methodology to improve photolithography productivity

    Science.gov (United States)

    Chang, Simon; Boehm, Mark A.

    2000-10-01

    As photolithography processes continue to increase in complexity, it has become more challenging to increase throughput, lower the rework rate and improve tool utilization while maintaining the anticipated productivity. A manufacturing automation system deployed at a production site provides a means to monitor the photolithography process, operational efficiency and equipment health. However, analyzing the large amount of data and determining the exact source of a productivity hitter is a time-consuming and difficult process. Based on these needs, a new methodology is proposed in this paper to quantify productivity hitters by process, operation, and equipment. A tool is developed from this methodology to track every tool operation time and operation step based on information from tool event logs (event-based) over the network. By applying the tool, easy-to-view information can be quickly derived from the event log data for rapid decision making, such as lot disposition, recipe optimization, and equipment function checks. The event-based methodology is introduced in this paper. In addition, several examples of using this tool in an ASIC-type production environment are studied. The effects of track delay, hidden overhead time, poor lot queuing, excessive error-assist time, etc. are studied and quantified. By applying the tool, the time taken to accurately locate the root cause of a productivity hitter is significantly reduced. Based on the analysis, guidelines are given to optimize tool utilization and raw throughput, and productivity is improved accordingly. Future work, including full integration into in-house manufacturing automation, is discussed.

  14. Risk-based Methodology for Validation of Pharmaceutical Batch Processes.

    Science.gov (United States)

    Wiles, Frederick

    2013-01-01

    or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.

  15. MS-based analytical methodologies to characterize genetically modified crops.

    Science.gov (United States)

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  16. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
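
    A deliberately simplified Monte Carlo sketch of the timeline-distribution idea (the actual methodology uses Bayesian updating with expert judgment and small-sample corrections): each task time along a hypothetical adversary path is drawn from an assumed lognormal distribution, and the resulting distribution of total delay, rather than a single worst-case number, is summarized.

```python
import math
import random
import statistics

# Hypothetical adversary path: (median delay in seconds, lognormal dispersion) per task.
TASKS = [(60, 0.4), (180, 0.5), (45, 0.3), (300, 0.6)]

def sample_total_delay():
    """Draw one total path delay by sampling each task time independently."""
    return sum(random.lognormvariate(math.log(median), sigma) for median, sigma in TASKS)

random.seed(1)
samples = sorted(sample_total_delay() for _ in range(10_000))
p10 = samples[int(0.10 * len(samples))]    # a conservative (low-side) delay estimate
print(f"median total delay ~{statistics.median(samples):.0f} s, "
      f"10th percentile ~{p10:.0f} s")
```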

  17. An integrated science-based methodology to assess potential ...

    Science.gov (United States)

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished, knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture “what is known” and to outline the gaps in knowledge from an ENM risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal of building a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodology

  18. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used in order to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.

  19. Memristive device based on a depletion-type SONOS field effect transistor

    Science.gov (United States)

    Himmel, N.; Ziegler, M.; Mähne, H.; Thiem, S.; Winterfeld, H.; Kohlstedt, H.

    2017-06-01

    State-of-the-art SONOS (silicon-oxide-nitride-oxide-polysilicon) field effect transistors were operated in a memristive switching mode. The circuit design is a variation of the MemFlash concept, and the particular properties of depletion-type SONOS transistors were taken into account. The transistor was externally wired with a resistively shunted pn-diode. Experimental current-voltage curves show analog bipolar switching characteristics within a bias voltage range of ±10 V, exhibiting a pronounced asymmetric hysteresis loop. The experimental data are confirmed by SPICE simulations. The underlying memristive mechanism is purely electronic, which eliminates an initial forming step of the as-fabricated cells. This fact, together with reasonable design flexibility, in particular to adjust the maximum R_ON/R_OFF ratio, makes these cells attractive for neuromorphic applications. The relatively large set and reset voltages around ±10 V might be decreased by using thinner gate oxides. The all-electric operation principle, in combination with an established silicon manufacturing process of SONOS devices at the Semiconductor Foundry X-FAB, promises reliable operation, low parameter spread and high integration density.

  20. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    International Nuclear Information System (INIS)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Viallet, M.; Folini, D.; Popov, M. V.; Walder, R.

    2017-01-01

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ∼50 Myr to ∼4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  1. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Viallet, M. [Astrophysics Group, University of Exeter, Exeter EX4 4QL (United Kingdom); Folini, D.; Popov, M. V.; Walder, R., E-mail: i.baraffe@ex.ac.uk [Ecole Normale Supérieure de Lyon, CRAL, UMR CNRS 5574, F-69364 Lyon Cedex 07 (France)

    2017-08-10

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ∼50 Myr to ∼4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  2. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    Science.gov (United States)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Folini, D.; Popov, M. V.; Walder, R.; Viallet, M.

    2017-08-01

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ˜50 Myr to ˜4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  3. Methodology of citrate-based biomaterial development and application

    Science.gov (United States)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development that is able to address a broad spectrum of requirements would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  4. Bayesian-network-based fault diagnosis methodology of subsea jumper

    Science.gov (United States)

    Cai, Baoping; Liu, Yonghong; Huang, Lei; Hu, Song; Xue, Haitao; Wang, Jiaxing

    2017-10-01

    The paper proposes a Bayesian-network-based real-time fault diagnosis methodology for an M-shaped subsea jumper. Finite element models of a typical M-shaped subsea jumper system are built to obtain the data for diagnosis. Netica, a Bayesian-network-based software package, is used to construct diagnosis models of the jumper under the two main loading conditions, falling objects and seabed movement. The results show that the accuracy of the falling-objects diagnosis model with four faults is 100%, and the accuracy of the seabed-movement diagnosis model with two faults is also 100%. When the two models are combined into one, the accuracy of the combined model is 96.59%. The effectiveness of the proposed method is validated.
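
    A toy Bayes-rule sketch of the diagnostic idea only (the study itself builds its networks in Netica from finite element results); the fault names, symptoms and probabilities below are all invented for illustration.

```python
# Two hypothetical jumper faults with assumed priors and symptom likelihoods.
PRIOR = {"dent": 0.5, "crack": 0.5}
P_SYMPTOM = {  # P(symptom present | fault)
    "dent":  {"stress_spike": 0.9, "displacement": 0.3},
    "crack": {"stress_spike": 0.4, "displacement": 0.8},
}

def posterior(observed):
    """Posterior over faults given a dict of observed symptoms (True/False)."""
    scores = {}
    for fault, prior in PRIOR.items():
        p = prior
        for symptom, present in observed.items():
            likelihood = P_SYMPTOM[fault][symptom]
            p *= likelihood if present else (1.0 - likelihood)
        scores[fault] = p
    total = sum(scores.values())
    return {fault: p / total for fault, p in scores.items()}

print(posterior({"stress_spike": True, "displacement": True}))
```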

  5. Methodology for cloud-based design of robots

    Science.gov (United States)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

    This paper presents some important results from the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM software (Product Lifecycle Management) and structured as a complex mechatronics product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the workspace of a milling machine for student research.

  6. Depletion Calculations Based on Perturbations. Application to the Study of a Rep-Like Assembly at Beginning of Cycle with TRIPOLI-4®.

    Science.gov (United States)

    Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh

    2014-06-01

    For several years, Monte Carlo burnup/depletion codes have appeared which couple Monte Carlo codes, simulating the neutron transport, to deterministic methods, which handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in such a way allows fine 3-dimensional effects to be tracked and gets rid of the multi-group hypotheses made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid the repetitive and time-expensive Monte Carlo simulations and to replace them by perturbation calculations: indeed, the different burnup steps may be seen as perturbations of the isotopic concentration of an initial Monte Carlo simulation. We first present this method and provide details on the perturbative technique used, namely correlated sampling. We then discuss the implementation of this method in the TRIPOLI-4® code, as well as the precise calculation scheme used to bring an important speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a REP-like assembly, studied at the beginning of its cycle. After having validated the method against a reference calculation, we show that it can speed up standard Monte Carlo depletion codes by nearly an order of magnitude.

  7. Methodology of Maqamat Hamadani and Hariri Based on Buseman’s statistical methodology

    Directory of Open Access Journals (Sweden)

    Hamed Sedghi

    2016-02-01

    Full Text Available Abstract: Stylistics can be defined as the analysis and interpretation of expression and of different forms of speech, based on linguistic elements. The German theorist Buseman devised his theses on statistical style based on the ratio of verbs to adjectives, generalizing over a variety of literary and non-literary genres in German language and literature. According to Buseman, increasing the number of verbs in verses makes the text closer to literary style, whereas increasing the number of adjectives makes the text closer to scientific or subjective style. The most important achievements of Buseman's statistical methodology can be considered to be: (a) comparison of authors' styles, literary periods and genres; (b) study of the language system and variety of words; and (c) classification and grading of differences or similarities between works, periods and genres. The purpose of this study: stylistic analysis of the Maqamat of Hamadani and al-Hariri based on the statistical model of Buseman. The questions proposed in this study: (a) How effective is the use of statistical methods in identifying and analyzing a variety of literature, including Maqamat? (b) How effective is the quantitative scope of verbs and adjectives, as two key parameters, in determining the style of literary works? And (c) which element of fiction is most effective in enriching the literary approach? The specific research method is statistical-analytical: we initially identified and classified the number of verbs and adjectives in the fifty-one Maqamehs of Hamadani and the fifty Maqamehs of Hariri; the scope of verb and adjective usage is then shown in the form of tables and graphs. After that, we assess the style of the literary work based on the use of verbs and adjectives. The research findings show that all Hamadani and Hariri Maqamat quantitatively benefit from a highly active approach in the use of the verb. At 46 Maqameh Hamadani and 48 Maqameh Hariri the
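
    As a toy illustration of Buseman's verb-to-adjective ratio (the study itself works on the Arabic text of the Maqamat, which would require an Arabic part-of-speech tagger), the sketch below computes the ratio from an already tagged token list; the tokens are invented.

```python
# Pre-tagged tokens (word, part of speech); a real analysis would use a POS tagger.
tagged = [("went", "VERB"), ("old", "ADJ"), ("spoke", "VERB"), ("eloquent", "ADJ"),
          ("wandered", "VERB"), ("begged", "VERB")]

verbs = sum(1 for _, tag in tagged if tag == "VERB")
adjectives = sum(1 for _, tag in tagged if tag == "ADJ")
ratio = verbs / adjectives if adjectives else float("inf")

# In Buseman's scheme a higher verb/adjective ratio indicates a more active, literary style.
print(f"verbs={verbs}, adjectives={adjectives}, verb/adjective ratio={ratio:.2f}")
```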

  8. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Directory of Open Access Journals (Sweden)

    Yaacob Sazali

    2005-01-01

    Full Text Available We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI has a single board processing system (SBPS), a digital video camera mounted headgear, and a pair of stereo earphones. The captured image from the camera is processed by the SBPS to generate a specially structured stereo sound suitable for vision impaired people in understanding the presence of objects/obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in the processing of image. A concept of object preference is included in the image processing scheme and this concept is realized using a fuzzy-rule base. The blind users are trained with the stereo sound produced by NAVI for achieving a collision-free autonomous navigation.

  9. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Science.gov (United States)

    Nagarajan, R.; Sainarayanan, G.; Yaacob, Sazali; Porle, Rosalyn R.

    2005-12-01

    We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI has a single board processing system (SBPS), a digital video camera mounted headgear, and a pair of stereo earphones. The captured image from the camera is processed by the SBPS to generate a specially structured stereo sound suitable for vision impaired people in understanding the presence of objects/obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in the processing of image. A concept of object preference is included in the image processing scheme and this concept is realized using a fuzzy-rule base. The blind users are trained with the stereo sound produced by NAVI for achieving a collision-free autonomous navigation.

  10. Effect of depletion layer width on electrical properties of semiconductive thin film gas sensor: a numerical study based on the gradient-distributed oxygen vacancy model

    Science.gov (United States)

    Liu, Jianqiao; Lu, Yiting; Cui, Xiao; Jin, Guohua; Zhai, Zhaoxia

    2016-03-01

    The effects of depletion layer width on semiconductor gas sensors were investigated based on the gradient-distributed oxygen vacancy model, which provides numerical descriptions of the sensor properties. The potential barrier height, sensor resistance, and response to target gases were simulated to reveal their dependences on the depletion layer width. According to the simulation, it is possible to improve the sensor response by enlarging the width of the depletion layer without changing the resistance of the gas sensor under special circumstances. The different behaviors of resistance and response suggest that the design and fabrication of gas sensing devices could be economized. The simulation results were validated by the experimental performance of SnO2 thin film gas sensors, which were prepared by the sol-gel technique. The dependences of the sensor properties on depletion layer width were observed to be in agreement with the simulations.

  11. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    Full Text Available The author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process on any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. Today, national policy in the field of realizing the concept of community development does not take most theoretical works into account, which testifies that the mechanism for its effective adjustment has not yet been created in our country. In this connection, the author stresses the necessity of using effective approaches to the government control of community development in the realities of modern Ukraine. As the subject of research, the author chose the analysis of the process of community development and the methodological bases for choosing variants for managing this process. The system approach was chosen as the research methodology. The aim: analysis of the theoretical bases and development of new approaches to the government administration of community development. The author divides the process of community development into its social, economic and ecological components. It follows that the objective necessity of developing new conceptual approaches to the elaboration of tools for adjusting community development must be taken into account. To address this task, the author suggests using the category of "dynamics". The author analyses different interpretations of the term "dynamics" and offers his own interpretation in the context of community development. Our research confirms that it is methodologically possible to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character. The author's research confirms that it is methodologically

  12. Depleted fully monolithic CMOS pixel detectors using a column based readout architecture for the ATLAS Inner Tracker upgrade

    OpenAIRE

    Wang, T.; Barbero, M.; Berdalovic, I.; Bespin, C.; Bhat, S.; Breugnon, P.; Caicedo, I.; Cardella, R.; Chen, Z.; Degerli, Y.; Egidos, N.; Godiot, S.; Guilloux, F.; Hemperek, T.; Hirono, T.

    2017-01-01

    Depleted monolithic active pixel sensors (DMAPS), which exploit high voltage and/or high resistivity add-ons of modern CMOS technologies to achieve substantial depletion in the sensing volume, have proven to have high radiation tolerance towards the requirements of ATLAS in the high-luminosity LHC era. Depleted fully monolithic CMOS pixels with fast readout architectures are currently being developed as promising candidates for the outer pixel layers of the future ATLAS Inner Tracker, which w...

  13. Methodology of Maqamat Hamadani and Hariri Based on Buseman’s statistical methodology

    Directory of Open Access Journals (Sweden)

    Hamed Sedghi

    2016-01-01

    Full Text Available Abstract: Stylistics can be defined as the analysis and interpretation of expression and of different forms of speech, based on linguistic elements. The German theorist Buseman devised his theses on statistical style based on the ratio of verbs to adjectives, generalizing over a variety of literary and non-literary genres in German language and literature. According to Buseman, increasing the number of verbs in verses makes the text closer to literary style, whereas increasing the number of adjectives makes the text closer to scientific or subjective style. The most important achievements of Buseman's statistical methodology can be considered to be: (a) comparison of authors' styles, literary periods and genres; (b) study of the language system and variety of words; and (c) classification and grading of differences or similarities between works, periods and genres. The purpose of this study: stylistic analysis of the Maqamat of Hamadani and al-Hariri based on the statistical model of Buseman. The questions proposed in this study: (a) How effective is the use of statistical methods in identifying and analyzing a variety of literature, including Maqamat? (b) How effective is the quantitative scope of verbs and adjectives, as two key parameters, in determining the style of literary works? And (c) which element of fiction is most effective in enriching the literary approach? The specific research method is statistical-analytical: we initially identified and classified the number of verbs and adjectives in the fifty-one Maqamehs of Hamadani and the fifty Maqamehs of Hariri; the scope of verb and adjective usage is then shown in the form of tables and graphs. After that, we assess the style of the literary work based on the use of verbs and adjectives. The research findings show that all Hamadani and Hariri Maqamat quantitatively benefit from a highly active approach in the use of the verb. At 46 Maqameh Hamadani and

  14. A methodology for physically based rockfall hazard assessment

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Full Text Available Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at a regional and local scale, both along linear features and within exposed areas. An objective approach based on three-dimensional matrices providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrices has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to make the assessment criteria as objective as possible.
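
    A minimal raster-style sketch of the kind of combination described above, assuming hypothetical per-cell frequency, velocity and height grids from a 3-D runout model; the class thresholds and the product-based index are illustrative stand-ins for the paper's three-dimensional hazard matrix.

```python
import numpy as np

# Hypothetical per-cell results of a rockfall runout model on a tiny 2x2 grid.
frequency = np.array([[0, 12], [3, 40]])          # block passages per cell
velocity  = np.array([[0.0, 8.0], [2.0, 15.0]])   # m/s
height    = np.array([[0.0, 1.5], [0.5, 3.0]])    # m

def classify(values, thresholds):
    """Map each cell to class 1..len(thresholds)+1 using increasing thresholds."""
    return np.digitize(values, thresholds) + 1

freq_cls = classify(frequency, [1, 20])     # 1 low, 2 medium, 3 high
vel_cls  = classify(velocity,  [5.0, 10.0])
hgt_cls  = classify(height,    [1.0, 2.0])

# One possible positional hazard index: the product of the three class numbers (1..27).
hazard_index = freq_cls * vel_cls * hgt_cls
print(hazard_index)
```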

  15. The effect of self-depleting in UV photodetector based on simultaneously fabricated TiO2/NiO pn heterojunction and Ni/Au composite electrode

    Science.gov (United States)

    Zhang, Dezhong; Liu, Chunyu; Xu, Ruiliang; Yin, Bo; Chen, Yu; Zhang, Xindong; Gao, Fengli; Ruan, Shengping

    2017-09-01

    A novel dark self-depleting ultraviolet (UV) photodetector based on a TiO2/NiO pn heterojunction was demonstrated and exhibited lower dark current (I_dark) and noise. Both the NiO layer and Ni/Au composite electrode were fabricated by a smart, one-step oxidation method which was first employed in the fabrication of the UV photodetector. In dark, the depleted pn heterojunction structure effectively reduced the majority carrier density in TiO2/NiO films, demonstrating a high resistance state and contributing to a lower I_dark of 0.033 nA, two orders of magnitude lower than that of the single-material devices. Under UV illumination, the interface self-depleting effect arising from the dissociation and accumulation of photogenerated carriers was eliminated, ensuring loss-free responsivity (R) and a remarkable specific detectivity (D*) of 1.56 × 10^14 cm Hz^(1/2) W^(-1) for the optimal device. The device with the structure of ITO/TiO2/NiO/Au was measured to prove the mechanisms of interface self-depleting in dark and elimination of the depletion layer under UV illumination. Meanwhile, shortened decay time was achieved in the pn heterojunction UV photodetector. This suggests that the self-depleting devices possess the potential to further enhance photodetection performance.

  16. A facile Arsenazo III based assay for monitoring rare earth element depletion from cultivation media of methanotrophic and methylotrophic bacteria.

    Science.gov (United States)

    Hogendoorn, Carmen; Roszczenko-Jasińska, Paula; Martinez-Gomez, N Cecilia; de Graaff, Johann; Grassl, Patrick; Pol, Arjan; Op den Camp, Huub J M; Daumann, Lena J

    2018-02-16

    Recently, methanotrophic and methylotrophic bacteria were found to utilize rare earth elements (REE). To monitor the REE content in culture media of these bacteria we have developed a rapid screening method using the Arsenazo III (AS III) dye for spectrophotometric REE detection in the low μM (0.1-10 μM) range. We designed this assay to follow La(III) and Eu(III) depletion from the culture medium by the acidophilic verrucomicrobial methanotroph Methylacidiphilum fumariolicum SolV. The assay can also be modified to screen the uptake of other REE such as Pr(III), or to monitor the depletion of La(III) from growth media in neutrophilic methylotrophs such as Methylobacterium extorquens AM1. The AS III assay presents a convenient and fast detection method for REE levels in culture media and is a sensitive alternative to inductively coupled plasma mass spectrometry (ICP-MS) or atomic absorption spectroscopy (AAS). Importance: REE-dependent bacterial metabolism is a quickly emerging field, and while the importance of REE for both methanotrophic and methylotrophic bacteria is now firmly established, many important questions, such as how these insoluble elements are taken up into cells, are still unanswered. Here, an Arsenazo III dye based assay has been developed for fast, specific and sensitive determination of REE content in different culture media. This assay presents a useful tool for optimizing cultivation protocols as well as for routine REE monitoring during bacterial growth without the need for specialized analytical instrumentation. Furthermore, this assay has the potential to promote the discovery of other REE-dependent microorganisms and can help to elucidate the mechanisms for acquisition of REE by methanotrophic and methylotrophic bacteria. Copyright © 2018 Hogendoorn et al.
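
    A minimal sketch of how such a spectrophotometric assay is typically quantified, with invented numbers: a linear calibration curve is fitted to absorbance readings of REE standards and then inverted to estimate the remaining REE concentration in medium samples taken before and after growth.

```python
import numpy as np

# Hypothetical calibration standards: La(III) concentration (uM) vs. absorbance.
conc_std = np.array([0.0, 1.0, 2.5, 5.0, 10.0])
abs_std  = np.array([0.02, 0.10, 0.23, 0.44, 0.86])

slope, intercept = np.polyfit(conc_std, abs_std, 1)   # simple linear calibration fit

def concentration(absorbance):
    """Invert the calibration line to estimate the REE concentration of a sample."""
    return (absorbance - intercept) / slope

# Depletion is read as the drop between samples taken before and after bacterial growth.
print(f"before: {concentration(0.78):.2f} uM, after: {concentration(0.12):.2f} uM")
```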

  17. Study on adaptive BTT reentry speed depletion guidance law based on BP neural network

    Science.gov (United States)

    Zheng, Zongzhun; Wang, Yongji; Wu, Hao

    2007-11-01

    Reentry guidance is one of the key technologies in the hypersonic vehicle research field. In addition to the constraints on its final position coordinates, the vehicle must also impact the target from a specified direction with high precision. The adaptability of the guidance law is therefore critical for properly controlling the velocity of the hypersonic vehicle and the firing accuracy in the different environments of a large airspace. In this paper, a new adaptive guidance strategy based on a Back Propagation (BP) neural network for the reentry mission of a generic hypersonic vehicle is presented. Relying on the good self-learning ability of the BP neural network, the guidance law considers the influence of large aerodynamic modeling errors, structural errors and other initial disturbances on the flight capability of the vehicle. Consequently, terminal position accuracy and velocity are guaranteed while many constraints are satisfied. Numerical simulation results clearly demonstrate that the proposed reentry guidance law based on a BP neural network is rational and effective.

  18. Consensus-based methodology for detection communities in multilayered networks

    Science.gov (United States)

    Karimi-Majd, Amir-Mohsen; Fathian, Mohammad; Makrehchi, Masoud

    2018-03-01

    Finding groups of network users who are densely related to each other has emerged as an interesting problem in the area of social network analysis. These groups, or so-called communities, are hidden behind the behavior of users. Most studies assume that such behavior can be understood by focusing on user interfaces, their behavioral attributes or a combination of these network layers (i.e., interfaces with their attributes). They also assume that all network layers refer to the same behavior. However, in real-life networks, users' behavior in one layer may differ from their behavior in another one. In order to cope with these issues, this article proposes a consensus-based community detection approach (CBC). CBC finds communities among nodes at each layer, in parallel. Then, the results of the layers are aggregated using a consensus clustering method. This means that different behaviors can be detected and used in the analysis. As another significant advantage, the methodology is able to handle missing values. Three experiments on real-life and computer-generated datasets have been conducted in order to evaluate the performance of CBC. The results indicate the superiority and stability of CBC in comparison to other approaches.
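
    A minimal sketch of the consensus step, assuming community labels have already been obtained independently on each layer by some single-layer algorithm: a co-association matrix records the fraction of layers grouping each pair of users together, and a majority threshold yields a crude consensus grouping. The labels below are invented.

```python
import numpy as np

# Hypothetical community labels for 5 users on 3 network layers.
layer_labels = np.array([
    [0, 0, 1, 1, 1],   # layer 1 (e.g., interactions)
    [0, 0, 1, 1, 0],   # layer 2 (e.g., attribute similarity)
    [0, 0, 0, 1, 1],   # layer 3
])

n_layers, n_users = layer_labels.shape
co_assoc = np.zeros((n_users, n_users))
for labels in layer_labels:
    co_assoc += (labels[:, None] == labels[None, :]).astype(float)
co_assoc /= n_layers   # fraction of layers that agree on each pair of users

# Crude consensus: pairs grouped together in a majority of layers stay together.
print((co_assoc >= 0.5).astype(int))
```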

  19. Development of risk-based decision methodology for facility design.

    Science.gov (United States)

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...

  20. Quantifying urban/industrial emissions of greenhouse and ozone-depleting gases based on atmospheric observations

    Science.gov (United States)

    Barnes, Diana Hart

    2000-11-01

    Background and pollution trends and cycles of fourteen trace gases over the Northeastern U.S. are inferred from continuous atmospheric observations at the Harvard Forest research station located in Petersham, Massachusetts. This site receives background 'clean' air from the northwest (Canada) and 'dirty' polluted air from the southwest (New York City-Washington, D.C. corridor). Mixing ratios of gases regulated by the Montreal Protocol or other policies (CO, PCE, CFC11, CFC12, CFC113, CH3CCl3, CCl4, and Halon-1211) and of those not subject to restrictions (H2, CH4, CHCl3, TCE, N2O, and SF6) were measured over the three-year period, 1996 to 1998, every 24 minutes by a fully automated gas chromatographic instrument with electron capture detectors. Evidence for polar vortex venting is found consistently in the month of June of the background seasonal cycles. The ratios of CO and PCE enhancements borne on southwesterly winds are in excellent agreement with county-level EPA and sales-based inventories for the New York City-Washington, D.C. region. From this firm footing, we use CO and PCE as reference compounds to determine the urban/industrial source strengths of the other species. A broad historical and geographic study of emissions reveals that the international treaty has by and large been a success. Locally, despite the passing of the 1996 Montreal Protocol ban, only emissions of CFC12 and CH3CCl3 are abating. Though source strengths are waning, the sources are not spent and continued releases to the atmosphere may be expected for some years to come. For CH3CCl3, whose rate of decline is central to our understanding of atmospheric processes, we estimate that absolute concentrations may persist until around the year 2010. The long-term high frequency time series of hydrogen provided here represents the first such data set of its kind. The H2 diurnal cycle is established and explained in terms of its sources and sinks. The ratio of H2 to CO in pollution plumes is
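
    A minimal sketch of the reference-compound scaling described above, with invented numbers: the regional source strength of a species is estimated from its pollution enhancement relative to a reference gas (here CO) whose emission inventory is assumed known, converting the mole ratio to a mass ratio.

```python
# Assumed regional CO inventory and molar masses (illustrative values only).
CO_EMISSION_KT_PER_YR = 5000.0
M_CO, M_SF6 = 28.0, 146.0               # g/mol

# Enhancements above background observed in polluted (southwesterly) air.
delta_co_ppb  = 120.0
delta_sf6_ppt = 0.45

ratio_mol = (delta_sf6_ppt * 1e-12) / (delta_co_ppb * 1e-9)   # mol SF6 per mol CO
sf6_emission_kt = CO_EMISSION_KT_PER_YR * ratio_mol * (M_SF6 / M_CO)
print(f"inferred SF6 source strength ~{sf6_emission_kt * 1e3:.0f} t/yr")
```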

  1. Cathepsin Activity-Based Probes and Inhibitor for Preclinical Atherosclerosis Imaging and Macrophage Depletion.

    Directory of Open Access Journals (Sweden)

    Ihab Abd-Elrahman

    Full Text Available Cardiovascular disease is the leading cause of death worldwide, mainly due to an increasing prevalence of atherosclerosis characterized by inflammatory plaques. Plaques with high levels of macrophage infiltration are considered "vulnerable" while those that do not have significant inflammation are considered stable; cathepsin protease activity is highly elevated in macrophages of vulnerable plaques and contributes to plaque instability. Establishing novel tools for non-invasive molecular imaging of macrophages in plaques could aid in preclinical studies and evaluation of therapeutics. Furthermore, compounds that reduce the macrophage content within plaques should ultimately impact care for this disease.We have applied quenched fluorescent cathepsin activity-based probes (ABPs to a murine atherosclerosis model and evaluated their use for in vivo imaging using fluorescent molecular tomography (FMT, as well as ex vivo fluorescence imaging and fluorescent microscopy. Additionally, freshly dissected human carotid plaques were treated with our potent cathepsin inhibitor and macrophage apoptosis was evaluated by fluorescent microscopy.We demonstrate that our ABPs accurately detect murine atherosclerotic plaques non-invasively, identifying cathepsin activity within plaque macrophages. In addition, our cathepsin inhibitor selectively induced cell apoptosis of 55%±10% of the macrophage within excised human atherosclerotic plaques.Cathepsin ABPs present a rapid diagnostic tool for macrophage detection in atherosclerotic plaque. Our inhibitor confirms cathepsin-targeting as a promising approach to treat atherosclerotic plaque inflammation.

  2. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    OpenAIRE

    DR. NACHAMAI. M; M. SENTHIL VADIVU; VINITA TAPASKAR

    2011-01-01

    Software engineering provides the procedures and practices to be followed in software development and acts as a backbone for computer science and engineering techniques. This paper deals with current trends in software engineering methodologies: the Agile and agent-oriented software development processes. Agile methodology aims to meet the needs of the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. ...

  3. A Dwarf-based Scalable Big Data Benchmarking Methodology

    OpenAIRE

    Gao, Wanling; Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zheng, Daoyi; Jia, Zhen; Xie, Biwei; Zheng, Chen; Yang, Qiang; Wang, Haibin

    2017-01-01

    Different from the traditional benchmarking methodology that creates a new benchmark or proxy for every possible workload, this paper presents a scalable big data benchmarking methodology. Among a wide variety of big data analytics workloads, we identify eight big data dwarfs, each of which captures the common requirements of each class of unit of computation while being reasonably divorced from individual implementations. We implement the eight dwarfs on different software stacks, e.g., Open...

  4. Investigating Land Movements of Saline Soils by SAR Based Methodologies

    Science.gov (United States)

    Magagnini, L.; Teatini, P.; Strozzi, T.; Ulazzi, E.; Simeoni, U.

    2011-12-01

    Solonchaks, more commonly known as saline soils, are a soil variety confined to the arid and semi-arid climatic zones. These flat areas are characterized by a shallow water table and an evapotranspiration considerably greater than precipitation. Salts dissolved in the soil moisture remain behind after evaporation/transpiration of the water and accumulate at the soil surface. Detecting ground displacement by SAR-based methodologies is challenging in these regions. On one hand, solonchaks have a stable soil structure because a salt crust is well developed, and they are usually uncultivated. On the other hand, earth depressions are usually waterlogged due to groundwater capillary rise and hygroscopic water absorbed by salt particles. Moreover, sparse vegetation is present, even if limited to halophytic shrubs. Although poorly developed, the assessment of land subsidence can be of interest when, as on the northern coast of the Caspian Sea, Kazakhstan, large exploitation of subsurface natural resources is planned. Due to the lack of traditional monitoring surveys, SAR-based interferometry represents the only methodology that can be used to investigate the recent/present ground displacements of this large region. With a temperature ranging from -25 to +42°C and precipitation of less than 200 mm/yr, large depressions with solonchaks in them characterize the whole area. The presence of salt-affected soils is in close relation to the oscillations of the sea level and the massive presence of salt domes. Due to the extreme flatness of the coastland, on the order of 0.001%, even a small land sinking produces a significant inland encroachment of the sea. Small BAseline Subset (SBAS) and Interferometric Point Target Analysis (IPTA) have been applied to understand the capability of SAR-based techniques for monitoring land displacements in these environments. The SBAS approach is developed to maximize the spatial and temporal coherence through the construction of small baseline interferograms

  5. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Since tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between the GO methodology and the fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO methodology made the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
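
    For illustration only, a tiny cut-set calculation of the kind such a reliability analysis ultimately produces (the GO methodology itself works through a success-oriented operator chart; the component names, unavailabilities and minimal cut sets below are invented): the system unavailability is approximated by the sum of the minimal cut-set probabilities.

```python
# Hypothetical component unavailabilities for a two-train supply with a tie breaker.
Q = {"busA": 1e-3, "busB": 1e-3, "tie": 5e-3, "diesel": 2e-2}
CUT_SETS = [{"busA", "busB"}, {"busA", "tie", "diesel"}]   # invented minimal cut sets

def cut_probability(cut_set):
    """Probability that every component in a minimal cut set is unavailable."""
    p = 1.0
    for component in cut_set:
        p *= Q[component]
    return p

# Rare-event approximation: system unavailability ~ sum of cut-set probabilities.
q_system = sum(cut_probability(cs) for cs in CUT_SETS)
print(f"approximate system unavailability: {q_system:.2e}")
```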

  6. Lean methodology: an evidence-based practice approach for healthcare improvement.

    Science.gov (United States)

    Johnson, Pauline M; Patterson, Claire J; OʼConnell, Mary P

    2013-12-10

    Lean methodology, an evidence-based practice approach adopted from Toyota, is grounded on the pillars of respect for people and continuous improvement. This article describes the use of Lean methodology to improve healthcare outcomes for patients with community-acquired pneumonia. Nurse practitioners and other clinicians should be knowledgeable about this methodology and become leaders in Lean transformation.

  7. Preparing for budget-based payment methodologies: global payment and episode-based payment.

    Science.gov (United States)

    Hudson, Mark E

    2015-10-01

    Use of budget-based payment methodologies (capitation and episode-based bundled payment) has been demonstrated to drive value in healthcare delivery. With a focus on high-volume, high-cost surgical procedures, inclusion of anaesthesiology services in these methodologies is likely. This review provides a summary of budget-based payment methodologies and practical information necessary for anaesthesiologists to prepare for participation in these programmes. Although few examples of anaesthesiologists' participation in these models exist, an understanding of the structure of these programmes and opportunities for participation are available. Prospective preparation in developing anaesthesiology-specific bundled payment profiles and early participation in pathway development associated with selected episodes of care are essential for successful participation as a gainsharing partner. With significant opportunity to contribute to care coordination and cost management, anaesthesiology can play an important role in budget-based payment programmes and should expect to participate as full gainsharing partners. Precise costing methodologies and accurate economic modelling, along with identification of quality management and cost control opportunities, will help identify participation opportunities and appropriate payment and gainsharing agreements. Anaesthesiology-specific examples with budget-based payment models are needed to help guide increased participation in these programmes.

  8. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, evaluation methodologies for PR had already been initiated as early as 1980, but systematic development started in the 2000s. In Korea, the independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, to support the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development; the development of a PR evaluation model is being performed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, indicator quantification and evaluation model development, and analysis of the technology system and international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits for the indicators, and a review of the technical requirements of the indicators were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission to be developed

  9. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    Maya, R.S.

    1995-01-01

    The UNEP Greenhouse Gas Abatement Costing Studies, carried out under the management of the UNEP Collaborating Centre on Energy and Environment at Risoe National Laboratories in Denmark, have placed effort in generating methodological approaches to assessing the cost of abatement activities to reduce CO2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most important aspect of the UNEP study, which involved teams of researchers from ten countries, is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries, particularly those from developing countries, namely Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from the interactions with methodological experiences from the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling of tools to analyze economic trends and the various factors studied in order to determine the unit cost of CO2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  10. Lab Scale Study of the Depletion of Mullite/Corundum-Based Refractories Trough Reaction with Scaffold Materials

    International Nuclear Information System (INIS)

    Stjernberg, J; Antti, M-L; Ion, J C; Lindblom, B

    2011-01-01

    To investigate the mechanisms underlying the depletion of mullite/corundum-based refractory bricks used in rotary kilns for iron ore pellet production, the reaction mechanisms between scaffold material and refractory bricks have been studied on the laboratory scale. Alkali additions were used to enhance the reaction rates between the materials. The morphological changes and active chemical reactions at the refractory/scaffold material interface in the samples were characterized using scanning electron microscopy (SEM), thermal analysis (TA) and X-ray diffraction (XRD). No reaction products of alkali and hematite (Fe2O3) were detected; however, alkali dissolves the mullite in the bricks. Phases such as nepheline (Na2O·Al2O3·2SiO2), kalsilite (K2O·Al2O3·2SiO2), leucite (K2O·Al2O3·4SiO2) and potassium β-alumina (K2O·11Al2O3) were formed as a consequence of reactions between alkali and the bricks.

  11. Methodological bases of innovative training of specialists in nanotechnology field

    Directory of Open Access Journals (Sweden)

    FIGOVSKY Oleg Lvovich

    2016-10-01

    Full Text Available The performance of an innovative training system aimed at highly intellectual specialists in the area of nanotechnology for Kazakhstan's economy demands the establishment and development of a nanotechnological market in the country, the teaching of innovative engineering combined with consistent research, and the integration of trained specialists with the latest technologies and sciences at the international level. The methodological aspects of training competitive specialists for the nanotechnology field are specific. The paper presents methodological principles of the innovative training of specialists for science-intensive industry that were realized under a grant given by the Ministry of Education and Science of the Republic of Kazakhstan.

  12. Unbiased Selective Isolation of Protein N-Terminal Peptides from Complex Proteome Samples Using Phospho Tagging (PTAG) and TiO2-based Depletion

    NARCIS (Netherlands)

    Mommen, G.P.M.; Waterbeemd, van de B.; Meiring, H.D.; Kersten, G.; Heck, A.J.R.; Jong, de A.P.J.M.

    2012-01-01

    A positional proteomics strategy for global N-proteome analysis is presented based on phospho tagging (PTAG) of internal peptides followed by depletion by titanium dioxide (TiO2) affinity chromatography. Therefore, N-terminal and lysine amino groups are initially completely dimethylated with

  13. The multiphase flow system used in exploiting depleted reservoirs: water-based Micro-bubble drilling fluid

    Science.gov (United States)

    Li-hui, Zheng; Xiao-qing, He; Li-xia, Fu; Xiang-chun, Wang

    2009-02-01

    Water-based micro-bubble drilling fluid, which is used to exploit depleted reservoirs, is a complicated multiphase flow system that is composed of gas, water, oil, polymer, surfactants and solids. The gas phase is separated from the bulk water by two layers and three membranes. They are the "surface tension reducing membrane", "high viscosity layer", "high viscosity fixing membrane", "compatibility enhancing membrane" and "concentration transition layer of linear high polymer (LHP) & surfactants", from every gas phase centre to the bulk water. The "surface tension reducing membrane", "high viscosity layer" and "high viscosity fixing membrane" bond closely to pack air, forming an "air-bag"; the "compatibility enhancing membrane" and "concentration transition layer of LHP & surfactants" absorb outside the "air-bag" to form an "incompact zone". From another point of view, the "air-bag" and "incompact zone" compose a micro-bubble. Dynamic changes of the "incompact zone" enable micro-bubbles to exist alone or aggregate together, and lead the whole fluid, which can wet both hydrophilic and hydrophobic surfaces, to possess very high viscosity at an extremely low shear rate but good fluidity at a higher shear rate. When the water-based micro-bubble drilling fluid encounters leakage zones, it will automatically regulate the sizes and shapes of the bubbles according to the slot width of a fracture, the height of a cavern or the aperture of openings, or seal them by making use of the high viscosity of the system at a very low shear rate. Measurements of the rheological parameters indicate that water-based micro-bubble drilling fluid has very high plastic viscosity, yield point, initial gel, final gel and a high ratio of yield point to plastic viscosity. All of these properties make the multiphase flow system meet the requirements of the petroleum drilling industry. Research on the interface between gas and bulk water of this multiphase flow system can provide us with information for synthesizing effective agents to
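
    A minimal Bingham-plastic sketch of the behaviour described above: a fluid with a yield point shows a very high apparent viscosity at low shear rates but flows readily at higher shear rates. The yield point and plastic viscosity values are illustrative only, not measurements of the actual drilling fluid.

```python
YIELD_POINT_PA = 15.0            # tau_0, assumed yield point
PLASTIC_VISCOSITY_PA_S = 0.03    # mu_p, assumed plastic viscosity

def apparent_viscosity(shear_rate):
    """Apparent viscosity (Pa*s) of a Bingham plastic: (tau_0 + mu_p*gamma) / gamma."""
    return (YIELD_POINT_PA + PLASTIC_VISCOSITY_PA_S * shear_rate) / shear_rate

for rate in (0.1, 1.0, 100.0, 1000.0):
    print(f"shear rate {rate:7.1f} 1/s -> apparent viscosity "
          f"{apparent_viscosity(rate):8.2f} Pa*s")
```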

  14. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  15. Methodological Innovation in Practice-Based Design Doctorates

    Science.gov (United States)

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  16. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

    Full Text Available The information system (IS) change management and governance, according to the best practices, are defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT, etc.). These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control by integrating the following significant resource management areas – information technology (IT) governance, change management and enterprise architecture (EA) change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on re-use and fusion of principles used by related methodologies as well as on empirical observations about typical IS change management mistakes in enterprises.

  17. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It is therefore urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  18. Depleted uranium management alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Hertzler, T.J.; Nishimoto, D.D.

    1994-08-01

    This report evaluates two management alternatives for Department of Energy depleted uranium: continued storage as uranium hexafluoride, and conversion to uranium metal and fabrication to shielding for spent nuclear fuel containers. The results will be used to compare the costs with other alternatives, such as disposal. Cost estimates for the continued storage alternative are based on a life cycle of 27 years through the year 2020. Cost estimates for the recycle alternative are based on existing conversion process costs and capital costs for fabricating the containers. Additionally, the recycle alternative accounts for costs associated with intermediate product resale and secondary waste disposal for materials generated during the conversion process.

  19. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  20. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    objective is to attack the boost phase of ballistic missiles using the Airborne Weapons Layer concept (AWL) (Corbett, 2013) and (Rood, Chilton, Campbell...and analysis techniques used in this research. Chapter 4 provides analysis of the simulation model to illustrate the methodology in Chapter 3 and to... techniques, and procedures. The purpose of our research is to study the use of a new missile system within an air combat environment. Therefore, the

  1. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  2. Transcriptome-based identification of pro- and antioxidative gene expression in kidney cortex of nitric oxide-depleted rats

    NARCIS (Netherlands)

    Wesseling, Sebastiaan; Joles, Jaap A.; van Goor, Harry; Bluyssen, Hans A.; Kemmeren, Patrick; Holstege, Frank C.; Koomans, Hein A.; Braam, Branko

    2007-01-01

    Nitric oxide (NO) depletion in rats induces severe endothelial dysfunction within 4 days. Subsequently, hypertension and renal injury develop, which are ameliorated by alpha-tocopherol (VitE) cotreatment. The hypothesis of the present study was that NO synthase (NOS) inhibition induces a renal

  3. Performance-based methodology for the fire safe design of insulation materials in energy efficient buildings

    OpenAIRE

    Hidalgo-Medina, Juan Patricio

    2015-01-01

    This thesis presents a methodology to determine failure criteria of building insulation materials in the event of a fire that is specific to each typology of insulation material used. This methodology is based on material characterisation and assessment of fire performance of the most common insulation materials used in construction. Current methodologies give a single failure criterion independent of the nature of the material – this can lead to uneven requirements when addres...

  4. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  5. Legal and methodological bases of comprehensive forensic enquiry of pornography

    Directory of Open Access Journals (Sweden)

    Berdnikov D.V.

    2016-03-01

    Full Text Available The article gives an analysis of the legal definition of pornography. The author identifies the groups of descriptive and target criteria required for the analysis, and examines the content of the descriptive criteria of pornography and the way they should be documented. Fixing attention on the anatomical and physiological characteristics of sexual relations is determined to be the necessary target criterion. It is noted that the term "pornography" is a legal one and cannot itself be the subject of expert examination. For this reason the author outlines the methodological basis of a comprehensive psycho-linguistic and psycho-art expert examination. The article presents the general issues on which an expert conclusion depends, studies cases where it is necessary to involve physicians in the research, and sets out criteria for the expert's opinion. In addition, the author defines the subject, object and main tasks of psychological studies of pornographic information.

  6. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  7. Knowledge-based operation guidance system for nuclear power plants based on generic task methodology

    International Nuclear Information System (INIS)

    Yamada, Naoyuki; Chandrasekaran, B.; Bhatnager, R.

    1989-01-01

    A knowledge-based system for operation guidance of nuclear power plants is proposed. The Dynamic Procedure Management System (DPMS) is designed and developed to assist human operators interactively by selecting and modifying predefined operation procedures in a dynamic situation. Unlike most operation guidance systems, DPMS has been built based on Generic Task Methodology, which makes the overall framework of the system perspicuous and also lets domain knowledge be represented in a natural way. This paper describes the organization of the system, the definition of each task, and the form and organization of knowledge, followed by an application example. (author)

  8. 77 FR 59625 - NIH Evidence-Based Methodology Workshop on Polycystic Ovary Syndrome

    Science.gov (United States)

    2012-09-28

    ... Methodology Workshop on Polycystic Ovary Syndrome. Notice is hereby given of the National Institutes of Health (NIH) Evidence-based Methodology Workshop on Polycystic Ovary Syndrome, to be held December 3-5.... on December 4 and at 8:30 a.m. on December 5. The workshop will be open to the public. Polycystic...

  9. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    Science.gov (United States)

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  10. D2.1 - An EA Active, Problem Based Learning Methodology - EAtrain2

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne; Buus, Lillian

    This deliverable reports on the work undertaken in work package 2 with the key objective to develop a learning methodology for web 2.0 mediated Enterprise Architecture (EA) learning building on a problem based learning (PBL) approach. The deliverable reports not only on the methodology but also...

  11. A Methodology for Unit Testing Actors in Proprietary Discrete Event Based Simulators

    Science.gov (United States)

    2008-12-01

    This paper presents a dependency injection based unit-testing methodology for testing components, or actors, involved in discrete event based simulations. The WAIT state of the example actor implementation (Figure 4) performs three simple arithmetic operations involving the incoming packet.

  12. Drag &Drop, Mixed-Methodology-based Lab-on-Chip Design Optimization Software, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective is to develop a "mixed-methodology", drag and drop, component library (fluidic-lego)-based, system design and optimization tool for complex...

  13. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    Science.gov (United States)

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  14. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    International Nuclear Information System (INIS)

    Maga, Daniel

    2015-01-01

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  15. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    Energy Technology Data Exchange (ETDEWEB)

    Maga, Daniel

    2015-07-01

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  16. A methodology to support multidisciplinary model-based water management

    NARCIS (Netherlands)

    Scholten, H.; Kassahun, A.; Refsgaard, J.C.; Kargas, Th.; Gavardinas, C.; Beulens, A.J.M.

    2007-01-01

    Quality assurance in model based water management is needed because of some frequently perceived shortcomings, e.g. a lack of mutual understanding between modelling team members, malpractice and a tendency of modellers to oversell model capabilities. Initiatives to support quality assurance focus on

  17. Studying boat-based bear viewing: Methodological challenges and solutions

    Science.gov (United States)

    Sarah Elmeligi

    2007-01-01

    Wildlife viewing, a growing industry throughout North America, holds much potential for increased revenue and public awareness regarding species conservation. In Alaska and British Columbia, grizzly bear (Ursus arctos) viewing is becoming more popular, attracting tourists from around the world. Viewing is typically done from a land-based observation...

  18. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  19. ICT-Based, Cross-Cultural Communication: A Methodological Perspective

    Science.gov (United States)

    Larsen, Niels; Bruselius-Jensen, Maria; Danielsen, Dina; Nyamai, Rachael; Otiende, James; Aagaard-Hansen, Jens

    2014-01-01

    The article discusses how cross-cultural communication based on information and communication technologies (ICT) may be used in participatory health promotion as well as in education in general. The analysis draws on experiences from a health education research project with grade 6 (approx. 12 years) pupils in Nairobi (Kenya) and Copenhagen…

  20. Methodology for new product development based on formalization ...

    African Journals Online (AJOL)

    Stages of the creation of a new product and a formalization structure of engineering process are described. The paper analyzes the concept of an engineering design as an evolutionary and creative thinking which combines a scientific approach and creativity based on intuition. It was proved that an ability to design is both ...

  1. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

    Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain with changing data and user views. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.

  2. Biomarkers of Acute Stroke Etiology (BASE) Study Methodology.

    Science.gov (United States)

    Jauch, Edward C; Barreto, Andrew D; Broderick, Joseph P; Char, Doug M; Cucchiara, Brett L; Devlin, Thomas G; Haddock, Alison J; Hicks, William J; Hiestand, Brian C; Jickling, Glen C; June, Jeff; Liebeskind, David S; Lowenkopf, Ted J; Miller, Joseph B; O'Neill, John; Schoonover, Tim L; Sharp, Frank R; Peacock, W Frank

    2017-05-05

    Acute ischemic stroke affects over 800,000 US adults annually, with hundreds of thousands more experiencing a transient ischemic attack. Emergent evaluation, prompt acute treatment, and identification of stroke or TIA (transient ischemic attack) etiology for specific secondary prevention are critical for decreasing further morbidity and mortality of cerebrovascular disease. The Biomarkers of Acute Stroke Etiology (BASE) study is a multicenter observational study to identify serum markers defining the etiology of acute ischemic stroke. Observational trial of patients presenting to the hospital within 24 h of stroke onset. Blood samples are collected at arrival, 24, and 48 h later, and RNA gene expression is utilized to identify stroke etiology marker candidates. The BASE study began January 2014. At the time of writing, there are 22 recruiting sites. Enrollment is ongoing, expected to hit 1000 patients by March 2017. The BASE study could potentially aid in focusing the initial diagnostic evaluation to determine stroke etiology, with more rapidly initiated targeted evaluations and secondary prevention strategies.Clinical Trial Registration Clinicaltrials.gov NCT02014896 https://clinicaltrials.gov/ct2/show/NCT02014896?term=biomarkers+of+acute+stroke+etiology&rank=1.

  3. Depleted fully monolithic CMOS pixel detectors using a column based readout architecture for the ATLAS Inner Tracker upgrade

    Science.gov (United States)

    Wang, T.; Barbero, M.; Berdalovic, I.; Bespin, C.; Bhat, S.; Breugnon, P.; Caicedo, I.; Cardella, R.; Chen, Z.; Degerli, Y.; Egidos, N.; Godiot, S.; Guilloux, F.; Hemperek, T.; Hirono, T.; Krüger, H.; Kugathasan, T.; Hügging, F.; Marin Tobon, C. A.; Moustakas, K.; Pangaud, P.; Schwemling, P.; Pernegger, H.; Pohl, D.-L.; Rozanov, A.; Rymaszewski, P.; Snoeys, W.; Wermes, N.

    2018-03-01

    Depleted monolithic active pixel sensors (DMAPS), which exploit high voltage and/or high resistivity add-ons of modern CMOS technologies to achieve substantial depletion in the sensing volume, have proven to have high radiation tolerance towards the requirements of ATLAS in the high-luminosity LHC era. DMAPS integrating fast readout architectures are currently being developed as promising candidates for the outer pixel layers of the future ATLAS Inner Tracker, which will be installed during the phase II upgrade of ATLAS around year 2025. In this work, two DMAPS prototype designs, named LF-Monopix and TJ-Monopix, are presented. LF-Monopix was fabricated in the LFoundry 150 nm CMOS technology, and TJ-Monopix has been designed in the TowerJazz 180 nm CMOS technology. Both chips employ the same readout architecture, i.e. the column drain architecture, whereas different sensor implementation concepts are pursued. The paper makes a joint description of the two prototypes, so that their technical differences and challenges can be addressed in direct comparison. First measurement results for LF-Monopix will also be shown, demonstrating for the first time a fully functional fast readout DMAPS prototype implemented in the LFoundry technology.

  4. Publishing FAIR Data: an exemplar methodology utilizing PHI-base

    Directory of Open Access Journals (Sweden)

    Alejandro eRodríguez Iglesias

    2016-05-01

    Full Text Available Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species versus the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be FAIR - Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences - the Pathogen-Host Interaction Database (PHI-base - to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  5. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  6. JOB SHOP METHODOLOGY BASED ON AN ANT COLONY

    Directory of Open Access Journals (Sweden)

    OMAR CASTRILLON

    2009-01-01

    Full Text Available The purpose of this study is to reduce the total process time (makespan) and to increase the machines' working time, in a job shop environment, using a heuristic based on ant colony optimization. This work is developed in two phases: the first stage describes the identification and definition of heuristics for the sequential processes in the job shop. The second stage shows the effectiveness of the system in the traditional programming of production. A good solution, with 99% efficiency, is found using this technique.
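
    A minimal sketch of the kind of pheromone-guided job-shop sequencing the record describes, assuming a toy three-job instance; the instance data, parameter values and function names are illustrative assumptions and are not taken from the paper:

        import random

        # Tiny illustrative job-shop instance: each job is an ordered list of (machine, duration).
        JOBS = [
            [(0, 3), (1, 2), (2, 2)],
            [(0, 2), (2, 1), (1, 4)],
            [(1, 4), (2, 3), (0, 1)],
        ]
        N_JOBS = len(JOBS)
        N_ANTS, N_ITER, RHO, Q = 10, 50, 0.1, 1.0   # assumed ant-colony parameters

        def makespan(job_order):
            """Schedule operations in the given job-visit order and return the makespan."""
            next_op = [0] * N_JOBS          # next unscheduled operation of each job
            job_ready = [0] * N_JOBS        # time at which each job becomes free
            mach_ready = {}                 # time at which each machine becomes free
            for j in job_order:
                m, d = JOBS[j][next_op[j]]
                start = max(job_ready[j], mach_ready.get(m, 0))
                job_ready[j] = mach_ready[m] = start + d
                next_op[j] += 1
            return max(job_ready)

        def build_order(tau):
            """Construct a job-visit order, biased by the pheromone matrix tau[position][job]."""
            remaining = {j: len(JOBS[j]) for j in range(N_JOBS)}
            order, pos = [], 0
            while any(remaining.values()):
                candidates = [j for j, r in remaining.items() if r > 0]
                weights = [tau[pos][j] for j in candidates]
                j = random.choices(candidates, weights=weights)[0]
                order.append(j)
                remaining[j] -= 1
                pos += 1
            return order

        total_ops = sum(len(job) for job in JOBS)
        tau = [[1.0] * N_JOBS for _ in range(total_ops)]
        best_order, best_ms = None, float("inf")
        for _ in range(N_ITER):
            for _ in range(N_ANTS):
                order = build_order(tau)
                ms = makespan(order)
                if ms < best_ms:
                    best_order, best_ms = order, ms
            # Evaporate pheromone and reinforce the best order found so far.
            for pos in range(total_ops):
                for j in range(N_JOBS):
                    tau[pos][j] *= (1.0 - RHO)
                tau[pos][best_order[pos]] += Q / best_ms

        print("best makespan found:", best_ms)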

  7. Application of risk-based methodologies to prioritize safety resources

    International Nuclear Information System (INIS)

    Rahn, F.J.; Sursock, J.P.; Hosler, J.

    1993-01-01

    The Electric Power Research Institute (EPRI) started a program entitled risk-based prioritization in 1992. The purpose of this program is to provide generic technical support to the nuclear power industry relative to its recent initiatives in the area of operations and maintenance (O&M) cost control using state-of-the-art risk methods. The approach uses probabilistic risk assessment (PRA), or similar techniques, to allocate resources commensurate with the risk posed by nuclear plant operations. Specifically, those items or events that have high risk significance would receive the most attention, while those with little risk content would command fewer resources. As quantified in a companion paper, the potential O&M cost reduction inherent in this approach is very large. Furthermore, risk-based methods should also lead to safety improvements. This paper outlines the way that the EPRI technical work complements the technical, policy, and regulatory initiatives taken by others in the industry and provides an example of the approach as used to prioritize motor-operated valve (MOV) testing in response to US Nuclear Regulatory Commission (NRC) Generic Letter 89-10

  8. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
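
    For orientation, the time-averaged unavailability of a periodically tested standby component is often approximated as

        q \approx \frac{\lambda T}{2} + \lambda d ,

    where \lambda is the standby failure rate, T the surveillance test interval (STI) and d the mean downtime bounded by the allowed outage time (AOT). This is a textbook approximation illustrating how AOT and STI enter such an evaluation, and is not necessarily the exact component unavailability model implemented in SOCRATES.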

  9. MEASURE OF LANDSCAPE HETEROGENEITY BY AGENT-BASED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    E. Wirth

    2016-06-01

    Full Text Available With the rapid increase of the world's population, the efficient food production is one of the key factors of the human survival. Since biodiversity and heterogeneity is the basis of the sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity, nevertheless exact measurements and calculations apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with the so called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential which can be considered as the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.

  10. Measure of Landscape Heterogeneity by Agent-Based Methodology

    Science.gov (United States)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, the efficient food production is one of the key factors of the human survival. Since biodiversity and heterogeneity is the basis of the sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity, nevertheless exact measurements and calculations apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with the so called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential which can be considered as the unit of the `greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
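
    As a rough illustration of the agent-based sampling idea described in these two records (randomly dispatched scouts recording land-cover features and accumulating a Monte Carlo diversity estimate), the sketch below uses a synthetic land-cover grid and a Shannon index; the grid, walk length and diversity measure are assumptions and are not the study's actual data or 'greening' metric:

        import random
        from collections import Counter
        from math import log

        # Illustrative land-cover grid with five arbitrary cover codes; not the study's data.
        random.seed(1)
        GRID = [[random.choice("WFCMU") for _ in range(100)] for _ in range(100)]

        def scout_walk(grid, steps=200):
            """One randomly dispatched scout records the land-cover codes along a random walk."""
            r, c = random.randrange(len(grid)), random.randrange(len(grid[0]))
            seen = []
            for _ in range(steps):
                seen.append(grid[r][c])
                r = (r + random.choice((-1, 0, 1))) % len(grid)
                c = (c + random.choice((-1, 0, 1))) % len(grid[0])
            return seen

        def shannon(samples):
            """Shannon diversity of the land-cover codes sampled by one scout."""
            counts = Counter(samples)
            n = sum(counts.values())
            return -sum((k / n) * log(k / n) for k in counts.values())

        # Monte Carlo estimate: average the diversity recorded by many independent scouts.
        estimates = [shannon(scout_walk(GRID)) for _ in range(50)]
        print("landscape heterogeneity estimate:", sum(estimates) / len(estimates))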

  11. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments

  12. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. The ....... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  13. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanopa, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data........ Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  14. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As interdisciplinary domain requiring advanced and innovative methodologies the computational forensics domain is characterized by data being simultaneously large scaled and uncertain multidimensional and approximate. Forensic domain experts trained to discover hidden pattern from crime data are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...

  15. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

    Full Text Available The authors have solved topical scientific problems in the article: 1 the research base in the construction of theoretical and methodological foundations of sports training, based on taekwondo has been analysed; 2 the organization and methodological requirements for the training sessions of taekwondo have been researched; 3 the necessity of interaction processes of natural development and adaptation to physical activity of young taekwondo sportsmen has been grounded; 4 the necessity of scientific evidence of building young fighters training loads in microcycles, based on their individualization has been proved.

  16. 5.0. Depletion, activation, and spent fuel source terms

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation and relies on fundamental cross section and decay data in ENDF/B VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN—as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.
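
    The nuclide-by-nuclide tracking described above amounts to solving the coupled depletion (Bateman) equations; in their standard form (written here for orientation, not quoted from the ORIGEN documentation),

        \frac{dN_i}{dt} = \sum_{j \neq i} \left( l_{ij}\lambda_j + f_{ij}\sigma_j\phi \right) N_j - \left( \lambda_i + \sigma_i\phi \right) N_i ,

    where N_i is the atom density of nuclide i, \lambda_i its decay constant, \sigma_i its one-group cross section, \phi the neutron flux, and l_{ij}, f_{ij} the fractions of decays and neutron-induced reactions of nuclide j that produce nuclide i.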

  17. Vision-based methodology for collaborative management of qualitative criteria in design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    2006-01-01

    A Vision-based methodology is proposed as part of the management of qualitative criteria for design in early phases of the product development process for team based organisations. Focusing on abstract values and qualities for the product establishes a shared vision for the product amongst team m...

  18. The Safer Choices Project: Methodological Issues in School-Based Health Promotion Intervention Research.

    Science.gov (United States)

    Basen-Engquist, Karen; Parcel, Guy S.; Harrist, Ronald; Kirby, Douglas; Coyle, Karin; Banspach, Stephen; Rugg, Deborah

    1997-01-01

    Uses Safer Choices--a school-based program for preventing HIV, sexually transmitted diseases, and pregnancy--to examine methodological issues in large-scale school-based health promotion research, discussing randomization of small numbers of units; reasons for using a cohort or cross-sectional design; and analysis of data by appropriate…

  19. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC problem, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve and yield candidate solutions that satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights to the solution of IPDC problems in chemical engineering practice.

  20. DInSAR-Based Detection of Land Subsidence and Correlation with Groundwater Depletion in Konya Plain, Turkey

    Directory of Open Access Journals (Sweden)

    Fabiana Caló

    2017-01-01

    Full Text Available In areas where groundwater overexploitation occurs, land subsidence triggered by aquifer compaction is observed, resulting in high socio-economic impacts for the affected communities. In this paper, we focus on the Konya region, one of the leading economic centers in the agricultural and industrial sectors in Turkey. We present a multi-source data approach aimed at investigating the complex and fragile environment of this area which is heavily affected by groundwater drawdown and ground subsidence. In particular, in order to analyze the spatial and temporal pattern of the subsidence process we use the Small BAseline Subset DInSAR technique to process two datasets of ENVISAT SAR images spanning the 2002–2010 period. The produced ground deformation maps and associated time-series allow us to detect a wide land subsidence extending for about 1200 km2 and measure vertical displacements reaching up to 10 cm in the observed time interval. DInSAR results, complemented with climatic, stratigraphic and piezometric data as well as with land-cover changes information, allow us to give more insights on the impact of climate changes and human activities on groundwater resources depletion and land subsidence.
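
    Where the deformation can be assumed to be predominantly vertical, as is typical for subsidence driven by aquifer compaction, the line-of-sight (LOS) displacement measured by DInSAR is commonly projected to the vertical with

        d_{\mathrm{vert}} \approx \frac{d_{\mathrm{LOS}}}{\cos\theta_{\mathrm{inc}}} ,

    where \theta_{\mathrm{inc}} is the radar incidence angle; this is the standard conversion, stated here for context rather than taken from the paper.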

  1. A calculational procedure for neutronic and depletion analysis of Molten-Salt reactors based on SCALE6/TRITON

    International Nuclear Information System (INIS)

    Sheu, R.J.; Chang, J.S.; Liu, Y.-W. H.

    2011-01-01

    Molten-Salt Reactors (MSRs) represent one of the selected categories in the GEN-IV program. This type of reactor is distinguished by the use of liquid fuel circulating in and out of the core, which makes online refueling and salt processing possible. However, this operating characteristic also complicates the modeling and simulation of reactor core behaviour using conventional neutronic codes. The TRITON sequence in the SCALE6 code system has been designed to provide the combined capabilities of problem-dependent cross-section processing and rigorous treatment of neutron transport, coupled with ORIGEN-S depletion calculations. In order to accommodate the simulation of a dynamic refueling and processing scheme, an in-house program, REFRESH, together with a run script has been developed to carry out a series of stepwise TRITON calculations, which makes it easier to analyze the neutronic properties and performance of an MSR core design. As a demonstration and cross-check, we have applied this method to reexamine the conceptual design of the Molten Salt Actinide Recycler & Transmuter (MOSART). This paper summarizes the development of the method and preliminary results of its application to MOSART. (author)
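
    A driver of the kind the record describes (stepwise depletion runs with a composition update between steps to mimic online refueling and salt processing) might look like the sketch below. The file names, the adjust_composition rule and the command line are assumptions for illustration only, not the actual REFRESH implementation:

        import shutil
        import subprocess

        N_STEPS = 10          # number of depletion steps (assumed)

        def adjust_composition(prev_output, next_input):
            """Placeholder for the online refueling/salt-processing update: read end-of-step
            nuclide densities, remove processed fission products, top up the fissile feed and
            write the composition block of the next input. Here it only copies the file."""
            shutil.copy(prev_output, next_input)

        template = "mosart_step_{:02d}.inp"          # hypothetical input file naming
        for step in range(N_STEPS):
            inp = template.format(step)
            # Run one TRITON depletion step; the driver command name is assumed and should
            # be adapted to the local SCALE installation.
            subprocess.run(["scalerte", inp], check=True)
            if step + 1 < N_STEPS:
                adjust_composition(inp + ".out", template.format(step + 1))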

  2. 77 FR 70451 - Report of the Evidence-Based Methodology Workshop on Polycystic Ovary Syndrome-Request for Comments

    Science.gov (United States)

    2012-11-26

    ... Methodology Workshop on Polycystic Ovary Syndrome--Request for Comments SUMMARY: The National Institutes of... Evidence-Based Methodology Workshop on Polycystic Ovary Syndrome, to be held December

  3. Development of a methodology for cost determination of wastewater treatment based on functional diagram

    International Nuclear Information System (INIS)

    Lamas, Wendell Q.; Silveira, Jose L.; Giacaglia, Giorgio E.O.; Reis, Luiz O.M.

    2009-01-01

    This work describes a methodology developed for determining the costs associated with the products generated in a small wastewater treatment station (SWTS) treating sanitary wastewater from a university campus. The methodology begins with the identification of the plant component units, relating their fluid and thermodynamic features at each point marked in the process diagram. Next, the functional diagram is developed and its formulation is elaborated on an exergetic basis, describing the equations for these points, which are the constraints of the exergetic production cost problem and are used to determine the costs associated with the products generated in the SWTS. The methodology was applied to a hypothetical system based on the component units of the SWTS described above and presented consistent results when compared to expected values based on previous exergetic expertise.
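
    For context, exergetic production cost formulations of this kind are usually built from component-level cost balances of the form

        \sum_{\mathrm{out}} c_{\mathrm{out}}\,\dot{E}_{\mathrm{out}} = \sum_{\mathrm{in}} c_{\mathrm{in}}\,\dot{E}_{\mathrm{in}} + \dot{Z}_k ,

    where c denotes unit exergetic costs, \dot{E} the exergy flow rates at the marked points, and \dot{Z}_k the capital and operating cost rate of component k. This is the generic exergoeconomic balance and may differ in detail from the paper's functional-diagram formulation.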

  4. Forest soil nutrient status after 10 years of experimental acidification and base cation depletion : results from 2 long-term soil productivity sites in the central Appalachians

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.B. [United States Dept. of Agriculture Forest Service, Parsons, WV (United States); Burger, J.A. [Virginia Tech University, Blacks Burg, VA (United States)

    2010-07-01

    This study assessed the hypothesis that soil base cation depletion is an effect of acidic deposition in forests located in the central Appalachians. The effects of experimentally induced base cation depletion were evaluated in relation to long-term soil productivity and the sustainability of forest stands. Whole-tree harvesting was conducted along with the removal of dead wood and litter in order to remove all aboveground nutrients. Ammonium sulfate fertilizer was added at annual rates of 40.6 kg S/ha and 35.4 kg N/ha in order to increase the leaching of calcium (Ca) and magnesium (Mg) from the soil. A randomized complete block design with 4 or 5 treatments was used in a mixed hardwood experimental forest located in West Virginia and in a cherry-maple forest located in a national forest in West Virginia. Soils were sampled over a 10-year period. The study showed that significant changes in soil Mg, N and some other nutrients occurred over time. However, biomass did not differ significantly among the different treatment options used.

  5. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    A systematic methodology for the screening of ionic liquids (ILs) as entrainers and for the design of IL-based separation processes for various homogeneous binary azeotropic mixtures has been developed. The methodology focuses on homogeneous binary aqueous azeotropic systems (for example, water... based on a combination of criteria such as stability, toxicity, and their environmental impacts. The best ILs were used as entrainers, and an extractive distillation column (EDC) and an ionic liquid recovery column were designed and simulated with a process simulator to determine the overall energy

  6. ORGANIZATION OF FUTURE ENGINEERS' PROJECT-BASED LEARNING WHEN STUDYING THE PROJECT MANAGEMENT METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Halyna V. Lutsenko

    2015-02-01

    Full Text Available The peculiarities of modern international experience in implementing project-based learning in engineering education have been considered. The potential role and place of projects in learning activity have been analyzed. A methodology for organizing the project-based activity of engineering students studying project management methodology and computer-based project management systems has been proposed. The requirements for the documentation and actual results of students' projects have been described in detail. The requirements for project management systems developed using Microsoft Project, in terms of schedule and resource planning, have been formulated.

  7. Addressing Ozone Layer Depletion

    Science.gov (United States)

    Access information on EPA's efforts to address ozone layer depletion through regulations, collaborations with stakeholders, international treaties, partnerships with the private sector, and enforcement actions under Title VI of the Clean Air Act.

  8. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    Science.gov (United States)

    Fensin, Michael Lorne

    and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.

  9. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  10. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  11. Project-based learning in organizations : towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; van der Krogt, F.J.

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article

  12. How to create a methodology of conceptual visualization based on experiential cognitive science and diagrammatology

    DEFF Research Database (Denmark)

    Toft, Birthe

    2013-01-01

    Based on the insights of experiential cognitive science and of diagrammatology as defined by Charles S. Peirce and Frederik Stjernfelt, this article analyses the iconic links connecting visualizations of Stjernfelt diagrams with human perception and action and starts to lay the theoretical...... groundwork for a methodology of conceptual visualization...

  13. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  14. Generic Methodology for Field Calibration of Nacelle-Based Wind Lidars

    DEFF Research Database (Denmark)

    Borraccino, Antoine; Courtney, Michael; Wagner, Rozenn

    2016-01-01

    by the geometry of the scanning trajectory and the lidar inclination. The line-of-sight velocity is calibrated in atmospheric conditions by comparing it to a reference quantity based on classic instrumentation such as cup anemometers and wind vanes. The generic methodology was tested on two commercially developed...

  15. EA Training 2.0 Newsletter #3 - EA Active, Problem Based Learning Methodology

    DEFF Research Database (Denmark)

    Buus, Lillian; Ryberg, Thomas; Sroga, Magdalena

    2010-01-01

    The main products of the project are innovative, active problem-based learning methodology for EA education and training, EA courses for university students and private and public sector employees, and an Enterprise Architecture competence ontology including a complete specification of skills...

  16. Measuring the Differences between Traditional Learning and Game-Based Learning Using Electroencephalography (EEG) Physiologically Based Methodology

    Science.gov (United States)

    Chen, Ching-Huei

    2017-01-01

    Students' cognitive states can reflect a learning experience that results in engagement in an activity. In this study, we used electroencephalography (EEG) physiologically based methodology to evaluate students' levels of attention and relaxation, as well as their learning performance within a traditional and game-based learning context. While no…

  17. Combining Project-Based Learning and Community-Based Research in a Research Methodology Course: The Lessons Learned

    Science.gov (United States)

    Arantes do Amaral, João Alberto; Lino dos Santos, Rebeca Júlia Rodrigues

    2018-01-01

    In this article, we present our findings regarding the course "Research Methodology," offered to 22 first-year undergraduate students studying Administration at the Federal University of São Paulo, Osasco, Brazil. The course, which combined community-based research and project-based learning, was developed during the second semester of…

  18. A pattern-based methodology for optimizing stitches in double-patterning technology

    Science.gov (United States)

    Wang, Lynn T.; Madhavan, Sriram; Dai, Vito; Capodieci, Luigi

    2015-03-01

A pattern-based methodology for optimizing stitches is developed based on identifying stitch topologies and replacing them with pre-characterized fixing solutions in decomposed layouts. A topology-based library of stitches with predetermined fixing solutions is built. A pattern-based engine searches for matching topologies in the decomposed layouts. When a match is found, the engine opportunistically applies the predetermined fixing solution: only a design-rule-check error-free replacement is preserved. The methodology is demonstrated on a 20nm layout design that contains over 67 million first-metal-layer stitches. Results show that a small library containing 3 stitch topologies improves the stitch area regularity by 4x.
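As a purely schematic illustration of the lookup-and-replace flow described above, the sketch below matches stitch topologies against a small library of pre-characterized fixes and keeps a fix only if a placeholder design rule check passes; the topology names, fixing solutions, and DRC test are all invented and are not taken from the paper.

```python
# Schematic sketch of a pattern-based stitch-fixing flow; all data structures
# are placeholders for real decomposed-layout geometry and DRC checking.
stitch_library = {
    "T_end_to_end": "fix_extend_overlap",
    "T_corner":     "fix_shift_cutline",
    "T_jog":        "fix_widen_landing",
}

def drc_clean(layout, fix):
    """Placeholder for a design rule check of the layout after applying a fix."""
    return fix != "fix_widen_landing" or layout["density"] < 0.7

def optimize_stitches(layout, stitches):
    """Replace each matched stitch topology with its pre-characterized fix,
    keeping the replacement only if the result is DRC error-free."""
    applied = []
    for stitch in stitches:
        fix = stitch_library.get(stitch["topology"])
        if fix and drc_clean(layout, fix):
            applied.append((stitch["id"], fix))
    return applied

layout = {"density": 0.55}
stitches = [{"id": 1, "topology": "T_end_to_end"},
            {"id": 2, "topology": "T_unknown"},
            {"id": 3, "topology": "T_jog"}]
print(optimize_stitches(layout, stitches))
```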

  19. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
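The record above compares PRBS excitation signals, sampling periods and estimation algorithms without fixing a single method. As a purely illustrative sketch, one common choice is least-squares fitting of a discrete-time ARX model to recorded input/output data; the model order, the simulated plant and the data below are made up and are not from the paper.

```python
import numpy as np

def estimate_arx(u, y, na=2, nb=2):
    """Least-squares fit of an ARX model
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb].
    u, y: equally sampled input (e.g. PRBS) and output records."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        past_y = [-y[k - i] for i in range(1, na + 1)]
        past_u = [u[k - i] for i in range(1, nb + 1)]
        rows.append(past_y + past_u)
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta  # [a1..a_na, b1..b_nb]

# Hypothetical usage: u is the injected PRBS, y the measured closed-loop response.
rng = np.random.default_rng(0)
u = np.sign(rng.standard_normal(500))        # stand-in for a PRBS record
y = np.zeros_like(u)
for k in range(2, len(u)):                   # simulated stable 2nd-order plant
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.1 * u[k-1] + 0.05 * u[k-2]
print(estimate_arx(u, y))
```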

  20. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

Systems in Package (SiP) play an important role in portable, aerospace, and military electronics thanks to their microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and fault localization as the system scale grows exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with the boundary scan theory of PCB circuits and embedded core testing, a specific testing methodology and process is proposed. The hardware requirements of the SiP system under test are specified, and the hardware platform for the testing has been constructed. The methodology offers high test efficiency and accurate fault localization.

  1. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge, and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  2. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS) based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation of the ERPS, and the control and cost reduction of the installation, in terms of time, manpower, technological equipment and other resources.

  3. Therapeutic guidelines on ulcerative colitis: a GRADE methodology based effort of GETECCU.

    Science.gov (United States)

    Gomollón, Fernando; García-López, Santiago; Sicilia, Beatriz; Gisbert, Javier P; Hinojosa, Joaquín

    2013-02-01

Evidence-based clinical guidelines on ulcerative colitis (UC) have previously been developed through consensus, whereas GRADE methodology is the current standard for guideline development; this is the first UC guideline based on GRADE methodology. Following GRADE methodology, the Spanish Group of Ulcerative Colitis and Crohn's Disease (GETECCU) has developed a guideline on UC treatment. After selection of relevant clinical scenarios, 32 clinical questions were chosen and recommendations were established. Recommendations in favor of an action were made for 22 questions (14 strong, 8 weak), recommendations against an action were made for 8 questions (3 strong, 5 weak), and for 2 questions no recommendation was possible. The majority of recommendations were based on moderate-quality evidence, and only 5 on high-quality evidence. With GRADE methodology a clear recommendation can be given for most clinical decisions in UC treatment, but much more high-quality clinical research is needed. Copyright © 2012 Elsevier España, S.L. and AEEH y AEG. All rights reserved.

  4. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    Science.gov (United States)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
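The record above mentions correlations between blast parameters, such as the scaled distance, and mannequin head acceleration. The sketch below is only a schematic illustration of such a correlation: it fits a power law accel ~ a*Z^b to made-up test points, where Z = R / W^(1/3) is the Hopkinson-Cranz scaled distance; none of the numbers come from the study.

```python
import numpy as np

# Made-up test matrix: charge mass W (kg TNT eq.), standoff R (m),
# and peak head acceleration (g) read from the Hybrid III mannequin.
W = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 2.0])
R = np.array([1.0, 1.5, 1.5, 2.0, 2.0, 3.0])
accel = np.array([420.0, 180.0, 260.0, 150.0, 210.0, 80.0])

Z = R / W ** (1.0 / 3.0)          # Hopkinson-Cranz scaled distance, m/kg^(1/3)

# Power-law fit accel ~ a * Z**b via linear regression in log-log space.
b, log_a = np.polyfit(np.log(Z), np.log(accel), 1)
a = np.exp(log_a)
print(f"accel ~ {a:.1f} * Z^{b:.2f}")
```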

  5. CAPRE: A New Methodology for Product Recommendation Based on Customer Actionability and Profitability

    OpenAIRE

    Piton, Thomas; Blanchard, Julien; Guillet, Fabrice

    2011-01-01

Recommender systems can apply knowledge discovery techniques to the problem of making product recommendations. This aims to establish a customer loyalty strategy and thus to optimize the customer lifetime value. In this paper we propose CAPRE, a data-mining based methodology for recommender systems based on the analysis of turnover for customers of specific products. Contrary to classical recommender systems, CAPRE does not aspire to predict a customer's behavior but t...

  6. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    Science.gov (United States)

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  7. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    International Nuclear Information System (INIS)

    Becker, N.M.; Vanta, E.B.

    1995-01-01

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980's at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments

  8. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    Energy Technology Data Exchange (ETDEWEB)

    Becker, N.M. [Los Alamos National Lab., NM (United States); Vanta, E.B. [Wright Laboratory Armament Directorate, Eglin Air Force Base, FL (United States)

    1995-05-01

Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980s at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments.

  9. An automated methodology for levodopa-induced dyskinesia: assessment based on gyroscope and accelerometer signals.

    Science.gov (United States)

    Tsipouras, Markos G; Tzallas, Alexandros T; Rigas, George; Tsouli, Sofia; Fotiadis, Dimitrios I; Konitsiotis, Spiros

    2012-06-01

    In this study, a methodology is presented for an automated levodopa-induced dyskinesia (LID) assessment in patients suffering from Parkinson's disease (PD) under real-life conditions. The methodology is based on the analysis of signals recorded from several accelerometers and gyroscopes, which are placed on the subjects' body while they were performing a series of standardised motor tasks as well as voluntary movements. Sixteen subjects were enrolled in the study. The recordings were analysed in order to extract several features and, based on these features, a classification technique was used for LID assessment, i.e. detection of LID symptoms and classification of their severity. The results were compared with the clinical annotation of the signals, provided by two expert neurologists. The analysis was performed related to the number and topology of sensors used; several different experimental settings were evaluated while a 10-fold stratified cross validation technique was employed in all cases. Moreover, several different classification techniques were examined. The ability of the methodology to be generalised was also evaluated using leave-one-patient-out cross validation. The sensitivity and positive predictive values (average for all LID severities) were 80.35% and 76.84%, respectively. The proposed methodology can be applied in real-life conditions since it can perform LID assessment in recordings which include various PD symptoms (such as tremor, dyskinesia and freezing of gait) of several motor tasks and random voluntary movements. Copyright © 2012 Elsevier B.V. All rights reserved.
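The record above describes feature extraction from body-worn accelerometers and gyroscopes followed by classification evaluated with 10-fold stratified cross validation. A minimal sketch of that evaluation loop is shown below; the feature matrix, labels, and random-forest classifier are stand-ins, not the features or classifiers reported in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical feature matrix: one row per signal window, columns such as
# signal energy, variance, and dominant-frequency power per sensor axis.
rng = np.random.default_rng(1)
X = rng.standard_normal((160, 12))
y = rng.integers(0, 3, size=160)   # assumed labels: 0 = no LID, 1 = mild, 2 = severe

clf = RandomForestClassifier(n_estimators=200, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"mean accuracy over 10 folds: {scores.mean():.2f}")
```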

  10. Knowledge based system and decision making methodologies in materials selection for aircraft cabin metallic structures

    Science.gov (United States)

    Adhikari, Pashupati Raj

Materials selection is one of the most important aspects of product design and development. Knowledge-based systems (KBS) and some of the methodologies used in materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption, and part of the solution is to reduce the overall weight of the metallic structures inside the cabin. Among the various materials selection methodologies using Multi Criterion Decision Making (MCDM) techniques, a few are demonstrated with examples and the results are compared with those obtained using Ashby's approach to materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as relevant attributes in the process. Aluminum alloys with a high strength-to-weight ratio have been second to none in most aircraft parts manufacturing. Magnesium alloys, which are much lighter alternatives to the Al-alloys currently used in these structures, are evaluated using the methodologies and the ranked results are compared. Each material attribute considered in the design is categorized as either a benefit or a non-benefit attribute. Using Ashby's approach, material indices that must be maximized for optimum performance are determined, and materials are ranked based on the average of the consolidated indices rankings. Ranking results are compared for any disparity among the methodologies.

  11. Methodology for Innovation-Based Control of Business Changes Taking into Consideration the Communication Aspects

    Directory of Open Access Journals (Sweden)

    Gumba Khuta

    2017-01-01

At the current stage of economic development, efficient control of enterprise activity, based on anticipatory adaptation to continuous changes in the environment and in step with the dynamic self-organization of the internal medium, is the most important prerequisite of success, so the issue of scientifically substantiated control of changes becomes especially urgent. The authors substantiate the essence and comprehensive nature of the process of business changes and determine its predominantly innovative character. An algorithm for controlling business changes at the enterprise, based on graph theory and on identification of the communication basis of business processes, is suggested. Using the algorithm enables scientifically substantiated decisions that facilitate achievement of the enterprise's functioning goals over a given period of time. Based on the algorithm, the authors have developed a methodology for optimization of the innovation-based organizational enterprise management structure, implemented in the business practice of RZhDstroi JSC Holding. Validation of the suggested methodological tooling has shown the appropriateness of the tools for the enterprise's business activity.

  12. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection and minimal technical skills, and the motivational threshold is so reduced that the field of potential adversaries cannot be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  13. Satellite based radar interferometry to estimate large-scale soil water depletion from clay shrinkage: possibilities and limitations

    NARCIS (Netherlands)

    Brake, te B.; Hanssen, R.F.; Ploeg, van der M.J.; Rooij, de G.H.

    2013-01-01

    Satellite-based radar interferometry is a technique capable of measuring small surface elevation changes at large scales and with a high resolution. In vadose zone hydrology, it has been recognized for a long time that surface elevation changes due to swell and shrinkage of clayey soils can serve as

  14. Revisiting The Depleted Self.

    Science.gov (United States)

    Abraham, Reggie

    2018-04-01

    This article revisits Donald Capps's book The Depleted Self (The depleted self: sin in a narcissistic age. Fortress Press, Minneapolis, 1993), which grew out of his 1990 Schaff Lectures at Pittsburgh Theological Seminary. In these lectures Capps proposed that the theology of guilt had dominated much of post-Reformation discourse. But with the growing prevalence of the narcissistic personality in the late twentieth century, the theology of guilt no longer adequately expressed humanity's sense of "wrongness" before God. Late twentieth-century persons sense this disjunction between God and self through shame dynamics. Narcissists are not "full" of themselves, as popular perspectives might indicate. Instead, they are empty, depleted selves. Psychologists suggest this stems from lack of emotional stimulation and the absence of mirroring in the early stages of life. The narcissist's search for attention and affirmation takes craving, paranoid, manipulative, or phallic forms and is essentially a desperate attempt to fill the internal emptiness. Capps suggests that two narratives from the Gospels are helpful here: the story of the woman with the alabaster jar and the story of Jesus's dialogue with Mary and John at Calvary. These stories provide us with clues as to how depleted selves experienced mirroring and the potential for internal peace in community with Jesus.

  15. Ozone depletion update.

    Science.gov (United States)

    Coldiron, B M

    1996-03-01

Stratospheric ozone depletion due to chlorofluorocarbons and increased ultraviolet radiation penetration has long been predicted. To determine if predictions of ozone depletion are correct and, if so, the significance of this depletion. Review of the English literature regarding ozone depletion and solar ultraviolet radiation. The ozone layer is showing definite thinning. Recently, significantly increased ultraviolet radiation transmission has been detected at ground level at several metering stations. It appears that man-made aerosols (air pollution) block increased UVB transmission in urban areas. Recent satellite measurements of stratospheric fluorine levels more directly implicate chlorofluorocarbons as a major source of catalytic stratospheric chlorine, although natural sources may account for up to 40% of stratospheric chlorine. Stratospheric chlorine concentrations, and resultant increased ozone destruction, will be enhanced for at least the next 70 years. The potential for increased transmission of ultraviolet radiation will exist for the next several hundred years. While little damage due to increased ultraviolet radiation has occurred so far, the potential for long-term problems is great.

  16. Environmental restoration risk-based prioritization work package planning and risk ranking methodology. Revision 2

    International Nuclear Information System (INIS)

    Dail, J.L.; Nanstad, L.D.; White, R.K.

    1995-06-01

    This document presents the risk-based prioritization methodology developed to evaluate and rank Environmental Restoration (ER) work packages at the five US Department of Energy, Oak Ridge Field Office (DOE-ORO) sites [i.e., Oak Ridge K-25 Site (K-25), Portsmouth Gaseous Diffusion Plant (PORTS), Paducah Gaseous Diffusion Plant (PGDP), Oak Ridge National Laboratory (ORNL), and the Oak Ridge Y-12 Plant (Y-12)], the ER Off-site Program, and Central ER. This prioritization methodology was developed to support the increased rigor and formality of work planning in the overall conduct of operations within the DOE-ORO ER Program. Prioritization is conducted as an integral component of the fiscal ER funding cycle to establish program budget priorities. The purpose of the ER risk-based prioritization methodology is to provide ER management with the tools and processes needed to evaluate, compare, prioritize, and justify fiscal budget decisions for a diverse set of remedial action, decontamination and decommissioning, and waste management activities. The methodology provides the ER Program with a framework for (1) organizing information about identified DOE-ORO environmental problems, (2) generating qualitative assessments of the long- and short-term risks posed by DOE-ORO environmental problems, and (3) evaluating the benefits associated with candidate work packages designed to reduce those risks. Prioritization is conducted to rank ER work packages on the basis of the overall value (e.g., risk reduction, stakeholder confidence) each package provides to the ER Program. Application of the methodology yields individual work package ''scores'' and rankings that are used to develop fiscal budget requests. This document presents the technical basis for the decision support tools and process

  17. Fully Depleted Charge-Coupled Devices

    Energy Technology Data Exchange (ETDEWEB)

    Holland, Stephen E.

    2006-05-15

We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200-300 um, with corresponding high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications.

  18. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  19. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays.

    Directory of Open Access Journals (Sweden)

    Leila M Naeni

In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays.
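The abstract above combines the Jensen-Shannon distance, proximity graphs, and modularity maximization. The sketch below illustrates that pipeline on toy word-frequency profiles, substituting NetworkX's greedy modularity routine for the iMA-Net memetic algorithm; the data and the edge threshold are invented.

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon
from networkx.algorithms.community import greedy_modularity_communities

# Toy word-frequency profiles (rows = plays, columns = word counts); real input
# would be the 55,114-word frequency table described in the abstract.
counts = np.array([
    [10, 2, 0, 5], [9, 3, 1, 4], [1, 8, 7, 0],
    [0, 9, 8, 1], [5, 5, 5, 5], [4, 6, 5, 4],
], dtype=float)
profiles = counts / counts.sum(axis=1, keepdims=True)

# Proximity graph: connect pairs whose Jensen-Shannon distance is small.
G = nx.Graph()
G.add_nodes_from(range(len(profiles)))
threshold = 0.25
for i in range(len(profiles)):
    for j in range(i + 1, len(profiles)):
        d = jensenshannon(profiles[i], profiles[j])
        if d < threshold:
            G.add_edge(i, j, weight=1.0 - d)

# Community detection by modularity maximization (greedy stand-in for iMA-Net).
communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```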

  20. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab

  1. Motivating Students for Project-based Learning for Application of Research Methodology Skills.

    Science.gov (United States)

    Tiwari, Ranjana; Arya, Raj Kumar; Bansal, Manoj

    2017-12-01

Project-based learning (PBL) is motivational for students learning research methodology skills. It is a way to engage them and give them ownership over their own learning. The aim of this study is to use PBL for the application of research methodology skills for better learning by encouraging an all-inclusive approach to teaching and learning rather than an individualized, tailored approach. The present study was carried out with MBBS 6th- and 7th-semester students of community medicine. Students and faculty were sensitized about PBL and the components of research methodology skills. They worked in small groups. The students were asked to fill in a student feedback questionnaire, and the faculty were asked to fill in a faculty feedback questionnaire; both questionnaires were assessed on a 5-point Likert scale. After the projects were submitted, document analysis was done. A total of 99 students of the 6th and 7th semesters participated in PBL. About 90.91% of students agreed that PBL should be continued in subsequent batches; 73.74% felt satisfied and motivated with PBL, whereas 76.77% felt that they would be able to use research methodology in the near future. PBL requires considerable knowledge, effort, persistence, and self-regulation on the part of the students. They need to devise plans, gather information, and evaluate both their findings and their approach. The facilitator plays a critical role in helping students in the process by shaping opportunities for learning, guiding students' thinking, and helping them construct new understanding.

  2. Contextual System of Symbol Structural Recognition based on an Object-Process Methodology

    OpenAIRE

    Delalandre, Mathieu

    2005-01-01

We present in this paper a symbol recognition system for graphic documents. It is based on a contextual approach to structural symbol recognition exploiting an Object-Process Methodology. It uses a processing library composed of structural recognition processings and contextual evaluation processings. These processings allow our system to deal with the multi-representation of symbols. The different processings are controlled, in an automatic way, by an inference engine during the r...

  3. Learning Theory Bases of Communicative Methodology and the Notional/Functional Syllabus

    OpenAIRE

    Jacqueline D., Beebe

    1992-01-01

    This paper examines the learning theories that underlie the philosophy and practices known as communicative language teaching methodology. These theories are identified first as a reaction against the behavioristic learning theory of audiolingualism. Approaches to syllabus design based on both the "weak" version of communicative language teaching-learning to use the second language-and the "strong" version-using the second language to learn it-are examined. The application of cognitive theory...

  4. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

The article presents a new methodological approach to target-oriented forecasting of company cash flows based on analysis of the company's financial position. The approach is intended to be universal and presumes the application of the following techniques developed by the author: a financial ratio values correction technique and a correcting cash flows technique. The financial ratio values correction technique is used to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  5. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of an intuitionistic fuzzy... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints...

  6. Failure to Replicate Depletion of Self-Control

    Science.gov (United States)

    Xu, Xiaomeng; Demos, Kathryn E.; Leahey, Tricia M.; Hart, Chantelle N.; Trautvetter, Jennifer; Coward, Pamela; Middleton, Kathryn R.; Wing, Rena R.

    2014-01-01

    The limited resource or strength model of self-control posits that the use of self-regulatory resources leads to depletion and poorer performance on subsequent self-control tasks. We conducted four studies (two with community samples, two with young adult samples) utilizing a frequently used depletion procedure (crossing out letters protocol) and the two most frequently used dependent measures of self-control (handgrip perseverance and modified Stroop). In each study, participants completed a baseline self-control measure, a depletion or control task (randomized), and then the same measure of self-control a second time. There was no evidence for significant depletion effects in any of these four studies. The null results obtained in four attempts to replicate using strong methodological approaches may indicate that depletion has more limited effects than implied by prior publications. We encourage further efforts to replicate depletion (particularly among community samples) with full disclosure of positive and negative results. PMID:25333564

  7. Failure to replicate depletion of self-control.

    Directory of Open Access Journals (Sweden)

    Xiaomeng Xu

The limited resource or strength model of self-control posits that the use of self-regulatory resources leads to depletion and poorer performance on subsequent self-control tasks. We conducted four studies (two with community samples, two with young adult samples) utilizing a frequently used depletion procedure (crossing out letters protocol) and the two most frequently used dependent measures of self-control (handgrip perseverance and modified Stroop). In each study, participants completed a baseline self-control measure, a depletion or control task (randomized), and then the same measure of self-control a second time. There was no evidence for significant depletion effects in any of these four studies. The null results obtained in four attempts to replicate using strong methodological approaches may indicate that depletion has more limited effects than implied by prior publications. We encourage further efforts to replicate depletion (particularly among community samples) with full disclosure of positive and negative results.

  8. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    Science.gov (United States)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus suggests more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring have been analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to include more dimension reduction methods. In this work, the incorporation of Laplacian Eigen-map and Principal Component Analysis (PCA) is explored, and the latter algorithm, named PCA-Dev, is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated in this work. Improved results are reported by benchmarking with other machine learning algorithms.
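The record above introduces deviation-based health assessment built on dimension-reduction methods such as PCA (PCA-Dev). One common realization of such a deviation score, shown here only as an assumed sketch, is the reconstruction error of new samples with respect to a PCA model fitted on healthy baseline data; the data and the alarm threshold are made up.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Hypothetical feature matrix from healthy-machine monitoring windows.
baseline = rng.standard_normal((300, 8))
pca = PCA(n_components=3).fit(baseline)

def deviation(samples, pca_model):
    """Squared reconstruction error of each sample w.r.t. the baseline PCA model."""
    reconstructed = pca_model.inverse_transform(pca_model.transform(samples))
    return np.sum((samples - reconstructed) ** 2, axis=1)

# New monitoring data: the second half has an injected shift standing in for degradation.
new_data = rng.standard_normal((20, 8))
new_data[10:] += 3.0
dev = deviation(new_data, pca)
threshold = np.percentile(deviation(baseline, pca), 99)   # assumed alarm limit
print(dev > threshold)
```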

  9. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation was significantly enhanced.

  10. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Science.gov (United States)

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion-based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
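The abstract above uses the energies of EEMD intrinsic mode functions as the fault features feeding the Bayesian network. The snippet below shows only that feature step on synthetic IMFs; the decomposition itself is assumed to come from an EEMD implementation (for example the PyEMD package, indicated in a comment), and the Bayesian-network layers are not reproduced.

```python
import numpy as np

def imf_energy_features(imfs):
    """Normalized energy of each intrinsic mode function (IMF).
    imfs: array of shape (n_imfs, n_samples), e.g. from an EEMD decomposition."""
    energies = np.sum(imfs ** 2, axis=1)
    return energies / energies.sum()

# Assumed usage with the PyEMD package (not part of this abstract):
#   from PyEMD import EEMD
#   imfs = EEMD().eemd(vibration_signal)
# Toy stand-in: three synthetic "IMFs" of a 1 kHz-sampled vibration window.
t = np.linspace(0, 1, 1000, endpoint=False)
imfs = np.vstack([np.sin(2*np.pi*50*t), 0.5*np.sin(2*np.pi*120*t), 0.1*t])
features = imf_energy_features(imfs)
print(features)   # evidence values for the fault-feature layer
```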

  11. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
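The methodology above layers consensus-based PSO and Trust-Tech stages that are not reproduced here. For orientation only, the sketch below implements a plain global-best PSO on the Rastrigin benchmark, i.e. the kind of particle update such a first stage builds on; the coefficients are generic textbook values, not those of the paper.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal global-best particle swarm optimizer (not the consensus/Trust-Tech variant)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    w, c1, c2 = 0.72, 1.5, 1.5                  # inertia and acceleration coefficients
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Rastrigin function, a common multimodal benchmark.
rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
print(pso(rastrigin, dim=5))
```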

  12. Development of a new nuclide generation and depletion code using a topological solver based on graph theory

    International Nuclear Information System (INIS)

    Kasselmann, S.; Scholthaus, S.; Rössel, C.; Allelein, H.-J.

    2014-01-01

The problem of calculating the amounts of a coupled nuclide system varying with time, especially when exposed to a neutron flux, is well known and has been addressed by a number of computer codes. These codes cover a broad spectrum of applications, are based on comprehensive validation work and are therefore justifiably renowned among their users. However, due to their long development history, they lack a modern interface, which impedes a fast and robust internal coupling to other codes applied in the field of nuclear reactor physics. Therefore a project has been initiated to develop a new object-oriented nuclide transmutation code. It comprises an innovative solver based on graph theory, which exploits the topology of nuclide chains. This makes it possible to always deal with the smallest nuclide system relevant to the problem of interest. Highest priority has been given to a generic software interface as well as easy handling by making use of XML files for input and output. In this paper we report on the status of the code development and present first benchmark results, which prove the applicability of the selected approach. (author)
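The abstract above describes a solver that exploits the topology of nuclide chains so that only the smallest relevant nuclide system is handled. A minimal sketch of that idea, with an invented three-nuclide decay chain (the decay constants and inventory are placeholders), uses NetworkX to restrict the system to the nuclides reachable from the initial inventory and a matrix exponential for the resulting Bateman-type equations:

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

# Made-up decay chain A -> B -> C with decay constants in 1/s (illustrative only).
chain = nx.DiGraph()
chain.add_edge("A", "B", lam=1e-3)
chain.add_edge("B", "C", lam=5e-4)
chain.add_node("D")                     # nuclide not reachable from the inventory

inventory = {"A": 1.0e20}               # initial number of atoms
reachable = set(inventory)
for n in inventory:
    reachable |= nx.descendants(chain, n)
nuclides = sorted(reachable)            # topology restricts the system to A, B, C

# Bateman-type matrix dN/dt = M N for the reachable sub-system.
idx = {n: i for i, n in enumerate(nuclides)}
M = np.zeros((len(nuclides), len(nuclides)))
for parent, child, data in chain.edges(data=True):
    if parent in idx and child in idx:
        M[idx[parent], idx[parent]] -= data["lam"]
        M[idx[child], idx[parent]] += data["lam"]

N0 = np.array([inventory.get(n, 0.0) for n in nuclides])
N_t = expm(M * 3600.0) @ N0             # inventory after one hour
print(dict(zip(nuclides, N_t)))
```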

  13. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    Barabas, Roberta de C.; Sabundjian, Gaiane

    2015-01-01

When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  14. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  15. Generic Methodology for Field Calibration of Nacelle-Based Wind Lidars

    Directory of Open Access Journals (Sweden)

    Antoine Borraccino

    2016-11-01

Nacelle-based Doppler wind lidars have shown promising capabilities to assess power performance, detect yaw misalignment or perform feed-forward control. The power curve application requires uncertainty assessment. Traceable measurements and uncertainties of nacelle-based wind lidars can be obtained through a methodology applicable to any type of existing and upcoming nacelle lidar technology. The generic methodology consists of calibrating all the inputs of the wind field reconstruction algorithms of a lidar. These inputs are the line-of-sight velocity and the beam position, provided by the geometry of the scanning trajectory and the lidar inclination. The line-of-sight velocity is calibrated in atmospheric conditions by comparing it to a reference quantity based on classic instrumentation such as cup anemometers and wind vanes. The generic methodology was tested on two commercially developed lidars, one continuous-wave and one pulsed system, and provides consistent calibration results: linear regressions show a difference of ∼0.5% between the lidar-measured and reference line-of-sight velocities. A comprehensive uncertainty procedure propagates the reference uncertainty to the lidar measurements. At a coverage factor of two, the estimated line-of-sight velocity uncertainty ranges from 3.2% at 3 m·s⁻¹ to 1.9% at 16 m·s⁻¹. Most of the line-of-sight velocity uncertainty originates from the reference: the cup anemometer uncertainty accounts for ∼90% of the total uncertainty. The propagation of uncertainties to lidar-reconstructed wind characteristics can use analytical methods in simple cases, which we demonstrate through the example of a two-beam system. The newly developed calibration methodology allows robust evaluation of a nacelle lidar's performance and uncertainties to be established. Calibrated nacelle lidars may consequently be further used for various wind turbine applications with confidence.
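The calibration described above compares lidar line-of-sight velocities against a reference built from cup anemometer and wind vane data projected onto the beam. The sketch below illustrates that comparison with made-up 10-minute statistics and an assumed beam geometry; it is not the authors' procedure or uncertainty model.

```python
import numpy as np

rng = np.random.default_rng(3)

# 10-minute mean reference measurements (made-up values).
cup_speed = rng.uniform(3, 16, 200)                    # m/s, cup anemometer
wind_dir = rng.uniform(250, 290, 200)                  # deg, wind vane
beam_azimuth, beam_tilt = 270.0, 2.0                   # deg, assumed beam geometry

# Reference line-of-sight velocity: horizontal wind projected onto the beam.
v_los_ref = (cup_speed
             * np.cos(np.radians(wind_dir - beam_azimuth))
             * np.cos(np.radians(beam_tilt)))

# Lidar-measured line-of-sight velocity (simulated here with a small gain error and noise).
v_los_lidar = 1.005 * v_los_ref + rng.normal(0, 0.1, v_los_ref.size)

# Calibration: forced-through-origin gain and ordinary linear regression.
gain = np.sum(v_los_lidar * v_los_ref) / np.sum(v_los_ref ** 2)
slope, offset = np.polyfit(v_los_ref, v_los_lidar, 1)
print(f"gain = {gain:.4f}, slope = {slope:.4f}, offset = {offset:.3f} m/s")
```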

  16. SAS6. User's guide. A two-dimensional depletion and criticality analysis code package based on the SCALE-4 system

    International Nuclear Information System (INIS)

    Leege, P.F.A. de; Li, J.M.; Kloosterman, J.L.

    1995-04-01

This users' guide gives a description of the functionality and the required input of the SAS6 code sequence, which can be used to perform burnup and criticality calculations based on functional modules from the SCALE-4 code system and libraries. The input file for the SAS6 control module is very similar to that of the other SAS and CSAS control modules available in the SCALE-4 system. In particular, the geometry input of SAS6 is quite similar to that of SAS2H. However, the functionality of SAS6 is different from that of SAS2H. The geometry of the reactor lattice can be treated in more detail because use is made of the two-dimensional lattice code WIMS-D/IRI (an adapted version of WIMS-D/4) instead of the one-dimensional transport code XSDRNPM-S. Also, the neutron absorption and production rates of nuclides not explicitly specified in the input can be accounted for by six pseudo nuclides. Furthermore, the centre pin can be subdivided into at most 10 zones to improve the burnup calculation of the centre pin and to obtain more accurate k-infinity values for the assembly. Also, the time step specification is more flexible than in the SAS2H sequence. (orig.)

  17. Hidden shift of the ionome of plants exposed to elevated CO₂depletes minerals at the base of human nutrition.

    Science.gov (United States)

    Loladze, Irakli

    2014-05-07

    Mineral malnutrition stemming from undiversified plant-based diets is a top global challenge. In C3 plants (e.g., rice, wheat), elevated concentrations of atmospheric carbon dioxide (eCO2) reduce protein and nitrogen concentrations, and can increase the total non-structural carbohydrates (TNC; mainly starch, sugars). However, contradictory findings have obscured the effect of eCO2 on the ionome-the mineral and trace-element composition-of plants. Consequently, CO2-induced shifts in plant quality have been ignored in the estimation of the impact of global change on humans. This study shows that eCO2 reduces the overall mineral concentrations (-8%, 95% confidence interval: -9.1 to -6.9, p carbon:minerals in C3 plants. The meta-analysis of 7761 observations, including 2264 observations at state of the art FACE centers, covers 130 species/cultivars. The attained statistical power reveals that the shift is systemic and global. Its potential to exacerbate the prevalence of 'hidden hunger' and obesity is discussed.DOI: http://dx.doi.org/10.7554/eLife.02245.001. Copyright © 2014, Loladze.

  18. Cell-based top-down design methodology for RSFQ digital circuits

    Science.gov (United States)

    Yoshikawa, N.; Koshiyama, J.; Motoori, K.; Matsuzaki, F.; Yoda, K.

    2001-08-01

We propose a cell-based top-down design methodology for rapid single flux quantum (RSFQ) digital circuits. Our design methodology employs a binary decision diagram (BDD), which is currently used for the design of CMOS pass-transistor logic circuits. The main features of the BDD RSFQ circuits are the limited primitive number, dual rail nature, non-clocking architecture, and small gate count. We have made a standard BDD RSFQ cell library and prepared a top-down design CAD environment, with which we can perform logic synthesis, logic simulation, circuit simulation and layout view extraction. In order to clarify problems expected in large-scale RSFQ circuit design, we have designed a small RSFQ microprocessor based on a simple architecture using our top-down design methodology. We have estimated its system performance and compared it with that of a CMOS microprocessor with the same architecture. It was found that the RSFQ system is superior in terms of operating speed, though it requires an extremely large chip area.

  19. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    Energy Technology Data Exchange (ETDEWEB)

    Vismari, Lucio Flavio, E-mail: lucio.vismari@usp.b [Safety Analysis Group (GAS), School of Engineering at University of Sao Paulo (Poli-USP), Av. Prof. Luciano Gualberto, Trav.3, n.158, Predio da Engenharia de Eletricidade, Sala C2-32, CEP 05508-900, Sao Paulo (Brazil); Batista Camargo Junior, Joao, E-mail: joaocamargo@usp.b [Safety Analysis Group (GAS), School of Engineering at University of Sao Paulo (Poli-USP), Av. Prof. Luciano Gualberto, Trav.3, n.158, Predio da Engenharia de Eletricidade, Sala C2-32, CEP 05508-900, Sao Paulo (Brazil)

    2011-07-15

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcasting' (ADS-B) based air traffic control system. In conclusion, the proposed methodology made it possible to assess the CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities, and discrete event simulation allowing the estimation of the desired safety metrics.

  20. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    International Nuclear Information System (INIS)

    Vismari, Lucio Flavio; Batista Camargo Junior, Joao

    2011-01-01

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcasting' (ADS-B) based air traffic control system. In conclusion, the proposed methodology made it possible to assess the CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities, and discrete event simulation allowing the estimation of the desired safety metrics.

  1. Effectiveness of the management of price risk methodologies for the corn market based on trading signals

    Directory of Open Access Journals (Sweden)

    W. Rossouw

    2013-03-01

    Corn production is scattered geographically over various continents, but most of it is grown in the United States. As such, the world price of corn futures contracts is largely dominated by North American corn prices as traded on the Chicago Board of Trade. In recent years, this market has been characterised by an increase in price volatility and magnitude of price movement as a result of decreasing stock levels. The development and implementation of an effective and successful derivative price risk management strategy based on the Chicago Board of Trade corn futures contract will therefore be of inestimable value to market stakeholders worldwide. The research focused on the efficient market hypothesis and the possibility of contesting this phenomenon through an application of a derivative price risk management methodology. The methodology is based on a combination of an analysis of market trends and technical oscillators with the objective of generating returns superior to that of a market benchmark. The study found that market participants are currently unable to exploit price movement in a manner which results in returns that contest the notion of efficient markets. The methodology proposed, however, does allow the user to consistently achieve returns superior to that of a predetermined market benchmark. The benchmark price for the purposes of this study was the average price offered by the market over the contract lifetime, and as such, the efficient market hypothesis was successfully contested.
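
    A minimal sketch of the trend-plus-oscillator style of trading signal the abstract describes, assuming hypothetical daily futures prices; the indicator choices and parameters below are illustrative, not those of the study:

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        prices = pd.Series(400 + np.cumsum(rng.normal(0, 2, 250)))   # hypothetical daily prices

        # Trend filter: fast/slow moving-average crossover
        fast = prices.rolling(10).mean()
        slow = prices.rolling(40).mean()

        # Simple oscillator: 14-day RSI
        delta = prices.diff()
        gain = delta.clip(lower=0).rolling(14).mean()
        loss = (-delta.clip(upper=0)).rolling(14).mean()
        rsi = 100 - 100 / (1 + gain / loss)

        # Act only when the trend is up and the oscillator is not oversold
        signal = (fast > slow) & (rsi > 50)
        achieved = prices[signal].mean()           # average price realised on signal days
        benchmark = prices.mean()                  # average price over the contract lifetime
        print(f"signal-day average {achieved:.1f} vs benchmark {benchmark:.1f}")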

  2. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur

    2011-01-01

    At the present pre-commercial phase of the wave energy sector, device developers are called to provide reliable estimates on power performance and production at possible deployment locations. The EU EquiMar project has proposed a novel approach, where the performance assessment is based mainly...... on experimental data deriving from sea trials rather than solely on numerical predictions. The study applies this methodology to evaluate the performance of Wave Dragon at two locations in the North Sea, based on the data acquired during the sea trials of a 1:4.5 scale prototype. Indications about power......

  3. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is highly required. A first principles kinetic model was developed to describe and understand the biological, physical, and chemical...... mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for a methodology-based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge-based guess of parameters was available and an initial parameter estimation of the complete set......
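
    A minimal sketch of the kind of parameter estimation such a study involves, assuming a simple logistic growth curve fitted to hypothetical biomass measurements; the actual model in the paper is a first-principles kinetic model with many more states and parameters:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, x0, mu, xmax):
            # Biomass following logistic growth: x0 inoculum, mu max growth rate, xmax capacity
            return xmax / (1.0 + (xmax / x0 - 1.0) * np.exp(-mu * t))

        t_data = np.array([0, 1, 2, 3, 4, 5, 6, 8, 10])                      # h, hypothetical
        x_data = np.array([0.12, 0.2, 0.35, 0.6, 1.0, 1.6, 2.3, 3.4, 3.9])   # g/L, hypothetical

        p0 = [0.1, 0.5, 4.0]                                 # initial knowledge-based guess
        popt, pcov = curve_fit(logistic, t_data, x_data, p0=p0)
        perr = np.sqrt(np.diag(pcov))                        # asymptotic parameter uncertainties
        for name, val, err in zip(("x0", "mu", "xmax"), popt, perr):
            print(f"{name} = {val:.3f} +/- {err:.3f}")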

  4. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren

      The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400Å) of polystyrene as cushions between a crystalline...... carrier and biomimetic membranes deposited thereupon and exposed to bulk water. While monitoring the sequential build-up of the sandwiched composite structure by continuous neutron reflectivity experiments the formation of an unexpected additional layer was detected (1). Located at the polystyrene surface...... in between the polymer cushion and bulk water, the layer was attributed to water of reduced density and was called "depletion layer".  Impurities or preparative artefacts were excluded as its origin. Later on, the formation of nanobubbles from this vapour-like water phase was initiated by tipping the surface...

  5. Environmental external effects from wind power based on the EU ExternE methodology

    DEFF Research Database (Denmark)

    Ibsen, Liselotte Schleisner; Nielsen, Per Sieverts

    1998-01-01

    The European Commission has launched a major study project, ExternE, to develop a methodology to quantify externalities. A “National Implementation Phase” was started under the Joule II programme with the purpose of implementing the ExternE methodology in all member states. The main objective...... of the Danish part of the project is to implement the framework for externality evaluation, for three different power plants located in Denmark. The paper will focus on the assessment of the impacts of the whole fuel cycles for wind, natural gas and biogas. Priority areas for environmental impact assessment...... are identified, based on results of earlier studies and some identified as being of specific relevance for Denmark. Importance is attached to the quantification of impacts for each of the three fuel cycles and to monetisation of the externalities....

  6. Capital expenditure and depletion

    International Nuclear Information System (INIS)

    Rech, O.; Saniere, A.

    2003-01-01

    In the future, the increase in oil demand will be covered for the most part by non conventional oils, but conventional sources will continue to represent a preponderant share of the world oil supply. Their depletion represents a complex challenge involving technological, economic and political factors. At the same time, there is reason for concern about the decrease in exploration budgets at the major oil companies. (author)

  7. Depleted uranium: A DOE management guide

    International Nuclear Information System (INIS)

    1995-10-01

    The U.S. Department of Energy (DOE) has a management challenge and financial liability in the form of 50,000 cylinders containing 555,000 metric tons of depleted uranium hexafluoride (UF6) that are stored at the gaseous diffusion plants. The annual storage and maintenance cost is approximately $10 million. This report summarizes several studies undertaken by the DOE Office of Technology Development (OTD) to evaluate options for long-term depleted uranium management. Based on studies conducted to date, the most likely use of the depleted uranium is for shielding of spent nuclear fuel (SNF) or vitrified high-level waste (HLW) containers. The alternative to finding a use for the depleted uranium is disposal as a radioactive waste. Estimated disposal costs, utilizing existing technologies, range between $3.8 and $11.3 billion, depending on factors such as applicability of the Resource Conservation and Recovery Act (RCRA) and the location of the disposal site. The cost of recycling the depleted uranium in a concrete-based shielding in SNF/HLW containers, although substantial, is comparable to or less than the cost of disposal. Consequently, the case can be made that if DOE invests in developing depleted uranium shielded containers instead of disposal, a long-term solution to the UF6 problem is attained at comparable or lower cost than disposal as a waste. Two concepts for depleted uranium storage casks were considered in these studies. The first is based on standard fabrication concepts previously developed for depleted uranium metal. The second converts the UF6 to an oxide aggregate that is used in concrete to make dry storage casks.

  8. Fully Depleted Charge-Coupled Devices

    International Nuclear Information System (INIS)

    Holland, Stephen E.

    2006-01-01

    We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200-300 μm, with corresponding high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications.

  9. Ozone-depleting Substances (ODS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This site includes all of the ozone-depleting substances (ODS) recognized by the Montreal Protocol. The data include ozone depletion potentials (ODP), global warming...

  10. System Architecture-based Design Methodology for Monitoring the Ground-based Augmentation System: Category I - Integrity Risk

    OpenAIRE

    Elias, Paulo; Saotome, Osamu

    2012-01-01

    Abstract: This paper describes a method to accomplish the Ground-Based Augmentation System signal-in-space integrity risk monitoring for a ground station specified by ICAO, Annex 10, Vol. 1 and RTCA DO-245A, which is a mandatory requirement to meet the certification aspects for a Ground-Based Augmentation System station. The proposed methodology was based on the Risk Tree Analysis technique, which is an optional way to design and develop an engineering solution named integrity risk monit...

  11. High-resolution three-dimensional imaging of a depleted CMOS sensor using an edge Transient Current Technique based on the Two Photon Absorption process (TPA-eTCT)

    Energy Technology Data Exchange (ETDEWEB)

    García, Marcos Fernández; Sánchez, Javier González; Echeverría, Richard Jaramillo [Instituto de Física de Cantabria (CSIC-UC), Avda. los Castros s/n, E-39005 Santander (Spain); Moll, Michael [CERN, Organisation européenne pour la recherche nucléaire, CH-1211 Genève 23 (Switzerland); Santos, Raúl Montero [SGIker Laser Facility, UPV/EHU, Sarriena, s/n - 48940 Leioa-Bizkaia (Spain); Moya, David [Instituto de Física de Cantabria (CSIC-UC), Avda. los Castros s/n, E-39005 Santander (Spain); Pinto, Rogelio Palomo [Departamento de Ingeniería Electrónica, Escuela Superior de Ingenieros Universidad de Sevilla (Spain); Vila, Iván [Instituto de Física de Cantabria (CSIC-UC), Avda. los Castros s/n, E-39005 Santander (Spain)

    2017-02-11

    For the first time, the deep n-well (DNW) depletion space of a High Voltage CMOS sensor has been characterized using a Transient Current Technique based on the simultaneous absorption of two photons. This novel approach has made it possible to resolve the DNW implant boundaries and therefore to accurately determine the real depleted volume and the effective doping concentration of the substrate. The unprecedented spatial resolution of this new method comes from the fact that measurable free carrier generation in two photon mode only occurs in a micrometric-scale voxel around the focus of the beam. Real three-dimensional spatial resolution is achieved by scanning the beam focus within the sample.

  12. High-resolution three-dimensional imaging of a depleted CMOS sensor using an edge Transient Current Technique based on the Two Photon Absorption process (TPA-eTCT)

    CERN Document Server

    García, Marcos Fernández; Echeverría, Richard Jaramillo; Moll, Michael; Santos, Raúl Montero; Moya, David; Pinto, Rogelio Palomo; Vila, Iván

    2016-01-01

    For the first time, the deep n-well (DNW) depletion space of a High Voltage CMOS sensor has been characterized using a Transient Current Technique based on the simultaneous absorption of two photons. This novel approach has made it possible to resolve the DNW implant boundaries and therefore to accurately determine the real depleted volume and the effective doping concentration of the substrate. The unprecedented spatial resolution of this new method comes from the fact that measurable free carrier generation in two photon mode only occurs in a micrometric-scale voxel around the focus of the beam. Real three-dimensional spatial resolution is achieved by scanning the beam focus within the sample.

  13. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Digital signal processing algorithms are of great importance in many embedded systems. Due to their complexity and the restrictions imposed on implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We will use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform will demonstrate the effectiveness of our approach.

  14. A Duration Prediction Using a Material-Based Progress Management Methodology for Construction Operation Plans

    Directory of Open Access Journals (Sweden)

    Yongho Ko

    2017-04-01

    Precise and accurate prediction models for duration and cost enable contractors to improve their decision making for effective resource management in terms of sustainability in construction. Previous studies have been limited to cost-based estimations, but this study focuses on a material-based progress management method. Cost-based estimations typically used in construction, such as the earned value method, rely on comparing the planned budget with the actual cost. However, accurately planning budgets requires analysis of many factors, such as the financial status of the sectors involved. Furthermore, there is a higher possibility of changes in the budget than in the total amount of material used during construction, which is deduced from the quantity take-off from drawings and specifications. Accordingly, this study proposes a material-based progress management methodology, which was developed using different predictive analysis models (regression, neural network, and auto-regressive moving average) as well as datasets on material and labor, which can be extracted from daily work reports from contractors. A case study on actual datasets was conducted, and the results show that the proposed methodology can be efficiently used for progress management in construction.
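
    A minimal sketch of the regression variant of such a material-based duration prediction, assuming hypothetical cumulative installed-material quantities taken from daily work reports; the neural-network and ARMA variants mentioned in the abstract would follow the same fit-then-extrapolate pattern:

        import numpy as np

        # Hypothetical progress data: elapsed working days vs cumulative material installed (t)
        days = np.array([5, 10, 15, 20, 25, 30, 35, 40])
        installed = np.array([60, 130, 185, 260, 330, 390, 455, 520])

        # Fit a linear production rate and extrapolate to the planned total quantity
        rate, intercept = np.polyfit(days, installed, 1)
        planned_total = 1200.0                        # t, from the quantity take-off (hypothetical)
        predicted_duration = (planned_total - intercept) / rate
        print(f"predicted duration: {predicted_duration:.0f} working days")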

  15. Systematic Analysis of an IEED Unit Based in a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Luis Adrian Zuñiga Aviles

    2010-12-01

    In the field of modeling and simulation of highly complex mechatronic designs, a systematic analysis of an IEDD unit [1] (improvised explosive device disposal) is presented, based on a new methodology for modeling and simulation divided into 6 stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator, MU-NH-WMM, formed by a differential traction base and a manipulator arm with 4 degrees of freedom mounted on a wheeled mobile base. The contribution of this work is therefore a novel methodology based on a practical proposal of the philosophy of mechatronics design, which establishes the suitable kinematics of the coupled wheeled mobile manipulator, where the motion equations and kinematic transformations are the basis of the specific stages used to obtain the dynamics of the coupled system, validating the behavior and the trajectory tracking, in order to achieve the complex tasks of approaching the work area and appropriately handling the explosive device. This work is focused on the first of these tasks, such that the errors in the model can be detected and later confined by the proposed control.

  16. Systematic Analysis of an IEED Unit Based in a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Jesus Carlos Pedraza Ortega

    2011-01-01

    In the field of modeling and simulation of highly complex mechatronic designs, a systematic analysis of an IEDD unit [1] (improvised explosive device disposal) is presented, based on a new methodology for modeling and simulation divided into 6 stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator, MU-NH-WMM, formed by a differential traction base and a manipulator arm with 4 degrees of freedom mounted on a wheeled mobile base. The contribution of this work is therefore a novel methodology based on a practical proposal of the philosophy of mechatronics design, which establishes the suitable kinematics of the coupled wheeled mobile manipulator, where the motion equations and kinematic transformations are the basis of the specific stages used to obtain the dynamics of the coupled system, validating the behavior and the trajectory tracking, in order to achieve the complex tasks of approaching the work area and appropriately handling the explosive device. This work is focused on the first of these tasks, such that the errors in the model can be detected and later confined by the proposed control.

  17. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    The accurate, detailed and 3D neutron transport analysis for Gen-IV reactors is still time-consuming regardless of advanced computational hardware available in developed countries. This paper introduces a new concept in addressing the computational time while preserving the detailed and accurate modeling; a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate the neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The AGENT methodology performance was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel part is then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von-Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor obtain a more than 20-times speedup with its working frequency much lower than the CPU frequency. (authors)

  18. A hybrid system identification methodology for wireless structural health monitoring systems based on dynamic substructuring

    Science.gov (United States)

    Dragos, Kosmas; Smarsly, Kay

    2016-04-01

    System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method on dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.

  19. An integrated science-based methodology to assess potential risks and implications of engineered nanomaterials.

    Science.gov (United States)

    Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash

    2015-11-15

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENM risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal of building a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.

  20. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider the costs of valves, flowmeters and energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. This development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of $25,572/year.
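
    A minimal sketch of the community-detection step of such a sectorization, assuming the pipe network is available as a graph; a random geometric graph stands in for the real network here, and the genetic-algorithm refinement and Monte Carlo economic evaluation described in the abstract are not shown:

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        # Stand-in for a water supply network: nodes are junctions, edges are pipes
        G = nx.random_geometric_graph(120, radius=0.15, seed=1)

        # Modularity-based community detection proposes candidate sectors (DMAs)
        sectors = greedy_modularity_communities(G)
        for i, sector in enumerate(sectors):
            print(f"sector {i}: {len(sector)} junctions")

        # Edges that cross sector boundaries are candidate locations for valves/flowmeters
        node_to_sector = {n: i for i, s in enumerate(sectors) for n in s}
        boundary_edges = [(u, v) for u, v in G.edges() if node_to_sector[u] != node_to_sector[v]]
        print(f"{len(boundary_edges)} candidate boundary pipes")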

  1. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age) who were attending government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. Data were collected via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings. © 2014 APJPH.

  2. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    -by-one the different classes of chemicals, until a formulation is obtained, the stability of which as an emulsion is finally checked with appropriate models. Structured databases, appropriate pure component as well as mixture property models, rule-based selection criteria and CAMD techniques are employed together...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  3. The use of depleted uranium ammunition under contemporary international law: is there a need for a treaty-based ban on DU weapons?

    Science.gov (United States)

    Borrmann, Robin

    2010-01-01

    This article examines whether the use of Depleted Uranium (DU) munitions can be considered illegal under current public international law. The analysis covers the law of arms control and focuses in particular on international humanitarian law. The article argues that DU ammunition cannot be addressed adequately under existing treaty-based weapon bans, such as the Chemical Weapons Convention, due to the fact that DU does not meet the criteria required to trigger the applicability of those treaties. Furthermore, it is argued that continuing uncertainties regarding the effects of DU munitions impede a reliable review of the legality of their use under various principles of international law, including the prohibition on employing indiscriminate weapons; the prohibition on weapons that are intended, or may be expected, to cause widespread, long-term and severe damage to the natural environment; and the prohibition on causing unnecessary suffering or superfluous injury. All of these principles require complete knowledge of the effects of the weapon in question. Nevertheless, the author argues that the same uncertainty places restrictions on the use of DU under the precautionary principle. The paper concludes with an examination of whether or not there is a need for, and if so whether there is a possibility of achieving, a Convention that comprehensively outlaws the use, transfer and stockpiling of DU weapons, as proposed by some non-governmental organisations (NGOs).

  4. A Web-Based Data-Querying Tool Based on Ontology-Driven Methodology and Flowchart-Based Model

    Science.gov (United States)

    Ping, Xiao-Ou; Chung, Yufang; Liang, Ja-Der; Yang, Pei-Ming; Huang, Guan-Tarn; Lai, Feipei

    2013-01-01

    Background Because of the increased adoption rate of electronic medical record (EMR) systems, more health care records have been increasingly accumulating in clinical data repositories. Therefore, querying the data stored in these repositories is crucial for retrieving the knowledge from such large volumes of clinical data. Objective The aim of this study is to develop a Web-based approach for enriching the capabilities of the data-querying system along the three following considerations: (1) the interface design used for query formulation, (2) the representation of query results, and (3) the models used for formulating query criteria. Methods The Guideline Interchange Format version 3.5 (GLIF3.5), an ontology-driven clinical guideline representation language, was used for formulating the query tasks based on the GLIF3.5 flowchart in the Protégé environment. The flowchart-based data-querying model (FBDQM) query execution engine was developed and implemented for executing queries and presenting the results through a visual and graphical interface. To examine a broad variety of patient data, the clinical data generator was implemented to automatically generate the clinical data in the repository, and the generated data, thereby, were employed to evaluate the system. The accuracy and time performance of the system for three medical query tasks relevant to liver cancer were evaluated based on the clinical data generator in the experiments with varying numbers of patients. Results In this study, a prototype system was developed to test the feasibility of applying a methodology for building a query execution engine using FBDQMs by formulating query tasks using the existing GLIF. The FBDQM-based query execution engine was used to successfully retrieve the clinical data based on the query tasks formatted using the GLIF3.5 in the experiments with varying numbers of patients. The accuracy of the three queries (ie, “degree of liver damage,” “degree of liver damage

  5. METHODOLOGICAL BASES OF ECOLOGICAL CULTURE FORMATION OF PUPILS ON THE BASIS OF ECO-DEVELOPMENT IDEAS

    Directory of Open Access Journals (Sweden)

    Natalia F. Vinokurova

    2016-01-01

    Aim. The article describes the methodological bases for forming the ecological culture of students as the aim of innovative training for a sustainable future. The authors take into account international and Russian experience connected with the development of ecological culture as an educational resource for society's adaptation to environmental constraints, risks and crises, and for present-day consolidated actions towards the sustainable development of civilization. Methods. The methodological basis for constructing the model of the formation of pupils' ecological culture is developed from the standpoint of the idea of eco-development (noosphere, co-evolution, sustainable development) and a set of axiological, cultural, personal-activity, co-evolutionary, and ecological approaches. This methodical basis has made it possible to construct the educational level of the formation of pupils' ecological culture, comprising an interconnected unity of target, substantive, procedural, and outcome-appraisal components. Results and scientific novelty. The article presents the results of the authors' many years of research on environmental education for sustainable development within the framework of the Nizhny Novgorod scientific school. A characteristic of the ecological culture of students as the goal of environmental education based on eco-development ideas is given. It is shown that the ecological culture of students directs them to new values and life meanings and to methods of ecologically oriented actions and behavior, changing the attitudes of the consumer society and providing the younger generation with co-evolutionary, spiritual guidance in a postindustrial society. The authors' model of the formation of pupils' ecological culture combines philosophical-methodological, theoretical-methodological and pedagogical levels that ensure the integrity and hierarchy of pedagogical research on the issue. The article discloses a pedagogical assessment

  6. A methodology for the quantification of doctrine and materiel approaches in a capability-based assessment

    Science.gov (United States)

    Tangen, Steven Anthony

    Due to the complexities of modern military operations and the technologies employed on today's military systems, acquisition costs and development times are becoming increasingly large. Meanwhile, the transformation of the global security environment is driving the U.S. military's own transformation. In order to meet the required capabilities of the next generation without buying prohibitively costly new systems, it is necessary for the military to evolve across the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF). However, the methods for analyzing DOTMLPF approaches within the early acquisition phase of a capability-based assessment (CBA) are not as well established as the traditional technology design techniques. This makes it difficult for decision makers to decide if investments should be made in materiel or non-materiel solutions. This research develops an agent-based constructive simulation to quantitatively assess doctrine alongside materiel approaches. Additionally, life-cycle cost techniques are provided to enable a cost-effectiveness trade. These techniques are wrapped together in a decision-making environment that brings crucial information forward so informed and appropriate acquisition choices can be made. The methodology is tested on a future unmanned aerial vehicle design problem. Through the implementation of this quantitative methodology on the proof-of-concept study, it is shown that doctrinal changes including fleet composition, asset allocation, and patrol pattern were capable of dramatic improvements in system effectiveness at a much lower cost than the incorporation of candidate technologies. Additionally, this methodology was able to quantify the precise nature of strong doctrine-doctrine and doctrine-technology interactions which have been observed only qualitatively throughout military history. This dissertation outlines the methodology and demonstrates how potential

  7. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    Science.gov (United States)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between the energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology proposed is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI
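
    A minimal sketch of the Random Forest step used to rank influential building parameters, assuming a hypothetical table of CBECS-like records with mixed continuous and categorical columns; the column names and values are illustrative only:

        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor

        # Hypothetical CBECS-like records for one building type
        data = pd.DataFrame({
            "floor_area":   [5000, 12000, 8000, 20000, 15000, 7000, 9000, 25000],
            "num_workers":  [20, 80, 45, 150, 100, 30, 50, 200],
            "num_pcs":      [18, 90, 40, 160, 110, 25, 55, 210],
            "climate_zone": ["3A", "5A", "4C", "5A", "3A", "4C", "5A", "3A"],
            "eui":          [55.0, 72.0, 60.0, 85.0, 78.0, 52.0, 66.0, 90.0],  # kBtu/sqft/yr
        })

        # One-hot encode the categorical variable so the tree ensemble can use it
        X = pd.get_dummies(data.drop(columns="eui"), columns=["climate_zone"])
        y = data["eui"]

        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        importances = pd.Series(model.feature_importances_, index=X.columns)
        print(importances.sort_values(ascending=False))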

  8. An inexpensive, interdisciplinary, methodology to conduct an impact study of homeless persons on hospital based services.

    Science.gov (United States)

    Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen

    2015-02-01

    Homelessness is a primary concern for community health. Scientific literature on homelessness is wide ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of the homeless on the healthcare system. Such methodological approaches rely on the strengths of a multidisciplinary approach, including providers, both healthcare and homeless services, and applied clinical researchers. This paper is a proof of concept for a methodology which is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries; medical billing systems by hospitals; and the research methods of researchers. Adaptation is independent of geographic region, budget restraints, specific agency skill sets, and many other factors that impact the application of a consistent, methodological, science-based approach to assess and address homelessness. We conducted a secondary data analysis merging homeless service utilization data and hospital case-based data. These data detailed care utilization among homeless persons in a small Appalachian city in the United States. In our sample of 269 persons who received at least one hospital-based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people costing more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention. This is proposed in a future study.

  9. Risk-based prioritization methodology for the classification of groundwater pollution sources.

    Science.gov (United States)

    Pizzol, Lisa; Zabeo, Alex; Critto, Andrea; Giubilato, Elisa; Marcomini, Antonio

    2015-02-15

    Water management is one of the EU environmental priorities and it is one of the most serious challenges that today's major cities are facing. The main European regulations for the protection of water resources are the Water Framework Directive (WFD) and the Groundwater Directive (2006/118/EC), which require the identification, risk-based ranking and management of sources of pollution and the identification of those contamination sources that threaten the achievement of groundwater's good quality status. The aim of this paper is to present a new risk-based prioritization methodology to support the determination of a management strategy for the achievement of the good quality status of groundwater. The proposed methodology encompasses the following steps: 1) hazard analysis, 2) pathway analysis, 3) receptor vulnerability analysis and 4) relative risk estimation. Moreover, by integrating GIS functionalities and Multi Criteria Decision Analysis (MCDA) techniques, it makes it possible to: i) deal with several sources and multiple impacted receptors within the area of concern; ii) identify different receptors' vulnerability levels according to specific groundwater uses; iii) assess the risks posed by all contamination sources in the area; and iv) provide a risk-based ranking of the contamination sources that can threaten the achievement of the groundwater good quality status. The application of the proposed framework to a well-known industrialized area located in the surroundings of Milan (Italy) is illustrated in order to demonstrate the effectiveness of the proposed framework in supporting the identification of intervention priorities. Among the 32 sources analyzed in the case study, three sources received the highest relevance score, due to the medium-high relative risks estimated for Chromium (VI) and Perchloroethylene. The case study application showed that the developed methodology is flexible and easy to adapt to different contexts, thanks to the possibility to
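
    A minimal sketch of the relative-risk ranking step, assuming hypothetical hazard, pathway and receptor-vulnerability scores already derived for each contamination source; the actual methodology integrates GIS layers and MCDA weighting that are not reproduced here:

        # Hypothetical normalized scores (0-1) per contamination source
        sources = {
            "industrial_site_A": {"hazard": 0.9, "pathway": 0.7, "vulnerability": 0.8},
            "landfill_B":        {"hazard": 0.6, "pathway": 0.5, "vulnerability": 0.9},
            "fuel_station_C":    {"hazard": 0.4, "pathway": 0.8, "vulnerability": 0.3},
        }

        # Relative risk as the product of the three components (one simple aggregation choice)
        risk = {name: s["hazard"] * s["pathway"] * s["vulnerability"] for name, s in sources.items()}

        for name, r in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: relative risk {r:.2f}")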

  10. In vivo patellofemoral contact mechanics during active extension using a novel dynamic MRI-based methodology.

    Science.gov (United States)

    Borotikar, B S; Sheehan, F T

    2013-12-01

    To establish an in vivo, normative patellofemoral (PF) cartilage contact mechanics database acquired during voluntary muscle control using a novel, dynamic, magnetic resonance (MR) imaging-based, computational methodology and to validate the contact mechanics sensitivity to the known sub-millimeter methodological accuracies. Dynamic cine phase-contrast and multi-plane cine (MPC) images were acquired while female subjects (n = 20, sample of convenience) performed an open kinetic chain (knee flexion-extension) exercise inside a 3-T MR scanner. Static cartilage models were created from high-resolution three-dimensional static MR data and accurately placed in their dynamic pose at each time frame based on the cine-PC (CPC) data. Cartilage contact parameters were calculated based on the surface overlap. Statistical analysis was performed using a paired t-test and a one-sample repeated measures ANOVA. The sensitivity of the contact parameters to the known errors in the PF kinematics was determined. Peak mean PF contact area was 228.7 ± 173.6 mm² at a 40° knee angle. During extension, contact centroid and peak strain locations tracked medially on the femoral and patellar cartilage and were not significantly different from each other. At 25°, 30°, 35°, and 40° of knee extension, contact area was significantly different. Contact area and centroid locations were insensitive to rotational and translational perturbations. This study is a first step towards unfolding the biomechanical pathways to anterior PF pain and osteoarthritis (OA) using dynamic, in vivo, and accurate methodologies. The database provides crucial data for future studies and for validation of, or as an input to, computational models. Published by Elsevier Ltd.

  11. A Proposed Theory Seeded Methodology for Design Based Research into Effective Use of MUVES in Vocational Education Contexts

    Science.gov (United States)

    Cochrane, Todd; Davis, Niki; Morrow, Donna

    2013-01-01

    A methodology for design based research (DBR) into effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR with two theories selected to inform the methodology. Legitimate peripheral participation LPP (Lave & Wenger, 1991) provides a filter when…

  12. New Fabrication Methodologies for the Development of Low Power Gas Sensors Based on Semiconducting Nanowires

    OpenAIRE

    Samà Monsonís, Jordi

    2016-01-01

    The thesis entitled New Fabrication Methodologies for the Development of Low Power Gas Sensors Based on Semiconducting Nanowires falls within the field of gas sensors for environmental monitoring of air quality, with the objective of detecting the presence of gases harmful to human health. The work carried out is based on the use of resistive gas sensors, that is, sensors in which the adsorption of a gas on the sensor surface gives rise to a change in the conductivity of the sens...

  13. Development of chitosan based edible films: process optimization using response surface methodology.

    Science.gov (United States)

    Singh, Tarun Pal; Chatli, Manish Kumar; Sahoo, Jhari

    2015-05-01

    A three-factor Box-Behnken design of response surface methodology (RSM) was used to optimize the chitosan level (1.5, 2.0, 2.5 %w/v), glycerol level (0.5, 0.75, 1.0 %w/v) and drying temperature (35, 40, 45 °C) for the development of chitosan-based edible films. The optimization was done on the basis of different responses, viz. thickness, moisture, solubility, colour profile (L*, a*, b* value), penetrability, density, transmittance and water vapor transmission rate (WVTR). The linear effect of chitosan was significant (p < ...) ... fashion. Drying temperature also significantly (p < ...) ... industry.
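
    A minimal sketch of generating the coded three-factor Box-Behnken design and mapping it to the factor levels quoted in the abstract; the response measurements and the quadratic model fit that RSM adds on top are not shown:

        import itertools
        import numpy as np

        def box_behnken(k, center_points=3):
            # Edge-midpoint runs: +/-1 for each pair of factors, 0 for the rest
            runs = []
            for i, j in itertools.combinations(range(k), 2):
                for a, b in itertools.product((-1, 1), repeat=2):
                    run = [0] * k
                    run[i], run[j] = a, b
                    runs.append(run)
            runs += [[0] * k] * center_points
            return np.array(runs, dtype=float)

        coded = box_behnken(3)                                  # 12 edge runs + 3 center points
        centers = np.array([2.0, 0.75, 40.0])                   # chitosan %w/v, glycerol %w/v, temp °C
        steps = np.array([0.5, 0.25, 5.0])                      # half-ranges of the quoted levels
        actual = centers + coded * steps
        print(actual)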

  14. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) assessment of property model prediction errors, (iii) effect of outliers and data pre-treatment, (iv) formulation of the parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.) In this study a comprehensive methodology is developed to perform a rigorous...... ) weighted-least-squares regression. 3) Initialization of the estimation by use of linear algebra providing a first guess. 4) Sequential parameter and simultaneous GC parameter estimation by use of 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of parameter...

  15. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on the meta-programming-based generative learning objects (the latter with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLO and smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist a

  16. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    Science.gov (United States)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  17. A study of polar ozone depletion based on sequential assimilation of satellite data from the ENVISAT/MIPAS and Odin/SMR instruments

    Directory of Open Access Journals (Sweden)

    J. D. Rösevall

    2007-01-01

    The objective of this study is to demonstrate how polar ozone depletion can be mapped and quantified by assimilating ozone data from satellites into the wind-driven transport model DIAMOND (Dynamical Isentropic Assimilation Model for OdiN Data). By assimilating a large set of satellite data into a transport model, ozone fields can be built up that are less noisy than the individual satellite ozone profiles. The transported fields can subsequently be compared to later sets of incoming satellite data so that the rates and geographical distribution of ozone depletion can be determined. By tracing the amounts of solar irradiation received by different air parcels in a transport model it is furthermore possible to study the photolytic reactions that destroy ozone. In this study, destruction of ozone that took place in the Antarctic winter of 2003 and in the Arctic winter of 2002/2003 has been examined by assimilating ozone data from the ENVISAT/MIPAS and Odin/SMR satellite instruments. Large-scale depletion of ozone was observed in the Antarctic polar vortex of 2003 when sunlight returned after the polar night. By mid-October, ENVISAT/MIPAS data indicate vortex ozone depletion in the ranges 80–100% and 70–90% on the 425 and 475 K potential temperature levels respectively, while the Odin/SMR data indicate depletion in the ranges 70–90% and 50–70%. The discrepancy between the two instruments has been attributed to systematic errors in the Odin/SMR data. Assimilated fields of ENVISAT/MIPAS data indicate ozone depletion in the range 10–20% on the 475 K potential temperature level (~19 km altitude) in the central regions of the 2002/2003 Arctic polar vortex. Assimilated fields of Odin/SMR data on the other hand indicate ozone depletion in the range 20–30%.
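
    A minimal sketch of the sequential-assimilation idea, assuming a 1D transported ozone field nudged toward sparse satellite observations; the roll-based advection and the gain value below are illustrative stand-ins for the wind-driven transport and error statistics of a real assimilation scheme such as DIAMOND:

        import numpy as np

        rng = np.random.default_rng(2)
        field = 3.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, 90))   # hypothetical ozone mixing ratio (ppmv)

        def advect(f):
            # Stand-in for wind-driven isentropic transport: shift the field by one grid cell
            return np.roll(f, 1)

        def assimilate(f, obs_idx, obs_vals, gain=0.5):
            # Nudge the model field toward observations at the observed grid points
            updated = f.copy()
            updated[obs_idx] += gain * (obs_vals - f[obs_idx])
            return updated

        for step in range(10):
            field = advect(field)
            obs_idx = rng.choice(field.size, size=8, replace=False)       # sparse satellite profiles
            obs_vals = field[obs_idx] * 0.97 + rng.normal(0, 0.02, 8)     # slightly depleted, noisy obs
            field = assimilate(field, obs_idx, obs_vals)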

  18. Location of Bioelectricity Plants in the Madrid Community Based on Triticale Crop: A Multicriteria Methodology

    Directory of Open Access Journals (Sweden)

    L. Romero

    2015-01-01

    This paper presents a work whose objective is, first, to quantify the potential of the triticale biomass existing in each of the agricultural regions of the Madrid Community through a crop simulation model based on regression techniques and multiple correlation. Second, a methodology for defining which area has the best conditions for the installation of electricity plants fuelled by biomass has been described and applied. The study used a methodology based on compromise programming in a discrete multicriteria decision method (MDM) context. To make a ranking, the following criteria were taken into account: biomass potential, electric power infrastructure, road networks, protected spaces, and urban nuclei surfaces. The results indicate that, in the case of the Madrid Community, the Campiña region is the most suitable for setting up plants powered by biomass. A minimum of 17,339.9 tons of triticale will be needed to satisfy the requirements of a 2.2 MW power plant. The minimum range of action for obtaining the necessary biomass in the Campiña region would be 6.6 km around the municipality of Algete, based on Geographic Information Systems. The total biomass which could be made available when considering this range in this region would be 18,430.68 t.
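
    A minimal sketch of the compromise-programming ranking the abstract refers to, assuming a hypothetical normalized criteria matrix for a few candidate regions; the criteria follow the abstract, but the region names, scores and weights are illustrative only:

        import numpy as np

        regions = ["Campina", "Area sur", "Lozoya"]
        # Columns: biomass potential, grid infrastructure, road network, protected-space score, urban score
        # All criteria rescaled to 0-1 with 1 = best
        scores = np.array([
            [0.9, 0.8, 0.7, 0.8, 0.6],
            [0.6, 0.7, 0.8, 0.5, 0.7],
            [0.4, 0.5, 0.6, 0.9, 0.8],
        ])
        weights = np.array([0.35, 0.2, 0.15, 0.15, 0.15])

        ideal = scores.max(axis=0)
        # Compromise programming: weighted L_p distance to the ideal point (p = 2 here)
        distance = np.sqrt(((weights * (ideal - scores)) ** 2).sum(axis=1))
        for name, d in sorted(zip(regions, distance), key=lambda kv: kv[1]):
            print(f"{name}: distance to ideal {d:.3f}")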

  19. [Implementation methodology of practices based on scientific evidence for assistance in natural delivery: a pilot study].

    Science.gov (United States)

    Côrtes, Clodoaldo Tentes; Santos, Rafael Cleison Silva Dos; Caroci, Adriana de Souza; Oliveira, Sheyla Guimarães; Oliveira, Sonia Maria Junqueira Vasconcelos de; Riesco, Maria Luiza Gonzalez

    2015-10-01

    Presenting a methodology for transferring knowledge to improve maternal outcomes in natural delivery based on scientific evidence. An intervention study conducted in the maternity hospital of Itapecerica da Serra, SP, with 50 puerperal women and 102 medical records, from July to November 2014. The PACES tool from the Joanna Briggs Institute, consisting of a pre-clinical audit (phase 1), implementation of best practice (phase 2) and a follow-up clinical audit (phase 3), was used. Data were analyzed by comparing the results of phases 1 and 3 with Fisher's exact test and a significance level of 5%. The vertical position was adopted by the majority of puerperal women, with a statistically significant difference between phases 1 and 3. A significant increase in bathing/showering, walking and massages for pain relief was found from the medical records. No statistical difference was found in other practices and outcomes. Barriers and difficulties in the implementation of evidence-based practices were identified. Variables were refined, techniques and data collection instruments were verified, and an intervention proposal was made. The study found possibilities for implementing a methodology of practices based on scientific evidence for assistance in natural delivery.

  20. Consequences of biome depletion

    International Nuclear Information System (INIS)

    Salvucci, Emiliano

    2013-01-01

    The human microbiome is an integral part of the superorganism together with its host, and they have co-evolved since the early days of the existence of the human species. The modification of the microbiome as a result of changes in the food and social habits of human beings throughout their life history has led to the emergence of many diseases. In contrast with the Darwinian view of nature as selfishness and competition, new holistic approaches are arising. Under these views, the reconstitution of the microbiome comes out as a fundamental therapy for emerging diseases related to biome depletion.

  1. Unravelling emotional viewpoints on a bio-based economy using Q methodology.

    Science.gov (United States)

    Sleenhoff, Susanne; Cuppen, Eefje; Osseweijer, Patricia

    2015-10-01

    A transition to a bio-based economy will affect society and requires collective action from a broad range of stakeholders. This includes the public, who are largely unaware of this transition. For meaningful public engagement, people's emotional viewpoints play an important role. However, what the public's emotions about the transition are, and how they can be taken into account, remain underexposed in the public engagement literature and practice. This article aims to unravel the public's emotional views of the bio-based economy as a starting point for public engagement. Using Q methodology with visual representations of a bio-based economy, we found four emotional viewpoints: (1) compassionate environmentalist, (2) principled optimist, (3) hopeful motorist and (4) cynical environmentalist. These provide insight into the distinct and shared ways through which members of the public connect with the transition. Implications for public engagement are discussed. © The Author(s) 2014.

  2. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    Science.gov (United States)

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source-to-detector distance (SDD) of 100 cm, by transferring the IMRT plan in question with all fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, a Monte Carlo simulation using the PENELOPE code was first performed to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was carried out. In the Monte Carlo simulation, the mean percentage of points approved by the gamma function when comparing the dose distributions acquired at the two SDDs was 99.92% ± 0.14%. In the simple dose distribution tests, the mean percentage of points approved by the gamma function was 99.85% ± 0.26%, and the mean percentage difference in the normalization point doses was −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS numbers: 87.55.Qr, 87.55.km, 87.55.N- PMID
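    The gamma-function comparison quoted above scores each measured point against the calculated distribution using combined dose-difference and distance-to-agreement criteria. The sketch below is a brute-force global gamma passing-rate calculation on synthetic dose planes; the 3%/3 mm criterion and the grids are illustrative assumptions, not the clinical implementation used in the study.

```python
# A minimal sketch of a global gamma passing-rate check (assumed 3%/3 mm).
import numpy as np

def gamma_pass_rate(measured, calculated, pixel_mm=1.0, dd=0.03, dta_mm=3.0):
    """Fraction of measured points with gamma <= 1 (brute-force search)."""
    norm_dose = dd * calculated.max()          # global dose-difference criterion
    ny, nx = measured.shape
    search = int(np.ceil(dta_mm / pixel_mm))   # search radius in pixels
    passed = 0
    for iy in range(ny):
        for ix in range(nx):
            best = np.inf
            for jy in range(max(0, iy - search), min(ny, iy + search + 1)):
                for jx in range(max(0, ix - search), min(nx, ix + search + 1)):
                    dist = pixel_mm * np.hypot(jy - iy, jx - ix)
                    ddiff = calculated[jy, jx] - measured[iy, ix]
                    best = min(best, (dist / dta_mm) ** 2 + (ddiff / norm_dose) ** 2)
            passed += best <= 1.0
    return passed / (nx * ny)

reference = np.ones((20, 20))
film = reference * 1.01          # a uniform 1% difference passes everywhere
print(gamma_pass_rate(film, reference))   # -> 1.0
```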

  3. Metodologia de custeio para a ergonomia Ergonomics-based costing methodology

    Directory of Open Access Journals (Sweden)

    José Roberto Dourado Mafra

    2006-12-01

    Full Text Available This paper discusses an ergonomics-based costing methodology, in which the costing process and the ergonomic work analysis are carried out at the same time. A brief bibliographic review is presented. Two questions are pointed out regarding the economic evaluation of ergonomic interventions: one is the costing problem itself and the other is the evaluation of its results. The costing methodology involves an initial estimate of the costs arising from the lack of ergonomics in the situation under study, followed by the verification of those costs; then, the costs of the corrections or required investments are calculated and the benefits brought by the new conception are assessed. The methodology is applied to a case study of an industrial kitchen, where an ergonomic work analysis was performed. In the studied case, the lack of ergonomics is characterized by economic indicators of company effectiveness. It is concluded that this costing methodology shows how performance problems affect the business economically, expressed in terms of health, quality of life and productivity at work, thereby contributing to the state of practice by accounting for the costs and assessing the feasibility of the solution.

  4. Flow methodology for methanol determination in biodiesel exploiting membrane-based extraction

    International Nuclear Information System (INIS)

    Araujo, Andre R.T.S.; Saraiva, M. Lucia M.F.S.; Lima, Jose L.F.C.; Korn, M. Gracas A.

    2008-01-01

    A methodology based on flow analysis and membrane-based extraction has been applied to the determination of methanol in biodiesel samples. A hydrophilic membrane was used to perform the liquid-liquid extraction in the system, with the organic sample fed to the donor side of the membrane and the methanol transferred to an aqueous acceptor buffer solution. The quantification of methanol was then achieved in aqueous solution by the combined use of immobilised alcohol oxidase (AOD), soluble peroxidase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS). Parameters such as the type of membrane, the groove volume and configuration of the membrane unit, the appropriate organic solvent, the sample injection volume, as well as the immobilised packed AOD reactor, were optimized. Two dynamic analytical working ranges were achieved, up to 0.015% and up to 0.200% (m/m) methanol concentrations, simply by changing the volume of the acceptor aqueous solution. Detection limits of 0.0002% (m/m) and 0.007% (m/m) methanol were estimated, respectively. The decision limit (CCα) and the detection capability (CCβ) were 0.206 and 0.211% (m/m), respectively. The developed methodology showed good precision, with a relative standard deviation (R.S.D.) <5.0% (n = 10). Biodiesel samples from different sources were then directly analyzed without any sample pre-treatment. Statistical evaluation showed good agreement, at a 95% confidence level, between the results obtained with the flow system and those furnished by the gas chromatography reference method. The proposed methodology turns out to be more environmentally friendly and cost-effective than the reference method

  5. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
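    At its core, the risk ranking described above multiplies a probability of failure by a consequence of failure and uses the result to set the next inspection date. The sketch below shows that logic with invented numbers and a simple interval rule; it is not the DNV/API model referred to in the abstract.

```python
# A minimal sketch of risk-based inspection scheduling, with assumed figures.
def risk_and_interval(pof_per_year, cof_usd,
                      risk_target_usd_per_year=50_000.0, max_interval_years=10.0):
    """Return annual risk (expected loss) and a capped inspection interval."""
    risk = pof_per_year * cof_usd                       # expected loss per year
    interval = min(max_interval_years, risk_target_usd_per_year / risk)
    return risk, interval

# Example: a tank floor with a 0.2%/yr failure probability and a $5M consequence.
risk, next_inspection_years = risk_and_interval(pof_per_year=2e-3, cof_usd=5e6)
print(risk, next_inspection_years)
```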

  6. Optimization of cocoa nib roasting based on sensory properties and colour using response surface methodology

    Directory of Open Access Journals (Sweden)

    D.M.H. A.H. Farah

    2012-05-01

    Full Text Available Roasting of cocoa beans is a critical stage for the development of their desirable flavour, aroma and colour. Prior to roasting, cocoa beans may taste astringent, bitter, acidy, musty, unclean, nutty or even chocolate-like, depending on the bean source and its preparation. After roasting, the beans possess a typical intense cocoa flavour. The Maillard, or non-enzymatic browning, reaction is a very important process for the development of cocoa flavour; it occurs primarily during roasting, and it is generally agreed that the formation of the main flavour components, pyrazines, is associated with this reaction involving amino acids and reducing sugars. The effect of cocoa nib roasting conditions on the sensory properties and colour of cocoa beans was investigated in this study. Roasting conditions, with temperature ranging from 110 to 160°C and time ranging from 15 to 40 min, were optimized using response surface methodology based on cocoa sensory characteristics including chocolate aroma, acidity, astringency, burnt taste and overall acceptability. The analyses used a 9-point hedonic scale with twelve trained panelists. The changes in colour due to the roasting conditions were also monitored using a chromameter. The results of this study showed that the sensory quality of cocoa liquor increased with increasing roasting time and temperature, up to 160°C and 40 min, respectively. Based on the response surface methodology, the optimized operating condition for the roaster was a temperature of 127°C and a time of 25 min. The proposed roasting conditions were able to produce superior quality cocoa beans that will be very useful for cocoa manufacturers. Key words: cocoa, cocoa liquor, flavour, aroma, colour, sensory characteristic, response surface methodology.
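    Response surface methodology of the kind used above fits a second-order polynomial in temperature and time to the sensory scores and locates its optimum. The sketch below fits such a surface with ordinary least squares on made-up roasting data; the numbers are illustrative, not the study's measurements.

```python
# A minimal sketch of a second-order response-surface fit (illustrative data).
import numpy as np

T = np.array([110, 110, 135, 135, 160, 160, 127, 135], dtype=float)  # deg C
t = np.array([ 15,  40,  15,  40,  15,  40,  25,  27], dtype=float)  # minutes
y = np.array([5.1, 5.8, 6.4, 6.9, 6.0, 5.5, 7.2, 7.0])               # acceptability

# Model: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# The stationary point of the fitted quadratic approximates the optimum
# roasting condition (reported in the study as roughly 127 deg C and 25 min).
print(coeffs)
```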

  7. A proposal on teaching methodology: cooperative learning by peer tutoring based on the case method

    Science.gov (United States)

    Pozo, Antonio M.; Durbán, Juan J.; Salas, Carlos; del Mar Lázaro, M.

    2014-07-01

    The European Higher Education Area (EHEA) proposes substantial changes in the teaching-learning model, moving from a model based mainly on the activity of teachers to a model in which the true protagonist is the student. This new framework requires that students develop new abilities and acquire specific skills. This also implies that the teacher should incorporate new methodologies in class. In this work, we present a proposal on teaching methodology based on cooperative learning and peer tutoring by case study. A noteworthy aspect of the case-study method is that it presents situations that can occur in real life. Therefore, students can acquire certain skills that will be useful in their future professional practice. An innovative aspect in the teaching methodology that we propose is to form work groups consisting of students from different levels in the same major. In our case, the teaching of four subjects would be involved: one subject of the 4th year, one subject of the 3rd year, and two subjects of the 2nd year of the Degree in Optics and Optometry of the University of Granada, Spain. Each work group would consist of a professor and a student of the 4th year, a professor and a student of the 3rd year, and two professors and two students of the 2nd year. Each work group would have a tutoring process from each professor for the corresponding student, and a 4th-year student providing peer tutoring for the students of the 2nd and 3rd year.

  8. Ozone depleting substances management inventory system

    Directory of Open Access Journals (Sweden)

    Felix Ivan Romero Rodríguez

    2018-02-01

    Full Text Available Context: The care of the ozone layer is an activity that contributes to the planet's environmental stability. For this reason, the Montreal Protocol was created to control the emission of substances that deplete the ozone layer and to reduce their production from an organizational point of view. However, it is also necessary to keep control of the substances that are already circulating and of those present in equipment that cannot yet be replaced given the circumstances of the companies that operate it. Generally, the control mechanisms for classifying the types of substances, the equipment and the companies that own them are kept in physical files, spreadsheets and text documents, which makes it difficult to control and manage the data stored in them. Method: The objective of this research is to computerize the process of controlling substances that deplete the ozone layer. An evaluation and description of the whole process for managing Ozone-Depleting Substances (ODS) and their alternatives is carried out. For the computerization, the agile development methodology SCRUM is used, and the technological solution is built with free and open-source tools and technologies. Result: As a result of the research, a computer tool was developed that automates the process of controlling and managing substances that deplete the ozone layer and their alternatives. Conclusions: The developed computer tool makes it possible to control and manage ozone-depleting substances and the equipment that uses them. It also manages the substances that arise as alternatives to be used for the protection of the ozone layer.

  9. IoT-Based Information System for Healthcare Application: Design Methodology Approach

    Directory of Open Access Journals (Sweden)

    Damian Dziak

    2017-06-01

    Full Text Available Over the last few decades, life expectancy has increased significantly. However, elderly people who live on their own often need assistance due to mobility difficulties, symptoms of dementia or other health problems. In such cases, an autonomous supporting system may be helpful. This paper proposes an Internet of Things (IoT)-based information system for indoor and outdoor use. Since the conducted survey of related works indicated a lack of methodological approaches to the design process, a Design Methodology (DM), which approaches the design target from the perspective of the stakeholders, contracting authorities and potential users, is introduced. The implemented solution applies a three-axial accelerometer and magnetometer, Pedestrian Dead Reckoning (PDR), thresholding and a decision trees algorithm. Such an architecture enables the localization of a monitored person within four room-zones with accuracy; furthermore, it identifies falls and the activities of lying, standing, sitting and walking. Based on the identified activities, the system classifies current activities as normal, suspicious or dangerous, which is used to notify the healthcare staff about possible problems. The real-life scenarios validated the high robustness of the proposed solution. Moreover, the test results satisfied both stakeholders and future users and ensured further cooperation with the project.
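    The activity-recognition step mentioned above (thresholding plus a decision tree over accelerometer features) can be prototyped in a few lines. The sketch below uses invented feature values and labels; the real system also fuses magnetometer data and PDR localization.

```python
# A minimal sketch of decision-tree activity classification (assumed features).
from sklearn.tree import DecisionTreeClassifier

# Features per time window: [mean vertical acceleration (g), signal magnitude area]
X = [[1.00, 0.05],   # standing
     [1.02, 0.40],   # walking
     [0.35, 0.08],   # lying
     [0.85, 0.10],   # sitting
     [2.60, 1.50]]   # fall (impact spike)
y = ["standing", "walking", "lying", "sitting", "fall"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[1.01, 0.35]]))   # -> most likely "walking"
```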

  10. Simultaneous determination of five Alternaria toxins in cereals using QuEChERS-based methodology.

    Science.gov (United States)

    Zhou, Jian; Xu, Jiao-Jiao; Cai, Zeng-Xuan; Huang, Bai-Fen; Jin, Mi-Cong; Ren, Yi-Ping

    2017-11-15

    Two analytical approaches, one using ultra-high performance liquid chromatography coupled with a photo-diode array/fluorescence detector and the other using ultra-high performance liquid chromatography-tandem mass spectrometry, have been proposed for the simultaneous determination of five Alternaria toxins in cereals, both based on a QuEChERS strategy. After QuEChERS extraction and salting out, the collected supernatant was subjected to an extra dispersive liquid-liquid microextraction step prior to quantitative analysis. Response surface methodology based on a central composite design was employed to optimize the micro-extraction conditions. During validation of the photo-diode array/fluorescence detector method, satisfactory analytical characteristics were achieved in terms of selectivity, recovery (72.7%-109.1%), precision (inter-day RSDs <9.6%), sensitivity (limits of quantification ranging from 2.1 μg kg-1 to 120.0 μg kg-1) and linearity (R2 all higher than 0.9984). The mass spectrometry method was employed as a certified method for accuracy. The two methodologies were successfully applied to 71 samples covering three different matrices, and the quantitative results were compared. As the results of a non-parametric test showed, the two established analytical approaches exhibited comparable quantitative accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    Science.gov (United States)

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report, and R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
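    The model structure described above (seasonal wave, flow-related variability, long-term trend and serially correlated errors on the log scale) is easy to mimic in a forward simulation. The sketch below generates synthetic daily concentrations and an annual-maximum metric; all parameter values are illustrative assumptions, not fitted values or the report's R functions.

```python
# A minimal sketch of simulating daily log-concentrations with the stated terms.
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(3 * 365)                                  # three years of record

seasonal = 0.8 * np.sin(2 * np.pi * days / 365.25)         # seasonal wave
trend = -0.0002 * days                                     # slow long-term decline
flow = 0.3 * rng.standard_normal(days.size)                # flow-related variability

err = np.zeros(days.size)                                  # AR(1) serially correlated errors
for i in range(1, days.size):
    err[i] = 0.7 * err[i - 1] + 0.2 * rng.standard_normal()

log_conc = -1.0 + seasonal + trend + flow + err
annual_max = np.exp(log_conc).reshape(3, 365).max(axis=1)  # acute-exposure metric
print(annual_max)
```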

  12. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  13. [Activity-based costing methodology to manage resources in intensive care units].

    Science.gov (United States)

    Alvear V, Sandra; Canteros G, Jorge; Jara M, Juan; Rodríguez C, Patricia

    2013-11-01

    An accurate estimation of resource use by individual patients is crucial in hospital management. To measure the financial costs of health care actions in the intensive care units of two public regional hospitals in Chile. Prospective follow-up of 716 patients admitted to two intensive care units during 2011. The financial costs of health care activities were calculated using the Activity-Based Costing methodology. The main activities recorded were procedures and treatments, monitoring, response to patient needs, patient maintenance and coordination. Activity-based costs, including human resources and assorted indirect costs, correspond to 81 to 88% of costs per disease in one hospital and 69 to 80% in the other. The costs associated with procedures and treatments are the most significant and are approximately $100,000 (Chilean pesos) per day of hospitalization. The second most significant cost corresponds to coordination activities, which fluctuates between $86,000 and $122,000 (Chilean pesos). There are significant differences in resource use between the two hospitals studied. Therefore, cost estimation methodologies should be incorporated in the management of these clinical services.
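    The costing arithmetic behind the figures above is a sum, over activities, of the time each patient consumes multiplied by the unit cost of the resources that drive the activity. The sketch below shows that aggregation with invented unit costs and times, not the study's data.

```python
# A minimal sketch of activity-based cost aggregation (illustrative figures).
activity_unit_cost = {            # Chilean pesos per hour, hypothetical values
    "procedures_treatments": 60_000,
    "monitoring": 25_000,
    "coordination": 40_000,
    "patient_maintenance": 15_000,
}
patient_hours_per_day = {         # hours of each activity consumed by one patient
    "procedures_treatments": 1.7,
    "monitoring": 4.0,
    "coordination": 2.5,
    "patient_maintenance": 3.0,
}

cost_per_day = sum(activity_unit_cost[a] * h for a, h in patient_hours_per_day.items())
print(cost_per_day)
```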

  14. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M.; Iglesias, Josué; Casar, José R.

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym. PMID:23012544

  15. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of seismic waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface structure of Vs30 obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) Igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and

  16. Attitudes toward simulation-based learning in nursing students: an application of Q methodology.

    Science.gov (United States)

    Yeun, Eun Ja; Bang, Ho Yoon; Ryoo, Eon Na; Ha, Eun-Ho

    2014-07-01

    Simulation-based learning (SBL) is a highly advanced educational method that promotes technical/non-technical skills, increases team competency, and increases health care team interaction in a safe health care environment with no potential for harm to the patient. Even though students may experience the same simulation, their reactions are not necessarily uniform. This study aims at identifying the diversely perceived attitudes of undergraduate nursing students toward simulation-based learning. The study used a Q-methodology design, which analyzes the subjectivity of each type of attitude. Data were collected from 22 undergraduate nursing students who had experienced simulation-based learning before going to the clinical setting. The 45 selected Q-statements from each of the 22 participants were classified into the shape of a normal distribution using a 9-point scale. The collected data were analyzed using the pc-QUANL program. The results revealed two distinct groups of student attitudes toward simulation-based learning: 'adventurous immersion' and 'constructive criticism'. The findings revealed that teaching and learning strategies based on these two types of attitudes could beneficially contribute to the customization of simulation-based learning. In nursing education and clinical practice, teaching and learning strategies based on types I and II can be used to refine an alternative learning approach that supports and complements clinical practice. Recommendations have been provided based on the findings. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Theoretical and methodological bases using foreign experience in public administration reform

    Directory of Open Access Journals (Sweden)

    I. О. Lozitska

    2016-03-01

    Full Text Available The article is devoted to the theoretical and methodological foundations of applying foreign experience to public administration reform. The well-established identification of the terms «public administration» and «state administration» in modern management and science is adopted. In the analysis, the process of European integration is treated as a factor influencing public management reforms. It is established that a theoretical basis for introducing foreign experience into the science of public administration has not been developed systematically. A comprehensive comparative analysis and an algorithm for adapting foreign experience to public administration reform are proposed. The article examines various approaches to the principles of reforming public administration on the basis of foreign experience of good governance. The research is based on the use of a set of general scientific and special scientific methods. The article improves conceptual approaches to the organizational principles of implementing foreign experience in the practice of public administration in Ukraine. The expediency of applying a systems approach to the process of reforming public administration within this research direction is substantiated. Approaches to a methodology of structural-functional analysis of the transformation process in public administration are proposed, along with a methodology for analyzing the introduction of foreign experience by examining institutional forms, organizing principles, and the socio-historical origins and trends of public administration development. Areas for further research on reforming the state in line with the concept of sustainable development «Ukraine – 2020» are suggested. Attention is focused on the feasibility of developing the modern paradigm of

  18. Simulating fission product transients via the history-based local-parameter methodology

    International Nuclear Information System (INIS)

    Jenkins, D.A.; Rouben, B.; Salvatore, M.

    1993-01-01

    This paper describes the fission-product-calculation capability of the history-based local-parameter methodology for evaluating lattice properties for use in core-tracking calculations in CANDU reactors. In addition to taking into account the individual past history of each bundle (the flux/power level, fuel temperature, and coolant density and temperature that the bundle has seen during its stay in the core), the latest refinement of the history-based method provides a fission-product-driver capability. It allows the bundle-specific concentrations of the three basic groups of saturating fission products to be calculated in steady state or following a power transient, including long shutdowns. The new capability is illustrated by simulating the startup period following a typical long shutdown, starting from a snapshot in the Point Lepreau operating history. 9 refs., 7 tabs
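    Tracking a saturating fission product through a power transient amounts to integrating simple production/decay balances with the bundle's own flux history. The sketch below does this for the iodine-135/xenon-135 pair with textbook-style constants and an arbitrary startup flux history; it is an illustration of the idea, not the code's actual fission-product-driver model.

```python
# A minimal sketch of a saturating fission product transient (I-135 -> Xe-135),
# with assumed constants and flux levels (not CANDU or Point Lepreau data).
lam_I, lam_Xe = 2.93e-5, 2.11e-5       # decay constants (1/s)
gamma_I, gamma_Xe = 0.0639, 0.00237    # fission yields
sigma_Xe = 2.65e-18                    # Xe-135 absorption cross section (cm^2)
Sigma_f = 0.05                         # macroscopic fission cross section (1/cm)

def step(I, Xe, phi, dt):
    """Advance iodine and xenon number densities by one explicit Euler step."""
    dI = gamma_I * Sigma_f * phi - lam_I * I
    dXe = gamma_Xe * Sigma_f * phi + lam_I * I - (lam_Xe + sigma_Xe * phi) * Xe
    return I + dI * dt, Xe + dXe * dt

I = Xe = 0.0                           # clean core after a long shutdown
for hour in range(72):
    phi = 1e13 if hour >= 12 else 0.0  # flux steps up 12 h into the simulation
    for _ in range(3600):
        I, Xe = step(I, Xe, phi, 1.0)
print(I, Xe)
```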

  19. A Study on Ductility of Prestressed Concrete Pier Based on Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    H. Wang

    2016-12-01

    Full Text Available The ductility of prestressed concrete piers is studied based on response surface methodology. Referring to previous prestressed concrete piers and based on a Box-Behnken design, the ductility of 25 prestressed concrete piers is calculated by a numerical method. The relationship between the longitudinal reinforcement ratio, shear reinforcement ratio, prestressed tendon quantity, concrete compressive strength and the ductility factor is obtained. The influence of the longitudinal reinforcement ratio, the shear reinforcement ratio, the prestressed tendon quantity and the concrete compressive strength on curvature ductility is discussed. Then the ductility regression equation is derived. The results show that the influence of the prestressed tendon quantity on the ductility of prestressed concrete piers is significant. As the prestressed tendon quantity increases, the curvature ductility decreases. As the shear reinforcement ratio and the concrete compressive strength increase, the curvature ductility increases linearly. The influence of the longitudinal reinforcement ratio on the ductility of prestressed concrete piers is insignificant.

  20. Methodology, outcome, safety and in vivo accuracy in traditional frame-based stereoelectroencephalography.

    Science.gov (United States)

    van der Loo, Lars E; Schijns, Olaf E M G; Hoogland, Govert; Colon, Albert J; Wagner, G Louis; Dings, Jim T A; Kubben, Pieter L

    2017-07-05

    Stereoelectroencephalography (SEEG) is an established diagnostic technique for the localization of the epileptogenic zone in drug-resistant epilepsy. In vivo accuracy of SEEG electrode positioning is of paramount importance since higher accuracy may lead to more precise resective surgery, better seizure outcome and reduction of complications. To describe experiences with the SEEG technique in our comprehensive epilepsy center, to illustrate surgical methodology, to evaluate in vivo application accuracy and to consider the diagnostic yield of SEEG implantations. All patients who underwent SEEG implantations between September 2008 and April 2016 were analyzed. Planned electrode trajectories were compared with post-implantation trajectories after fusion of pre- and postoperative imaging. Quantitative analysis of deviation using Euclidean distance and directional errors was performed. Explanatory variables for electrode accuracy were analyzed using linear regression modeling. The surgical methodology, procedure-related complications and diagnostic yield were reported. Seventy-six implantations were performed in 71 patients, and a total of 902 electrodes were implanted. Median entry and target point deviations were 1.54 mm and 2.93 mm. Several factors that predicted entry and target point accuracy were identified. The rate of major complications was 2.6%. SEEG led to surgical therapy of various modalities in 53 patients (69.7%). This study demonstrated that entry and target point localization errors can be predicted by linear regression models, which can aid in identification of high-risk electrode trajectories and further enhancement of accuracy. SEEG is a reliable technique, as demonstrated by the high accuracy of conventional frame-based implantation methodology and the good diagnostic yield.
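    The accuracy metric reported above is simply the Euclidean distance between planned and post-implantation coordinates of the entry and target points. The sketch below computes per-electrode deviations and their median from illustrative coordinates (in millimetres), not the study's imaging data.

```python
# A minimal sketch of the electrode deviation metric (illustrative coordinates).
import numpy as np

planned = np.array([[10.0, 42.5, 31.0],    # planned target points (x, y, z) in mm
                    [-8.2, 55.1, 27.4]])
actual = np.array([[11.1, 43.0, 30.2],     # post-implantation positions in mm
                   [-6.9, 57.6, 28.8]])

deviation = np.linalg.norm(actual - planned, axis=1)   # per-electrode error
print(deviation, np.median(deviation))
```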

  1. Microscopic to macroscopic depletion model development for FORMOSA-P

    International Nuclear Information System (INIS)

    Noh, J.M.; Turinsky, P.J.; Sarsour, H.N.

    1996-01-01

    Microscopic depletion has been gaining popularity with regard to employment in reactor core nodal calculations, mainly attributed to the superiority of microscopic depletion in treating spectral history effects during depletion. Another trend is the employment of loading pattern optimization computer codes in support of reload core design. Use of such optimization codes has significantly reduced design efforts to optimize reload core loading patterns associated with increasingly complicated lattice designs. A microscopic depletion model has been developed for the FORMOSA-P pressurized water reactor (PWR) loading pattern optimization code. This was done for both fidelity improvements and to make FORMOSA-P compatible with microscopic-based nuclear design methods. Needless to say, microscopic depletion requires more computational effort compared with macroscopic depletion. This implies that microscopic depletion may be computationally restrictive if employed during the loading pattern optimization calculation because many loading patterns are examined during the course of an optimization search. Therefore, the microscopic depletion model developed here uses combined models of microscopic and macroscopic depletion. This is done by first performing microscopic depletions for a subset of possible loading patterns from which 'collapsed' macroscopic cross sections are obtained. The collapsed macroscopic cross sections inherently incorporate spectral history effects. Subsequently, the optimization calculations are done using the collapsed macroscopic cross sections. Using this approach allows maintenance of microscopic depletion level accuracy without substantial additional computing resources
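    The collapse from microscopic to macroscopic data implied above rests on the fact that a macroscopic cross section is the number-density-weighted sum of microscopic cross sections, so the spectral history carried by the nuclide densities is inherited by the collapsed constants. The sketch below shows that sum with illustrative values; it is not the FORMOSA-P implementation.

```python
# A minimal sketch of collapsing microscopic data into a macroscopic cross
# section, Sigma = sum_i N_i * sigma_i, with assumed densities and sigmas.
microscopic_barns = {"U235": 45.0, "U238": 0.95, "Pu239": 102.0}     # sigma_a (barns)
number_density = {"U235": 4.0e20, "U238": 2.1e22, "Pu239": 6.0e19}   # atoms/cm^3

BARN = 1.0e-24  # cm^2
sigma_macroscopic = sum(number_density[n] * microscopic_barns[n] * BARN
                        for n in microscopic_barns)                  # 1/cm
print(sigma_macroscopic)
```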

  2. Combining the auxin-inducible degradation system with CRISPR/Cas9-based genome editing for the conditional depletion of endogenous Drosophila melanogaster proteins.

    Science.gov (United States)

    Bence, Melinda; Jankovics, Ferenc; Lukácsovich, Tamás; Erdélyi, Miklós

    2017-04-01

    Inducible protein degradation techniques have considerable advantages over classical genetic approaches, which generate loss-of-function phenotypes at the gene or mRNA level. The plant-derived auxin-inducible degradation system (AID) is a promising technique which enables the degradation of target proteins tagged with the AID motif in nonplant cells. Here, we present a detailed characterization of this method employed during the adult oogenesis of Drosophila. Furthermore, with the help of CRISPR/Cas9-based genome editing, we improve the utility of the AID system in the conditional elimination of endogenously expressed proteins. We demonstrate that the AID system induces efficient and reversible protein depletion of maternally provided proteins both in the ovary and the early embryo. Moreover, the AID system provides a fine spatiotemporal control of protein degradation and allows for the generation of different levels of protein knockdown in a well-regulated manner. These features of the AID system enable the unraveling of the discrete phenotypes of genes with highly complex functions. We utilized this system to generate a conditional loss-of-function allele which allows for the specific degradation of the Vasa protein without affecting its alternative splice variant (solo) and the vasa intronic gene (vig). With the help of this special allele, we demonstrate that dramatic decrease of Vasa protein in the vitellarium does not influence the completion of oogenesis as well as the establishment of proper anteroposterior and dorsoventral polarity in the developing oocyte. Our study suggests that both the localization and the translation of gurken mRNA in the vitellarium is independent from Vasa. © 2017 Federation of European Biochemical Societies.

  3. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    Science.gov (United States)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, environmental impacts, etc. This requirement can be met by a decision support framework based on near-optimal "satisficing" as opposed to purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression-based models to create dynamic interplays of how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as

  4. Riddle of depleted uranium

    International Nuclear Information System (INIS)

    Hussein, A.S.

    2005-01-01

    Depleted Uranium (DU) is the waste product of uranium enrichment from the manufacturing of fuel rods for nuclear reactors in nuclear power plants and nuclear power ships. DU may also result from the reprocessing of spent nuclear reactor fuel. Potentially, DU has both chemical and radiological toxicity, with two important target organs being the kidney and the lungs. DU is made into a metal and, due to its availability, low price, high specific weight, density and melting point, as well as its pyrophoricity, it has a wide range of civilian and military applications. Owing to the use of DU over recent years, reports have appeared in the press on health hazards that are alleged to be due to DU. In this paper, the properties, applications, and potential environmental and health effects of DU are briefly reviewed

  5. A methodology for a quantitative assessment of safety culture in NPPs based on Bayesian networks

    International Nuclear Information System (INIS)

    Kim, Young Gab; Lee, Seung Min; Seong, Poong Hyun

    2017-01-01

    Highlights: • A safety culture framework and a quantitative methodology to assess safety culture were proposed. • The relation among the norm system, the safety management system and workers' awareness was established. • The safety culture probability at NPPs was updated by collecting actual organizational data. • Vulnerable areas and the relationship between safety culture and human error were confirmed. - Abstract: For a long time, safety has been recognized as a top priority in high-reliability industries such as aviation and nuclear power plants (NPPs). Establishing a safety culture requires a number of actions to enhance safety, one of which is changing the safety culture awareness of workers. The concept of safety culture in the nuclear power domain was established in the International Atomic Energy Agency (IAEA) safety series, wherein the importance of employee attitudes for maintaining organizational safety was emphasized. Safety culture assessment is a critical step in the process of enhancing safety culture. In this respect, assessment focuses on measuring the level of safety culture in an organization and improving any weaknesses in the organization. However, many continue to think that the concept of safety culture is abstract and unclear. In addition, the results of safety culture assessments are mostly subjective and qualitative. Given the current situation, this paper suggests a quantitative methodology for safety culture assessments based on a Bayesian network. The proposed safety culture framework for NPPs includes the following: (1) a norm system, (2) a safety management system, (3) workers' safety culture awareness, and (4) worker behavior. The level of safety culture awareness of workers at NPPs was inferred through the proposed methodology. Then, areas of the organization that were vulnerable in terms of safety culture were derived by analyzing observational evidence. We also confirmed that the frequency of events involving human error
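    The Bayesian-network reasoning described above can be pictured, in its simplest form, as updating a prior over the hidden awareness level with observed behavioural evidence. The sketch below does a single Bayes update with invented probabilities; the paper's framework involves a larger network linking the norm system, the safety management system, awareness and behaviour.

```python
# A minimal single-node Bayes update, with assumed (illustrative) probabilities.
prior = {"high": 0.5, "medium": 0.3, "low": 0.2}          # P(awareness level)

# P(observed procedure violation | awareness level)
likelihood = {"high": 0.05, "medium": 0.20, "low": 0.50}

evidence = sum(prior[s] * likelihood[s] for s in prior)
posterior = {s: prior[s] * likelihood[s] / evidence for s in prior}
print(posterior)   # observing a violation shifts mass toward lower awareness
```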

  6. Application of knowledge tools in training, based on problems’ solving: methodology and it support

    Directory of Open Access Journals (Sweden)

    D. V. Kudryavtsev

    2017-01-01

    Full Text Available The development of information accessibility in the 21st century necessitates the production of one's own knowledge in the learning process, and not just the transfer of information. The computer should be used as a universal tool for working with knowledge: studying the world, obtaining information, organizing and structuring one's own knowledge, and presenting it to other people. The aim of this work is to develop a methodology for the use of tools for working with knowledge in problem-based learning, which will be used to form an appropriate information and educational environment. The relevance of the paper is due to the importance of problem-based learning, as well as the emergence of many tools and technologies for obtaining, structuring and transferring knowledge. Traditional education focuses mainly on the transfer of knowledge in the study area, while less attention is paid to the development of general metacognitive abilities, including problem solving. With problem-based learning, knowledge is acquired through interaction with the surrounding world and the development of one's own judgments. This educational methodology has been known for a long time, but our goal is to support the process by using tools for working with knowledge at different stages of problem solving. The paper provides a theoretical overview of previous studies and of the current state of the problem-based learning and knowledge work areas. We have studied knowledge management tools based on several classifications and selected the most suitable ones to support problem-based learning. A mapping of knowledge tools onto the stages of the problem-solving process is proposed. We have considered the use of computer tools and options for creating an informational educational environment for implementing the proposed ideas of problem-based learning and working with knowledge. A more detailed consideration is given to the implementation

  7. Potential-based methodology for active sound control in three dimensional settings.

    Science.gov (United States)

    Lim, H; Utyuzhnikov, S V; Lam, Y W; Kelly, L

    2014-09-01

    This paper extends a potential-based approach to active noise shielding with preservation of wanted sound in three-dimensional settings. The approach, which was described in a previous publication [Lim et al., J. Acoust. Soc. Am. 129(2), 717-725 (2011)], provides several significant advantages over conventional noise control methods. Most significantly, the methodology does not require any information on the characterization of sources, impedance boundary conditions or the surrounding medium, and it automatically differentiates between the wanted and unwanted sound components. The previous publication proved the concept in one-dimensional conditions. In this paper, the approach is studied for more realistic conditions by numerical simulation and experimental validation in three-dimensional cases. The results provide a guideline for the implementation of the active shielding method under practical three-dimensional conditions. Through numerical simulation it is demonstrated that, while leaving the wanted sound unchanged, the developed approach offers selective volumetric noise cancellation within a targeted domain. In addition, the method is implemented in a three-dimensional experiment with a white noise source in a semi-anechoic chamber. The experimental study identifies practical difficulties and limitations in the use of the approach for real applications.

  8. Application of machine learning methodology for pet-based definition of lung cancer

    Science.gov (United States)

    Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.

    2010-01-01

    We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (nsclc) tumours in positron-emission tomography–computed tomography (pet–ct) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a pet–ct and a treatment-planning ct image. The reference gross tumour volume (gtv) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (suv) thresholds that most closely approximated the gtv contour on each slice. A set of uptake distribution-related attributes was calculated for each pet slice. A machine learning algorithm was trained on a subset of the pet slices to cope with slice-to-slice variation in the optimal suv threshold: that is, to predict the most appropriate suv threshold from the calculated attributes for each slice. The algorithm’s performance was evaluated using the remainder of the pet slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference suv thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in nsclc. PMID:20179802
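    The geometric-similarity score quoted above is the Jaccard index between the area enclosed by the predicted SUV threshold and the reference contour. The sketch below computes it on two synthetic binary masks; the arrays are illustrative, not patient data.

```python
# A minimal sketch of the Jaccard index on binary segmentation masks.
import numpy as np

reference = np.zeros((64, 64), dtype=bool)
predicted = np.zeros((64, 64), dtype=bool)
reference[20:40, 20:40] = True          # reference GTV area on one slice
predicted[22:41, 21:40] = True          # area enclosed by the predicted threshold

jaccard = (np.logical_and(reference, predicted).sum()
           / np.logical_or(reference, predicted).sum())
print(jaccard)   # values above ~0.8 indicate close geometric agreement
```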

  9. A Statistical Methodology for Determination of Safety Systems Actuation Setpoints Based on Extreme Value Statistics

    Directory of Open Access Journals (Sweden)

    D. R. Novog

    2008-01-01

    Full Text Available This paper provides a novel and robust methodology for determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as accounting for the variations in operating states that periodically occur. Further it demonstrates that in performing best estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties in setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements as well as operational variations which lead to time changes in the true value of initial conditions must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety systems instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties which has not previously been investigated.
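    One way to picture the setpoint logic described above is to sample the input and measurement uncertainties, find the margin that covers them with high probability, and back the setpoint off the acceptance criterion by that margin. The sketch below does this with a plain Monte Carlo quantile and invented distributions; it is only an illustration, not the paper's extreme-value formulation or its treatment of all fuel channels and instrumentation.

```python
# A minimal sketch of folding assumed uncertainties into a trip-setpoint margin.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

acceptance_criterion = 100.0                     # limiting value (arbitrary units)
model_uncertainty = rng.normal(0.0, 2.0, n)      # simulation/model error samples
instrument_error = rng.normal(0.0, 1.5, n)       # channel measurement error samples

# Margin that bounds the combined error with 95% probability.
required_margin = np.quantile(model_uncertainty + instrument_error, 0.95)
trip_setpoint = acceptance_criterion - required_margin
print(required_margin, trip_setpoint)
```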

  10. Vocalist - an international programme for the validation of constraint based methodology in structural integrity

    International Nuclear Information System (INIS)

    Lidbury, D.; Bass, R.; Gilles, Ph.; Connors, D.; Eisele, U.; Keim, E.; Keinanen, H.; Marie, St.; Nagel, G.; Taylor, N.; Wadier, Y.

    2001-01-01

    The pattern of crack-tip stresses and strains causing plastic flow and fracture in components is different to that in test specimens. This gives rise to the so-called constraint effect. Crack-tip constraint in components is generally lower than in test specimens. Effective toughness is correspondingly higher. The fracture toughness measured on test specimens is thus likely to underestimate that exhibited by cracks in components. A 36-month programme was initiated in October 2000 as part of the Fifth Framework of the European Atomic Energy Community (EURATOM), with the objective of achieving (i) an improved defect assessment methodology for predicting safety margins; (ii) improved lifetime management arguments. The programme VOCALIST (Validation of Constraint Based Methodology in Structural Integrity) is one of a 'cluster' of Fifth Framework projects in the area of Plant Life Management (Nuclear Fission). VOCALIST is also an associated project of NESC (Network for Evaluating Steel Components). The present paper describes the aims and objectives of VOCALIST, its interactions with NESC, and gives details of its various Work Packages. (authors)

  11. A GIS – Based Methodology for Land Suitability Evaluation in Veneto (NE Italy

    Directory of Open Access Journals (Sweden)

    Alba Gallo

    2014-12-01

    Full Text Available For almost ten years, the Soil Science Research Group in Venice has been carrying out studies on the characterization of soils in the Veneto region and their suitability for specific uses. Several areas have been investigated with the aim of selecting the best land use for a sustainable environment. The scenarios taken into consideration range from the Alpine and pre-Alpine region to the alluvial plain. Attention has been focused especially on land suitability for forestry, typical and niche crops, pasture and vineyard. The land evaluation procedure has been applied through a GIS-based methodology. Today, GIS techniques are essential for correct and fast work concerning the interpretation and processing of soil data and their display in map form. By integrating this information with crop and soil requirements by means of "matching tables", it was possible to edit and manage land suitability maps for specific purposes. The applied methodology proved to be a useful and effective tool for sustainable land management.

  12. Computationally based methodology for reengineering the high-level waste planning process at SRS

    International Nuclear Information System (INIS)

    Paul, P.K.; Gregory, M.V.; Wells, M.N.

    1997-01-01

    The Savannah River Site (SRS) has started processing its legacy of 34 million gallons of high-level radioactive waste into its final disposable form. The SRS high-level waste (HLW) complex consists of 51 waste storage tanks, 3 evaporators, 6 waste treatment operations, and 2 waste disposal facilities. It is estimated that processing wastes to clean up all tanks will take 30+ yr of operation. Integrating all the highly interactive facility operations through the entire life cycle in an optimal fashion, while meeting all the budgetary, regulatory, and operational constraints and priorities, is a complex and challenging planning task. The waste complex operating plan for the entire time span is periodically published as an SRS report. A computationally based integrated methodology has been developed that has streamlined the planning process while showing how to run the operations at economically and operationally optimal conditions. The integrated computational model replaced a host of disconnected spreadsheet calculations and the analysts' trial-and-error solutions using various scenario choices. This paper presents the important features of the integrated computational methodology and highlights the parameters that are core components of the planning process

  13. Active teaching-learning methodologies: medical students' views of problem-based learning

    Directory of Open Access Journals (Sweden)

    José Roberto Bittencourt Costa

    Full Text Available The prevailing undergraduate medical training process still favors disconnection and professional distancing from social needs. The Brazilian Ministries of Education and Health, through the National Curriculum Guidelines, the Incentives Program for Changes in the Medical Curriculum (PROMED), and the National Program for Reorientation of Professional Training in Health (PRO-SAÚDE), promoted the stimulus for an effective connection between medical institutions and the Unified National Health System (SUS). In accordance with the new paradigm for medical training, the Centro Universitário Serra dos Órgãos (UNIFESO) established a teaching plan in 2005 using active methodologies, specifically problem-based learning (PBL). Research was conducted through semi-structured interviews with third-year undergraduate students at the UNIFESO Medical School. The results were categorized as proposed by Bardin's thematic analysis, with the purpose of verifying the students' impressions of the new curriculum. Active methodologies proved to be well accepted by students, who defined them as exciting and inclusive of theory and practice in medical education.

  14. Methodology of synchronization among strategy and operation. A standards-based modeling approach

    Directory of Open Access Journals (Sweden)

    VICTOR EDWIN COLLAZOS

    2017-05-01

    Full Text Available Enterprise Architecture (EA) has gained importance in recent years, mainly for its concept of “alignment” between the strategic and operational levels of organizations. Such alignment occurs when Information Technology (IT) is applied correctly and in a timely manner, working in synergy and harmony with strategy and operations so that each achieves its own goals and the organizational needs are satisfied. Both the strategic and operational levels have standards that help model the elements necessary to obtain the desired results. In this sense, BMM and BPMN were selected because both have the support of the OMG and are fairly well known for modelling the strategic level and the operational level, respectively. In addition, i* goal modeling can be used to reduce the gap between these two standards. This proposal may help both with the high-level design of the information system and with the appropriate identification of the business processes that will support it. This paper presents a methodology for aligning strategy and operations based on standards and heuristics. We have made a classification of the elements of the models and, for some specific cases, an extension of the heuristics associated between them. This allows us to propose a methodology that uses the above-mentioned standards and combines mappings, transformations and actions to be considered in the alignment process.

  15. SCIRehab uses practice-based evidence methodology to associate patient and treatment characteristics with outcomes.

    Science.gov (United States)

    Whiteneck, Gale G; Gassaway, Julie

    2013-04-01

    To describe the application of practice-based evidence (PBE) methodology to spinal cord injury (SCI) rehabilitation in the SCIRehab study, and to summarize associations of patient characteristics and treatment interventions to outcomes. Prospective observational study. Six SCI rehabilitation centers. Patients with traumatic SCI (N=1376) admitted for first rehabilitation. Not applicable. FIM and residence at discharge, and FIM, residence, Craig Handicap Assessment and Reporting Technique, work/school status, Patient Health Questionnaire-9, Diener Satisfaction with Life Scale, rehospitalization, and presence of pressure ulcers at 1 year postinjury. Patient demographic and injury characteristics explained significant variation in rehabilitation outcomes, particularly functional outcomes. Regression modeling also identified a large number of significant associations with outcomes when total time in each discipline was modeled and when models were developed for each discipline, examining time spent in the many specific interventions provided by each discipline. The application of PBE methodology in the SCIRehab study provided extensive information about the process of inpatient SCI rehabilitation. While patient demographic and injury characteristics explain substantial variation in rehabilitation outcomes, particularly functional outcomes, significant relations also were found between the type and quantity of treatment interventions delivered by each rehabilitation discipline and a broad range of outcomes. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  16. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work that has been performed to develop an integrated Monte Carlo-Deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, the verification of the ray tracing indexing scheme that was developed to represent the MCNP geometry in the MOC, and the verification of the prototype 2-D MOC flux solver. (author)
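
    As a rough illustration of the characteristic sweep such an MOC flux solver performs, the sketch below propagates a one-group angular flux along a single traced ray through flat-source regions. The cross sections, sources and segment lengths are invented for the example; a real solver handles many rays, angles, energy groups and iterations.

```python
import math

# One-group sweep of a single characteristic (ray) through flat-source regions,
# the basic kernel of an MOC flux solver; all numbers here are illustrative.
def sweep_characteristic(psi_in, segments, xs_total, source):
    """Return the exiting angular flux and per-region track-length-weighted fluxes.

    segments : list of (region_id, length_cm) pairs produced by ray tracing
    xs_total : {region_id: total macroscopic cross section [1/cm]}
    source   : {region_id: flat isotropic source [1/(cm^3 s)]}
    """
    track_flux = {}
    psi = psi_in
    for region, s in segments:
        sig = xs_total[region]
        q = source[region] / sig                  # q / Sigma_t
        att = math.exp(-sig * s)
        psi_out = psi * att + q * (1.0 - att)
        # segment-averaged angular flux, weighted by the track length
        track_flux[region] = track_flux.get(region, 0.0) + (q + (psi - psi_out) / (sig * s)) * s
        psi = psi_out
    return psi, track_flux

# Hypothetical two-region ray: 2 cm of fuel followed by 1 cm of moderator.
exit_flux, contrib = sweep_characteristic(
    psi_in=0.0,
    segments=[("fuel", 2.0), ("mod", 1.0)],
    xs_total={"fuel": 0.60, "mod": 0.30},
    source={"fuel": 1.0, "mod": 0.05},
)
print(exit_flux, contrib)
```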

  17. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject-based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries was extracted using the Collection Evaluation application. The data was aggregated and filtered to assess how the sample’s titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample’s titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a power-law (80/20) distribution, so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories that were found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help link subjective decision making with a scientifically based approach to managing knowledge.
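
    The 80/20 observation can be checked mechanically once title counts per subject category are available; the sketch below does so on synthetic, heavy-tailed counts. The tier thresholds are assumed for illustration and are not those of the study.

```python
import numpy as np

# Check of the 80/20 pattern and tier sizes on synthetic, heavy-tailed title counts
# per subject category; the tier thresholds below are assumed for illustration.
rng = np.random.default_rng(0)
counts = rng.pareto(a=1.2, size=500) + 1            # toy counts per Conspectus category

shares = np.sort(counts)[::-1] / counts.sum()       # category shares, largest first
cum = np.cumsum(shares)
top20 = int(0.2 * len(shares))
print(f"top 20% of categories hold {cum[top20 - 1]:.0%} of the titles")

tiers = {"tier 1 (>=1% each)": int((shares >= 0.01).sum()),
         "tier 2 (0.1%-1%)": int(((shares < 0.01) & (shares >= 0.001)).sum()),
         "tier 3 (<0.1%)": int((shares < 0.001).sum())}
print(tiers)
```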

  18. An overview of case-based and problem-based learning methodologies for dental education.

    Science.gov (United States)

    Nadershahi, Nader A; Bender, Daniel J; Beck, Lynn; Lyon, Cindy; Blaseio, Alexander

    2013-10-01

    Dental education has undergone significant curriculum reform in response to the 1995 Institute of Medicine report Dental Education at the Crossroads and the series of white papers from the American Dental Education Association Commission on Change and Innovation in Dental Education (ADEA CCI) first published in the Journal of Dental Education and subsequently collected in a volume titled Beyond the Crossroads: Change and Innovation in Dental Education. An important element of this reform has been the introduction into academic dentistry of active learning strategies such as problem-based and case-based learning. As an aid to broadening understanding of these approaches in order to support their expansion in dental education, this article reviews the major characteristics of each approach, situates each in adult learning theory, and discusses the advantages of case-based learning in the development of a multidisciplinary, integrated predoctoral dental curriculum.

  19. The new MCNP6 depletion capability

    International Nuclear Information System (INIS)

    Fensin, M. L.; James, M. R.; Hendricks, J. S.; Goorley, J. T.

    2012-01-01

    The first MCNP based in-line Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology. (authors)

  20. The New MCNP6 Depletion Capability

    International Nuclear Information System (INIS)

    Fensin, Michael Lorne; James, Michael R.; Hendricks, John S.; Goorley, John T.

    2012-01-01

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.
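
    The transport/depletion split described in these two records can be illustrated with a toy coupling loop: a fixed one-group flux (standing in for the Monte Carlo transport result) drives a matrix-exponential solution of a two-nuclide Bateman system over each burnup step. The chain, cross sections and flux below are placeholders; the actual MCNP depletion capability couples to dedicated depletion data and far larger nuclide chains.

```python
import numpy as np
from scipy.linalg import expm

# Toy transport/depletion coupling: a fixed one-group flux drives a matrix-
# exponential Bateman solve for a two-nuclide system (U-235 and Xe-135).
# All cross sections, yields and the flux are illustrative placeholders.
def burnup_matrix(flux):
    """Assemble A in dN/dt = A N for one depletion step at a fixed one-group flux."""
    sig_f_u235 = 585e-24      # U-235 fission cross section [cm^2] (~585 b, thermal)
    sig_a_xe135 = 2.6e-18     # Xe-135 absorption cross section [cm^2] (~2.6e6 b)
    lam_xe135 = 2.1e-5        # Xe-135 decay constant [1/s]
    y_xe135 = 0.066           # effective fission yield of Xe-135
    A = np.zeros((2, 2))
    A[0, 0] = -sig_f_u235 * flux                    # U-235 destruction
    A[1, 0] = y_xe135 * sig_f_u235 * flux           # Xe-135 production from fission
    A[1, 1] = -(lam_xe135 + sig_a_xe135 * flux)     # Xe-135 decay + burnout
    return A

N = np.array([1.0e21, 0.0])                         # atom densities [atoms/cm^3]
dt = 24 * 3600.0                                    # one-day depletion step [s]
for step in range(5):
    flux = 3.0e13                                   # would come from the MC transport solve
    N = expm(burnup_matrix(flux) * dt) @ N          # Bateman solve over the step
    print(step + 1, N)
```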

  1. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Directory of Open Access Journals (Sweden)

    Gautam Biswas

    2012-12-01

    Full Text Available This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter approach in conjunction with an empirical state-based degradation model to predict the degradation of capacitor parameters through the life of the capacitor. Electrolytic capacitors are important components of systems that range from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their critical role in the system, they are good candidates for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. This paper proposes an empirical degradation model and discusses experimental results for an accelerated aging test performed on a set of identical capacitors subjected to electrical stress. The data form the basis for developing the Kalman-filter-based remaining life prediction algorithm.
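
    A minimal sketch of the filter-plus-extrapolation idea: a linear Kalman filter tracks normalized capacitance and its degradation rate from noisy accelerated-aging measurements, and remaining useful life is read off by extrapolating to an assumed end-of-life threshold. The model matrices, noise levels and the 20%-loss criterion are illustrative assumptions, not the paper's empirical degradation model.

```python
import numpy as np

# Minimal linear Kalman filter tracking normalized capacitance and its degradation
# rate, then extrapolating to an assumed 20%-loss end-of-life threshold.
dt = 1.0                                   # aging interval [h]
F = np.array([[1.0, -dt], [0.0, 1.0]])     # state: [capacitance, degradation rate]
H = np.array([[1.0, 0.0]])                 # only capacitance is measured
Q = np.diag([1e-6, 1e-8])                  # process noise covariance (assumed)
R = np.array([[1e-4]])                     # measurement noise covariance (assumed)

x, P = np.array([1.0, 0.0]), np.eye(2)     # initial state and covariance

rng = np.random.default_rng(1)
true_rate = 5e-4                           # synthetic "true" loss per hour
meas = 1.0 - true_rate * dt * np.arange(1, 201) + rng.normal(0.0, 5e-3, 200)

for z in meas:
    x, P = F @ x, F @ P @ F.T + Q                          # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)                    # update state
    P = (np.eye(2) - K @ H) @ P                            # update covariance

threshold = 0.8                            # end-of-life criterion (assumed)
rul = (x[0] - threshold) / x[1] if x[1] > 0 else float("inf")
print(f"estimated loss rate {x[1]:.2e}/h, remaining useful life ~{rul:.0f} h")
```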

  2. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan

    2010-01-01

    In this work, a framework for the simultaneous solution of design and control problems is presented. Within this framework, two methodologies are presented, the integrated process design and controller design (IPDC) methodology and the process-group contribution (PGC) methodology. The concepts...... of attainable region (AR), driving force (DF), process-group (PG) and reverse simulation are used within these methodologies. The IPDC methodology is used to find the optimal design-control strategy of a process by locating the maximum point in the AR and DF diagrams for reactor and separator, respectively....... The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  3. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treats the reliability function as an ordinary function and applies existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to g-function, and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential of significantly reducing the number of calls for g-function calculations since it requires only one full reliability analysis in a design iteration. A cam roller system in a typical high pressure fuel injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application
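
    The distinction between working with the g-function directly and working with the reliability function can be seen in a plain Monte Carlo estimate of failure probability from a limit-state function. The simple stress/strength g-function and its distributions below are stand-ins, since the cam-roller g-function is specific to the paper.

```python
import numpy as np
from scipy.stats import norm

# Direct Monte Carlo on a limit-state (g-) function: failure occurs where g < 0.
def g(strength, stress):
    return strength - stress

rng = np.random.default_rng(42)
n = 200_000
strength = rng.normal(500.0, 40.0, n)   # [MPa], assumed distribution
stress = rng.normal(350.0, 50.0, n)     # [MPa], assumed distribution

pf = np.mean(g(strength, stress) < 0.0)
print(f"failure probability ~ {pf:.2e}, reliability index beta ~ {-norm.ppf(pf):.2f}")
```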

  4. A Model-Based Methodology for Integrated Design and Operation of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2015-01-01

    calculation of reactive bubble points. For an energy-efficient design, the driving-force approach (to determine the optimal feed location) for a reactive system has been employed. For both the reactive McCabe-Thiele and driving force methods, vapor-liquid equilibrium data are based on elements. The reactive... bubble point algorithm is used to compute the reactive vapor-liquid equilibrium data set. The operation of the RDC at the highest driving force and other candidate points is compared through open-loop and closed-loop analysis. By application of this methodology it is shown that designing the process at the... maximum driving force results in an energy-efficient and operable design. It is verified that the reactive distillation design option is less sensitive to disturbances in the feed at the highest driving force and has the inherent ability to reject disturbances....

  5. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY.

    Science.gov (United States)

    Somogyi, Endre; Hagar, Amit; Glazier, James A

    2016-12-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models.

  6. Design methodology for a special single winding based bearingless switched reluctance motor

    Directory of Open Access Journals (Sweden)

    Madhurjya Dev Choudhury

    2017-06-01

    Full Text Available Bearingless switched reluctance motors (BSRMs) have both magnetic bearing and conventional motor characteristics, which make them suitable for diverse industrial applications. This study proposes a design methodology for a BSRM in order to calculate the appropriate geometrical dimensions essential for realising a minimum levitation force at every orientation of the rotor. It is based on the stator–rotor overlap angle and helps in reducing the complexities associated with the self-bearing operation of a switched reluctance motor (SRM). Unlike a conventional SRM, the motor under study deploys a special single-set parallel winding scheme for the simultaneous production of torque and radial force. An analytical model incorporating this single-set winding is developed for calculating the torque and the radial force. The proposed bearingless design is verified by developing a two-dimensional finite-element model of a 12/8 SRM in ANSYS Maxwell.

  7. Construction Material-Based Methodology for Military Contingency Base Construction: Case Study of Maiduguri, Nigeria

    Science.gov (United States)

    2016-09-01

    characteristics of what is contained within the region. Cluster sites are also created based on researched information for each material, in order to...hemisphere. Forest certification allows forest owners and managers to obtain recognition for their forest management practices. Certification is a...government certification that will enable them to continue in operation. Lastly, block manufacturing should not be left to people only in it for the

  8. The Toxicity of Depleted Uranium

    OpenAIRE

    Briner, Wayne

    2010-01-01

    Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a c...

  9. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    Science.gov (United States)

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probe the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize the hit rate and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and product-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other targets for the identification of good chemical starting points in collaboration with either structure-based or ligand-based virtual screening.
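
    A stripped-down sketch of the combinatorial core of reaction-based enumeration, pairing reagent lists lazily so very large libraries stay cheap to traverse. A real implementation, as in the paper, would apply reaction transforms with a cheminformatics toolkit and post-filter candidates by 3D similarity (e.g. with SHAFTS); the reagent names below are placeholders.

```python
from itertools import islice, product

# Lazily pair every member of one reagent list with every member of another.
acids = [f"acid_{i:03d}" for i in range(300)]
amines = [f"amine_{i:03d}" for i in range(250)]

def enumerate_library(block_a, block_b):
    """Yield all A x B reagent pairings without materializing the full library."""
    for a, b in product(block_a, block_b):
        yield (a, b)

library = enumerate_library(acids, amines)
print("library size:", len(acids) * len(amines))
print("first products:", list(islice(library, 3)))
```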

  10. Legal, ethical, and methodological considerations in the Internet-based study of child pornography offenders.

    Science.gov (United States)

    Ray, James V; Kimonis, Eva R; Donoghue, Christine

    2010-01-01

    With its ever-growing penetration of remote regions of the world, the Internet provides great opportunity for conducting research. Beyond clear advantages such as increased cost-effectiveness and efficiency in collecting large samples, Internet-based research has proven particularly useful in reaching hidden or marginalized populations who engage in illegal or deviant behaviors. However, this new medium for research raises important and complex legal, ethical, and methodological/technological issues that researchers must address, particularly when studying undetected criminal behaviors. The current paper chronicles various issues that were encountered in the implementation of an active Internet-based pilot research study of child pornography (CP) users. Moreover, this study was undertaken to address a critical gap in the existing research on CP offending, which has to date primarily focused on incarcerated or convicted samples. The Internet provides the optimal medium for studying community populations of CP users, given that it has become the primary market for CP distribution. This paper is designed to serve as a guide for researchers interested in conducting Internet-based research studies on criminal and sexually deviant populations, particularly CP offenders. Several recommendations are offered based on our own experiences in the implementation of this study. Copyright 2009 John Wiley & Sons, Ltd.

  11. A Methodology for Texture Feature-based Quality Assessment in Nucleus Segmentation of Histopathology Image.

    Science.gov (United States)

    Wen, Si; Kurc, Tahsin M; Gao, Yi; Zhao, Tianhao; Saltz, Joel H; Zhu, Wei

    2017-01-01

    Image segmentation pipelines often are sensitive to algorithm input parameters. Algorithm parameters optimized for a set of images do not necessarily produce good-quality segmentation results for other images. Even within an image, some regions may not be well segmented due to a number of factors, including multiple pieces of tissue with distinct characteristics, differences in staining of the tissue, normal versus tumor regions, and tumor heterogeneity. Evaluation of the quality of segmentation results is an important step in image analysis. It is very labor intensive to do quality assessment manually with large image datasets because a whole-slide tissue image may have hundreds of thousands of nuclei. Semi-automatic mechanisms are needed to assist researchers and application developers to detect image regions with bad segmentations efficiently. Our goal is to develop and evaluate a machine-learning-based semi-automated workflow to assess the quality of nucleus segmentation results in a large set of whole-slide tissue images. We propose a quality control methodology in which machine-learning algorithms are trained with image intensity and texture features to produce a classification model. This model is applied to image patches in a whole-slide tissue image to predict the quality of nucleus segmentation in each patch. The training step of our methodology involves the selection and labeling of regions by a pathologist in a set of images to create the training dataset. The image regions are partitioned into patches. A set of intensity and texture features is computed for each patch. A classifier is trained with the features and the labels assigned by the pathologist. At the end of this process, a classification model is generated. The classification step applies the classification model to unlabeled test images. Each test image is partitioned into patches. The classification model is applied to each patch to predict the patch's label. The proposed methodology has been
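
    The train-then-classify workflow described above can be sketched with synthetic patches standing in for pathologist-labeled regions. The particular features (intensity statistics plus a gradient-energy texture proxy) and the random-forest classifier are illustrative choices, not the study's exact feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

def patch_features(patch):
    gy, gx = np.gradient(patch.astype(float))
    return [patch.mean(), patch.std(),
            np.mean(np.hypot(gx, gy)),                 # simple texture proxy
            np.percentile(patch, 10), np.percentile(patch, 90)]

def synthetic_patch(good):
    # "Good" regions: moderate intensity, low variability; "bad": brighter, noisier.
    base = rng.normal(120 if good else 150, 10 if good else 35, size=(64, 64))
    return np.clip(base, 0, 255)

labels = [1, 0] * 200                                  # 1 = segmentation expected to be good
X = np.array([patch_features(synthetic_patch(bool(l))) for l in labels])
y = np.array(labels)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print(classification_report(yte, clf.predict(Xte)))
```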

  12. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
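
    As a small illustration of the test-design techniques mentioned (equivalence class partitioning and boundary value analysis), the sketch below generates boundary-value cases for a single numeric input; the 0..4095 setpoint range is a hypothetical example, not a PFBR requirement value.

```python
# Boundary-value test-case generation for a single numeric input range.
def boundary_value_cases(lo, hi, step=1):
    """Classic boundary values plus a nominal case and two invalid-class cases."""
    nominal = (lo + hi) // 2
    return {
        "valid": [lo, lo + step, nominal, hi - step, hi],
        "invalid": [lo - step, hi + step],   # equivalence classes outside the range
    }

print(boundary_value_cases(0, 4095))
```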

  13. Improving patient care in cardiac surgery using Toyota production system based methodology.

    Science.gov (United States)

    Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J

    2011-02-01

    A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team. Coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily problem-solving to determine root cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 as compared with the regional average. By the systematic use of a real-time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per case of isolated coronary artery bypass grafting was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    International Nuclear Information System (INIS)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo

    2016-01-01

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of both safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), which are digital assets related to safety, security, and emergency preparedness. Critical digital assets are estimated to account for over 60% of all digital assets in a nuclear power plant. It is therefore necessary to prioritize critical digital assets to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, probabilistic risk assessment (PRA) results (event trees, fault trees, and minimal cut sets) are analyzed. According to the results of this analysis, the digital systems related to core damage are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS). These digital assets can be identified as vital digital assets (VDAs). To develop a general methodology for identifying accident-related VDAs among CDAs, (1) a method using the minimal cut set results of the PRA model will be studied, and (2) a method quantifying the results of a digital I&C PRA, performed so that all digital cabinets related to the systems are reflected in the fault trees, will be studied.
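
    A toy illustration of how minimal cut sets from a fault tree could flag accident-relevant digital assets: a brute-force search over component combinations for a small, hypothetical top-event expression. Real PRA models use dedicated solvers and plant-specific trees; the component names below are placeholders.

```python
from itertools import combinations

# Brute-force minimal cut set search for a tiny, hypothetical fault tree.
COMPONENTS = ["ESFAS", "ESF_CCS", "Process_CCS", "Pump_A", "Pump_B"]

def top_event(failed):
    """TOP = ESFAS OR ESF_CCS OR (Pump_A AND Pump_B) OR (Process_CCS AND Pump_A)."""
    f = failed.__contains__
    return f("ESFAS") or f("ESF_CCS") or (f("Pump_A") and f("Pump_B")) \
        or (f("Process_CCS") and f("Pump_A"))

cut_sets = []
for size in range(1, len(COMPONENTS) + 1):
    for combo in combinations(COMPONENTS, size):
        # keep only combinations that fail the top event and are not supersets
        # of an already-found (smaller) cut set
        if top_event(set(combo)) and not any(set(c) <= set(combo) for c in cut_sets):
            cut_sets.append(combo)

print("minimal cut sets:", cut_sets)
# Digital assets appearing in small cut sets would be candidate VDAs.
```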

  15. The activity-based methodology to assess ship emissions - A review.

    Science.gov (United States)

    Nunes, R A O; Alvim-Ferraz, M C M; Martins, F G; Sousa, S I V

    2017-12-01

    Several studies have tried to estimate atmospheric emissions originating from the maritime sector, concluding that it contributes to global anthropogenic emissions through the emission of pollutants that have a strong impact on human health and on climate change. Thus, this paper aimed to review studies published since 2010 that used an activity-based methodology to estimate ship emissions, to provide a summary of the available input data. After exclusions, 26 articles were analysed and the main information was extracted and registered, namely technical information about ships, ship activity and movement information, engines, fuels, load and emission factors. Most of the studies calculating in-port ship emissions concluded that the majority was emitted during hotelling, and most of the authors allocating emissions by ship type concluded that containerships were the main pollutant emitters. To obtain technical information about ships, the combined use of data from the Lloyd's Register of Shipping database with other sources such as port authorities' databases, engine manufacturers and ship-owners seemed the best approach. The use of AIS data has been growing in recent years and seems to be the best method to report the activities and movements of ships. To predict ship power, the Hollenbach (1998) method, which estimates propelling power as a function of instantaneous speed based on total resistance, and the use of load-balancing schemes for multi-engine installations seemed to be the best practices for more accurate ship emission estimations. For the improvement of emission factors, new on-board measurement campaigns or studies should be undertaken. Regardless of the effort that has been made in recent years to obtain more accurate shipping emission inventories, more precise input data (technical information about ships, engines, load and emission factors) should be obtained to improve the methodology and to develop global and universally accepted emission inventories.
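
    The activity-based estimate reduces, per operating mode, to installed power x load factor x emission factor x time. The sketch below applies that to one hypothetical port call; all powers, load factors, hours and the NOx emission factor are illustrative placeholders, not values from the reviewed studies.

```python
# Activity-based NOx estimate for one port call, split by operating mode.
MODES = {                     # (main-engine load factor, aux-engine load factor, hours)
    "cruising":    (0.80, 0.30, 6.0),
    "manoeuvring": (0.20, 0.50, 1.0),
    "hotelling":   (0.00, 0.40, 20.0),
}
P_MAIN, P_AUX = 12_000.0, 2_000.0        # installed power [kW], assumed
EF_NOX = 13.0                            # emission factor [g/kWh], assumed

def nox_emissions(modes):
    total = 0.0
    for mode, (lf_me, lf_ae, hours) in modes.items():
        energy = (P_MAIN * lf_me + P_AUX * lf_ae) * hours      # [kWh]
        total += energy * EF_NOX / 1e6                         # [tonnes]
    return total

print(f"NOx for the port call: {nox_emissions(MODES):.2f} t")
```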

  16. Improving Efficiency Using Time-Driven Activity-Based Costing Methodology.

    Science.gov (United States)

    Tibor, Laura C; Schultz, Stacy R; Menaker, Ronald; Weber, Bradley D; Ness, Jay; Smith, Paula; Young, Phillip M

    2017-03-01

    The aim of this study was to increase efficiency in MR enterography using a time-driven activity-based costing methodology. In February 2015, a multidisciplinary team was formed to identify the personnel, equipment, space, and supply costs of providing outpatient MR enterography. The team mapped the current state, completed observations, performed timings, and calculated costs associated with each element of the process. The team used Pareto charts to understand the highest cost and most time-consuming activities, brainstormed opportunities, and assessed impact. Plan-do-study-act cycles were developed to test the changes, and run charts were used to monitor progress. The process changes consisted of revising the workflow associated with the preparation and administration of glucagon, with completed implementation in November 2015. The time-driven activity-based costing methodology allowed the radiology department to develop a process to more accurately identify the costs of providing MR enterography. The primary process modification was reassigning responsibility for the administration of glucagon from nurses to technologists. After implementation, the improvements demonstrated success by reducing non-value-added steps and cost by 13%, staff time by 16%, and patient process time by 17%. The saved process time was used to augment existing examination time slots to more accurately accommodate the entire enterographic examination. Anecdotal comments were captured to validate improved staff satisfaction within the multidisciplinary team. This process provided a successful outcome to address daily workflow frustrations that could not previously be improved. A multidisciplinary team was necessary to achieve success, in addition to the use of a structured problem-solving approach. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
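
    The TDABC arithmetic behind such an exercise is simple: a capacity cost rate per resource multiplied by the minutes each process step consumes. The figures below are invented to show the mechanics, not the institution's actual costs, capacities or timings.

```python
# Time-driven activity-based costing: cost = capacity cost rate x time consumed.
RESOURCES = {             # annual cost [$] and practical capacity [minutes/year]
    "technologist": (90_000, 96_000),
    "nurse":        (110_000, 96_000),
    "radiologist":  (400_000, 90_000),
}
rate = {r: cost / minutes for r, (cost, minutes) in RESOURCES.items()}   # $/min

process_map = [            # (step, resource, observed minutes) - all hypothetical
    ("patient prep / glucagon", "technologist", 12),
    ("scan acquisition",        "technologist", 45),
    ("contrast supervision",    "nurse",        10),
    ("interpretation",          "radiologist",  20),
]

total = sum(rate[res] * minutes for _, res, minutes in process_map)
for step, res, minutes in process_map:
    print(f"{step:28s} {rate[res] * minutes:7.2f} $")
print(f"{'total cost per exam':28s} {total:7.2f} $")
```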

  17. Groundwater depletion embedded in international food trade

    Science.gov (United States)

    Dalin, Carole; Wada, Yoshihide; Kastner, Thomas; Puma, Michael J.

    2017-03-01

    Recent hydrological modelling and Earth observations have located and quantified alarming rates of groundwater depletion worldwide. This depletion is primarily due to water withdrawals for irrigation, but its connection with the main driver of irrigation, global food consumption, has not yet been explored. Here we show that approximately eleven per cent of non-renewable groundwater use for irrigation is embedded in international food trade, of which two-thirds are exported by Pakistan, the USA and India alone. Our quantification of groundwater depletion embedded in the world’s food trade is based on a combination of global, crop-specific estimates of non-renewable groundwater abstraction and international food trade data. A vast majority of the world’s population lives in countries sourcing nearly all their staple crop imports from partners who deplete groundwater to produce these crops, highlighting risks for global food and water security. Some countries, such as the USA, Mexico, Iran and China, are particularly exposed to these risks because they both produce and import food irrigated from rapidly depleting aquifers. Our results could help to improve the sustainability of global food production and groundwater resource management by identifying priority regions and agricultural products at risk as well as the end consumers of these products.

  18. Spatio-temporal image-based parametric water surface reconstruction: a novel methodology based on refraction

    Science.gov (United States)

    Engelen, L.; Creëlle, S.; Schindfessel, L.; De Mulder, T.

    2018-03-01

    This paper presents a low-cost and easy-to-implement image-based reconstruction technique for laboratory experiments, which results in a temporal description of the water surface topography. The distortion due to refraction of a known pattern, located below the water surface, is used to fit a low parameter surface model that describes the time-dependent and three-dimensional surface variation. Instead of finding the optimal water depth for characteristic points on the surface, the deformation of the entire pattern is compared to its original shape. This avoids the need for feature tracking adopted in similar techniques, which improves the robustness to suboptimal optical conditions and small-scale, high-frequency surface perturbations. Experimental validation, by comparison with water depth measurements using a level gauge and pressure sensor, proves sub-millimetre accuracy for smooth and steady surface shapes. Although such accuracy cannot be achieved in case of highly dynamic surface phenomena, the low-frequency and large-scale free surface oscillations can still be measured with a temporal and spatial resolution mostly limited by the available optical set-up. The technique is initially intended for periodic surface phenomena, but the results presented in this paper indicate that also irregular surface shapes can robustly be reconstructed. Therefore, the presented technique is a promising tool for other research applications that require non-intrusive, low-cost surface measurements while maintaining visual access to the water below the surface. The latter ensures that the suggested surface reconstruction is compatible with simultaneous image-based velocity measurements, enabling a detailed study of the flow.

  19. A grey-based group decision-making methodology for the selection of hydrogen technologies in Life Cycle Sustainability perspective

    DEFF Research Database (Denmark)

    Manzardo, Alessandro; Ren, Jingzheng; Mazzi, Anna

    2012-01-01

    The objective of this research is to develop a grey-based group decision-making methodology for the selection of the best renewable energy technology (including hydrogen) using a life cycle sustainability perspective. The traditional grey relational analysis has been modified to better address... Using the proposed methodology, electrolysis of water technology by hydropower has been considered to be the best technology for hydrogen production according to the decision-making group.
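
    For orientation, the sketch below runs a plain (unmodified) grey relational analysis over three hypothetical hydrogen-production alternatives and three criteria; the authors' group-decision extension and their life-cycle indicator set are not reproduced here, and all figures are illustrative.

```python
import numpy as np

# Plain grey relational analysis: rank alternatives against an ideal reference.
data = np.array([          # rows: technologies, cols: [efficiency, cost, CO2]
    [0.70, 3.0, 1.2],      # reforming route (hypothetical figures)
    [0.60, 5.5, 0.1],      # electrolysis via hydropower
    [0.45, 6.0, 0.3],      # electrolysis via solar PV
])
benefit = np.array([True, False, False])    # higher-is-better flags per criterion

# Normalize to [0, 1] so that 1 is always the preferred direction.
lo, hi = data.min(axis=0), data.max(axis=0)
norm = np.where(benefit, (data - lo) / (hi - lo), (hi - data) / (hi - lo))

delta = np.abs(1.0 - norm)                  # distance to the ideal sequence
rho = 0.5                                   # distinguishing coefficient (common default)
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grade = xi.mean(axis=1)                     # grey relational grade per alternative
print("grey relational grades:", np.round(grade, 3))
```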

  20. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    OpenAIRE

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-01

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses...

  1. Analytical Modeling of Unsteady Aluminum Depletion in Thermal Barrier Coatings

    OpenAIRE

    YEŞİLATA, Bülent

    2014-01-01

    The oxidation behavior of thermal barrier coatings (TBCs) in aircraft turbines is studied. A simple, unsteady and one-dimensional, diffusion model based on aluminum depletion from a bond-coat to form an oxide layer of Al2O3 is introduced. The model is employed for a case study with currently available experimental data. The diffusion coefficient of the depleted aluminum in the alloy, the concentration profiles at different oxidation times, and the thickness of Al-depleted region are...
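
    A minimal numerical counterpart to such a model: explicit finite-difference diffusion of normalized Al concentration with a zero-concentration oxide interface and a zero-flux substrate boundary, from which a depleted-zone thickness is read off. The diffusivity, coating thickness, duration and the 80% threshold are assumed values, not the paper's fitted data.

```python
import numpy as np

# 1D explicit finite-difference sketch of aluminum depletion in a bond coat.
D = 1.0e-16                      # Al diffusivity in the alloy [m^2/s], assumed
L = 50e-6                        # bond-coat thickness [m], assumed
nx = 200
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D             # below the explicit stability limit dx^2 / (2 D)

c = np.ones(nx)                  # normalized Al concentration, initially uniform
seconds = 0.0
for _ in range(10_000):
    c[0] = 0.0                                   # oxide interface consumes Al
    c[-1] = c[-2]                                # zero-flux condition at the substrate
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    seconds += dt

depth = np.argmax(c > 0.8) * dx                  # depth where Al recovers to 80%
print(f"after {seconds / 3600:.0f} h the Al-depleted zone is ~{depth * 1e6:.1f} um thick")
```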

  2. An experimental strategy validated to design cost-effective culture media based on response surface methodology.

    Science.gov (United States)

    Navarrete-Bolaños, J L; Téllez-Martínez, M G; Miranda-López, R; Jiménez-Islas, H

    2017-07-03

    For any fermentation process, the production cost depends on several factors, such as the genetics of the microorganism, the process conditions, and the culture medium composition. In this work, a guideline for the design of cost-efficient culture media using a sequential approach based on response surface methodology is described. The procedure was applied to analyze and optimize a registered-trademark culture medium and a base culture medium obtained as a result of a screening analysis of different culture media used to grow the same strain according to the literature. During the experiments, the procedure quantitatively identified an appropriate array of micronutrients to obtain a significant yield and to find a minimum number of culture medium ingredients without limiting process efficiency. The resulting culture medium showed an efficiency that compares favorably with the registered-trademark medium at a 95% lower cost, and it reduced the number of ingredients in the base culture medium by 60% without limiting process efficiency. These results demonstrate that, aside from satisfying the qualitative requirements, an optimum quantity of each constituent is needed to obtain a cost-effective culture medium. Studying the process variables for the optimized culture medium and scaling up production at the optimal values are desirable next steps.
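
    The core response-surface step, fitting a second-order model to central composite design points in coded units and solving for the stationary point, can be sketched as below for two medium components; the yields are synthetic stand-ins for measured fermentation responses.

```python
import numpy as np

# Second-order response-surface fit over a two-factor central composite design.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],                 # factorial points
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],   # axial points
              [0, 0], [0, 0], [0, 0]], float)                     # center replicates
y = np.array([60, 68, 64, 70, 58, 69, 62, 66, 74, 73, 75], float) # synthetic yields

x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)          # b0, b1, b2, b12, b11, b22

# Stationary point of y = b0 + b1 x1 + b2 x2 + b12 x1 x2 + b11 x1^2 + b22 x2^2
B = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
x_opt = np.linalg.solve(B, -b[1:3])
y_opt = b @ np.array([1, x_opt[0], x_opt[1], x_opt[0] * x_opt[1], x_opt[0]**2, x_opt[1]**2])
print("coded optimum:", np.round(x_opt, 2), " predicted response:", round(float(y_opt), 1))
```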

  3. A core-monitoring based methodology for predictions of graphite weight loss in AGR moderator bricks

    International Nuclear Information System (INIS)

    McNally, K.; Warren, N.; Fahad, M.; Hall, G.; Marsden, B.J.

    2017-01-01

    Highlights: • A statistically-based methodology for estimating graphite density is presented. • Graphite shrinkage is accounted for using a finite element model. • Differences in weight loss forecasts were found when compared to the existing model. - Abstract: Physically based models, resolved using the finite element (FE) method are often used to model changes in dimensions and the associated stress fields of graphite moderator bricks within a reactor. These models require inputs that describe the loading conditions (temperature, fluence and weight loss ‘field variables’), and coded relationships describing the behaviour of graphite under these conditions. The weight loss field variables are calculated using a reactor chemistry/physics code FEAT DIFFUSE. In this work the authors consider an alternative data source of weight loss: that from a longitudinal dataset of density measurements made on small samples trepanned from operating reactors during statutory outages. A nonlinear mixed-effect model is presented for modelling the age and depth-related trends in density. A correction that accounts for irradiation-induced dimensional changes (axial and radial shrinkage) is subsequently applied. The authors compare weight loss forecasts made using FEAT DIFFUSE with those based on an alternative statistical model for a layer four moderator brick for the Hinkley Point B, Reactor 3. The authors compare the two approaches for the weight loss distribution through the brick with a particular focus on the interstitial keyway, and for the average (over the volume of the brick) weight loss.

  4. Competency-based curriculum and active methodology: perceptions of nursing students.

    Science.gov (United States)

    Paranhos, Vania Daniele; Mendes, Maria Manuela Rino

    2010-01-01

    This study identifies the perceptions of undergraduate students at the University of São Paulo at Ribeirão Preto, Brazil, College of Nursing (EERP-USP) concerning the teaching-learning process in two courses: Integrated Seminar: Health-Disease/Care Process in Health Services Policies and Organization, offered to first-year students in 2005 and 2006, and Integrality in Health Care I and II, offered to second-year students in 2006. The courses' proposal was to adopt active methodologies and a competency-based curriculum. Data were collected from written tests submitted to 62 students at the end of the courses, focusing on the tests' pertinence, development of performance, structure and pedagogical dynamics, organization and settings. Thematic analysis indicated that students enjoyed the courses, highlighted the role of the professor/facilitator at points of the pedagogical cycle, and valued the learning recorded in the students' portfolios. Students valued their experience in the primary health care setting, which has been based, since the beginning of the program, on the interlocution between theory and professional practice and on closeness to the principles of the Unified Health System (SUS).

  5. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    Science.gov (United States)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

    This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking diffraction effects into account. Following market trends and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which are now equipped with auto-focus and, in the near term, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose to use Maxwell-Boltzmann based modeling to compute the propagation of light, and to use software with an FDTD-based (Finite Difference Time Domain) engine to solve this propagation. We present in this article the complete methodology of this modeling: on the one hand, incoherent plane waves are propagated to approximate a product-use, diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory and computation time. After presenting the correlation of the model with measurements, we illustrate its use in the optimization of a 1.75μm pixel.
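
    For context, the kernel of any FDTD engine is a leapfrog update of staggered electric and magnetic fields; the one-dimensional, free-space sketch below shows only that structure, whereas the pixel simulations described are three-dimensional, layered and dispersive.

```python
import numpy as np

# Bare-bones 1D FDTD update loop in normalized (free-space) units.
nx, nt = 400, 800
ez = np.zeros(nx)          # electric field
hy = np.zeros(nx - 1)      # magnetic field, staggered half a cell
courant = 0.5              # S = c*dt/dx, below the 1D stability limit of 1

for n in range(nt):
    hy += courant * (ez[1:] - ez[:-1])              # update H from the curl of E
    ez[1:-1] += courant * (hy[1:] - hy[:-1])        # update E from the curl of H
    ez[50] += np.exp(-((n - 60) / 20.0) ** 2)       # soft Gaussian-pulse source

print("peak |Ez| after propagation:", round(float(np.abs(ez).max()), 3))
```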

  6. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology adopts a Kalman filter approach in conjunction with an...

  7. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  8. Depletable resources and the economy

    NARCIS (Netherlands)

    Heijman, W.J.M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts in relation with resource depletion are discussed: steady state,

  9. The Toxicity of Depleted Uranium

    Directory of Open Access Journals (Sweden)

    Wayne Briner

    2010-01-01

    Full Text Available Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low-dose exposure to depleted uranium may not produce a clear and defined set of symptoms. Chronic low-dose, or subacute, exposure to depleted uranium alters the appearance of milestones in developing organisms. Adult animals that were exposed to depleted uranium during development display persistent alterations in behavior, even after cessation of depleted uranium exposure. Adult animals exposed to depleted uranium demonstrate altered behaviors and a variety of alterations to brain chemistry. Despite its reduced level of radioactivity, evidence continues to accumulate that depleted uranium, if ingested, may pose a radiologic hazard. The current state of knowledge concerning DU is discussed.

  10. Best Practices for Mudweight Window Generation and Accuracy Assessment between Seismic Based Pore Pressure Prediction Methodologies for a Near-Salt Field in Mississippi Canyon, Gulf of Mexico

    Science.gov (United States)

    Mannon, Timothy Patrick, Jr.

    Improving well design has always been and always will be the primary goal in drilling operations in the oil and gas industry. Oil and gas plays are continuing to move into increasingly hostile drilling environments, including near- and/or sub-salt proximities. The ability to reduce the risk and uncertainty involved in drilling operations in unconventional geologic settings starts with improving the techniques for mudweight window modeling. To address this issue, an analysis of wellbore stability and well design improvement has been conducted. This study will show a systematic approach to well design by focusing on best practices for mudweight window projection for a field in Mississippi Canyon, Gulf of Mexico. The field includes depleted reservoirs and is in close proximity to salt intrusions. Analysis of offset wells has been conducted in the interest of developing an accurate picture of the subsurface environment by making connections between depth, non-productive time (NPT) events, and the mudweights used. Commonly practiced petrophysical methods of pore pressure, fracture pressure, and shear failure gradient prediction have been applied to key offset wells in order to enhance the well design for two proposed wells. For the first time in the literature, the accuracy of the commonly accepted, seismic interval velocity based methodology and of the relatively new, seismic frequency based methodology for pore pressure prediction is qualitatively and quantitatively compared. Accuracy standards will be based on the agreement of the seismic outputs with pressure data obtained while drilling and with petrophysically based pore pressure outputs for each well. The results will show significantly higher accuracy for the seismic frequency based approach in wells that were in near/sub-salt environments and higher overall accuracy for all of the wells in the study as a whole.

  11. Affinity-based methodologies and ligands for antibody purification: advances and perspectives.

    Science.gov (United States)

    Roque, Ana C A; Silva, Cláudia S O; Taipa, M Angela

    2007-08-10

    Many successful, recent therapies for life-threatening diseases such as cancer and rheumatoid arthritis are based on the recognition between native or genetically engineered antibodies and cell-surface receptors. Although naturally produced by the immune system, the need for antibodies with unique specificities and designed for single application, has encouraged the search for novel antibody purification strategies. The availability of these products to the end-consumer is strictly related to manufacture costs, particularly those attributed to downstream processing. Over the last decades, academia and industry have developed different types of interactions and separation techniques for antibody purification, affinity-based strategies being the most common and efficient methodologies. The affinity ligands utilized range from biological to synthetic designed molecules with enhanced resistance and stability. Despite the successes achieved, the purification "paradigm" still moves interests and efforts in the continuous demand for improved separation performances. This review will focus on recent advances and perspectives in antibody purification by affinity interactions using different techniques, with particular emphasis on affinity chromatography.

  12. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need to use a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform. PMID:27869722

  13. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability have been determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing an attack realization model for the exploitation of a particular vulnerability. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated in the paper. The proposed approach allows a quantitative estimate of information system security to be obtained on the basis of the proposed schemes for the realization of typical attacks for the distinguished classes of vulnerabilities. The methodological approach is used for choosing the variants to be applied for the realization of protection mechanisms in information systems, as well as for the estimation of information safety in operating information systems.

  14. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation of three methodologies, adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly), is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the highest accuracies are achieved by models (5), which use Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
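
    The record above names SVR with an RBF kernel as the best-performing estimator. As a purely illustrative sketch (not the authors' code or data), the following Python snippet fits an RBF-kernel support vector regression to synthetic Tmax and Tmax − Tmin inputs; the data-generating relation and all parameter values are assumptions.

```python
# Minimal sketch (synthetic data, assumed parameters): estimating daily global
# solar radiation from air temperatures with an RBF-kernel support vector
# regression, in the spirit of the SVR-rbf model described above.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for measured data: columns are Tmax and Tmax - Tmin,
# the input combination reported as most accurate (model 5).
n = 365
t_max = rng.uniform(5.0, 40.0, n)
t_range = rng.uniform(2.0, 18.0, n)
X = np.column_stack([t_max, t_range])
# Hypothetical Hargreaves-like relation plus noise, only to create a target.
y = 0.16 * np.sqrt(t_range) * (t_max + 15.0) + rng.normal(0.0, 1.0, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:300], y[:300])

pred = model.predict(X[300:])
rmse = np.sqrt(np.mean((pred - y[300:]) ** 2))
r = np.corrcoef(pred, y[300:])[0, 1]
print(f"RMSE = {rmse:.3f} MJ/m^2, r = {r:.3f}")
```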

  15. DEVELOPING FINAL COURSE MONOGRAPHS USING A TEAM-BASED LEARNING METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ani Mari Hartz

    2016-04-01

    Full Text Available This article describes an experience with the Team-Based Learning (TBL) methodology in courses designed to support the planning and execution of final course monographs. It contains both professors’ and students’ perceptions, gathered through observation and assessment. A qualitative approach using observation techniques and desk research was used in conjunction with a quantitative approach based on a questionnaire. The sample consisted of 49 students from a higher education institution, 27 of them in a Communication course and the remaining 22 in a Business Administration course. Qualitative data analysis was performed through simple categorization with back-defined categories, while the quantitative data analysis employed descriptive statistics and cluster analysis using Minitab 17.1 software. The main findings include the identification of three student profiles (designated as traditional, collaborative and practical), a preference for guidance and feedback from the professor rather than from other students, and the need for a professor-led closing discussion when applying the TBL method. As regards the main benefits to students, they recognized that discussion in groups allowed them to realize how much they really know about the subject studied. Finally, most students seemed to like the TBL approach.

  16. Response surface methodology-based optimisation of agarose gel electrophoresis for screening and electropherotyping of rotavirus.

    Science.gov (United States)

    Mishra, Vikas; Nag, Vijaya Lakshmi; Tandon, Ritu; Awasthi, Shally

    2010-04-01

    Management of rotavirus diarrhoea cases and prevention of nosocomial infection require a rapid diagnostic method at the patient care level. Diagnostic tests currently available are not routinely used due to economic or sensitivity/specificity constraints. Agarose-based sieving media and running conditions were modulated by using a central composite design and response surface methodology for screening and electropherotyping of rotaviruses. The electrophoretic resolution of the rotavirus genome was modelled from input parameters characterising the gel matrix structure and running conditions, and resolution was quantified by densitometric analysis of the gel. At their critical values, the parameters were able to resolve the 11-segmented rotavirus genome. Better resolution and electropherotypic variation in the 11-segmented double-stranded RNA genome of rotavirus were detected at 1.96% (w/v) agarose concentration, 0.073 mol l⁻¹ ionic strength of Tris base-boric acid-ethylenediamine tetraacetic acid buffer (1.4x) and 4.31 h of electrophoresis at 4.6 V cm⁻¹ electric field strength. The modified agarose gel electrophoresis can replace other methods as a simplified alternative for routine detection of rotavirus where such methods are not in practice.

  17. Methodology for selection of attributes and operating conditions for SVM-Based fault locator's

    Directory of Open Access Journals (Sweden)

    Debbie Johan Arredondo Arteaga

    2017-01-01

    Full Text Available Context: Energy distribution companies must employ strategies to deliver timely, high-quality service, and fault-locating techniques represent an agile alternative for restoring electric service in power distribution, given the size of distribution systems (generally large) and the usual interruptions in service. However, these techniques are not robust enough and present some limitations in both computational cost and the mathematical description of the models they use. Method: This paper performs an analysis based on a Support Vector Machine to evaluate the proper conditions to adjust and validate a fault locator for distribution systems, so that it is possible to determine the minimum number of operating conditions that allow a good performance to be achieved with a low computational effort. Results: We tested the proposed methodology in a prototypical distribution circuit located in a rural area of Colombia. This circuit has a voltage of 34.5 kV and is subdivided into 20 zones. Additionally, the characteristics of the circuit allowed us to obtain a database of 630,000 records of single-phase faults under different operating conditions. As a result, we could determine that the locator achieved a performance above 98% with 200 suitably selected operating conditions. Conclusions: It is possible to improve the performance of fault locators based on Support Vector Machines. Specifically, these improvements are achieved by properly selecting optimal operating conditions and attributes, since they directly affect the performance in terms of efficiency and computational cost.
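
    As a minimal, assumption-laden sketch (not the authors' system), the snippet below trains a multi-class SVM that maps synthetic fault measurements to one of 20 zones of a distribution circuit; the feature set, class structure and parameters are all hypothetical.

```python
# Minimal sketch: an SVM that assigns fault records to one of 20 zones of a
# distribution circuit, trained on a subset of operating conditions. Data,
# features and hyperparameters are illustrative assumptions only.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

n_zones, n_per_zone = 20, 200
# Hypothetical attributes per fault record: e.g. voltage sag magnitude and
# current variation at the substation for the three phases (6 features).
centers = rng.uniform(-1.0, 1.0, size=(n_zones, 6))
X = np.vstack([c + 0.15 * rng.standard_normal((n_per_zone, 6)) for c in centers])
y = np.repeat(np.arange(n_zones), n_per_zone)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"zone classification accuracy: {clf.score(X_test, y_test):.3f}")
```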

  18. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need to use a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  19. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

    Full Text Available The complexity of today's embedded electronic systems as well as their demanding performance and reliability requirements are such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system-level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  20. Three-dimensional design methodologies for tree-based FPGA architecture

    CERN Document Server

    Pangracious, Vinod; Mehrez, Habib

    2015-01-01

    This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support design for 3D, in order to manufacture high-performance 3D integrated circuits and reconfigurable FPGA-based systems. The book was written as a text that covers the foundations of 3D integrated system design and FPGA architecture design, for use in an elective or core course at the graduate level in the fields of Electrical Engineering and Computer Engineering and in doctoral research programs. No previous background in 3D integration is required; however, a fundamental understanding of 2D CMOS VLSI design is. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI Design, Digital System Design and Microelectronics Circuits being the most important. It is accessible for self-study by both senior students and profe...

  1. Finding Needles in a Haystack: A Methodology for Identifying and Sampling Community-Based Youth Smoking Cessation Programs

    Science.gov (United States)

    Emery, Sherry; Lee, Jungwha; Curry, Susan J.; Johnson, Tim; Sporer, Amy K.; Mermelstein, Robin; Flay, Brian; Warnecke, Richard

    2010-01-01

    Background: Surveys of community-based programs are difficult to conduct when there is virtually no information about the number or locations of the programs of interest. This article describes the methodology used by the Helping Young Smokers Quit (HYSQ) initiative to identify and profile community-based youth smoking cessation programs in the…

  2. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works, a methodology was defined based on the design of a GAP genetic algorithm and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules with returns in the testing period far superior to those obtained by applying the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
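
    The abstract describes trading rules evaluated on a return-versus-risk objective. The sketch below is only an illustration of such a fitness function, evaluated here for a simple moving-average crossover rule on a synthetic price series; it is not the GAP algorithm itself, and the rule, windows and risk weighting are assumptions.

```python
# Minimal, assumption-laden sketch: a return/risk-adjusted fitness of the kind
# a GAP-style search would optimise, evaluated for one illustrative rule.
import numpy as np

def evaluate_rule(prices, short_win=10, long_win=50, risk_weight=0.5):
    """Fitness = mean daily strategy return - risk_weight * std of returns,
    for a simple moving-average crossover rule (long when short MA > long MA)."""
    prices = np.asarray(prices, dtype=float)
    short_ma = np.convolve(prices, np.ones(short_win) / short_win, "valid")
    long_ma = np.convolve(prices, np.ones(long_win) / long_win, "valid")
    short_ma = short_ma[-len(long_ma):]              # align the two series
    position = (short_ma > long_ma).astype(float)    # 1 = in the market
    rets = np.diff(np.log(prices[-len(long_ma):]))   # daily log returns
    strat = position[:-1] * rets                     # returns while invested
    return strat.mean() - risk_weight * strat.std()

# Synthetic price path standing in for an index series such as the S&P 500.
rng = np.random.default_rng(2)
prices = 1000.0 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 2000)))
print(f"fitness of the example rule: {evaluate_rule(prices):.6f}")
```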

  3. METHODOLOGICAL BASES OF THE ESTIMATION OF CONSEQUENCES OF INFLUENCE OF MINING COMPLEXES ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    A.I. Semyachkov

    2007-12-01

    Full Text Available A methodology for the integrated forecasting, normalization and estimation of the influence of mining complexes on the environment has been developed and brought to practical realization. Practical application has confirmed the efficiency and generality of the developed methodology and has shown the possibilities of its application to other categories of objects.

  4. Progressive design methodology for complex engineering systems based on multiobjective genetic algorithms and linguistic decision making

    NARCIS (Netherlands)

    Kumar, P.; Bauer, P.

    2008-01-01

    This work focuses on a design methodology that aids in design and development of complex engineering systems. This design methodology consists of simulation, optimization and decision making. Within this work a framework is presented in which modelling, multi-objective optimization and multi

  5. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    Science.gov (United States)

    Xiao, Shanjie

    The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked against the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, another part of this research focused on designing specific hardware, based on reconfigurable computing, to accelerate AGENT computations. This is the first time an application of this type has been used in reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on this analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. The design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about 20 times. The high-performance AGENT acceleration system will drastically shorten the

  6. A methodology for texture feature-based quality assessment in nucleus segmentation of histopathology image

    Directory of Open Access Journals (Sweden)

    Si Wen

    2017-01-01

    Full Text Available Context: Image segmentation pipelines are often sensitive to algorithm input parameters. Algorithm parameters optimized for a set of images do not necessarily produce good-quality segmentation results for other images. Even within an image, some regions may not be well segmented due to a number of factors, including multiple pieces of tissue with distinct characteristics, differences in staining of the tissue, normal versus tumor regions, and tumor heterogeneity. Evaluation of the quality of segmentation results is an important step in image analysis. It is very labor intensive to do quality assessment manually with large image datasets, because a whole-slide tissue image may have hundreds of thousands of nuclei. Semi-automatic mechanisms are needed to assist researchers and application developers in efficiently detecting image regions with poor segmentation. Aims: Our goal is to develop and evaluate a machine-learning-based semi-automated workflow to assess quality of nucleus segmentation results in a large set of whole-slide tissue images. Methods: We propose a quality control methodology, in which machine-learning algorithms are trained with image intensity and texture features to produce a classification model. This model is applied to image patches in a whole-slide tissue image to predict the quality of nucleus segmentation in each patch. The training step of our methodology involves the selection and labeling of regions by a pathologist in a set of images to create the training dataset. The image regions are partitioned into patches. A set of intensity and texture features is computed for each patch. A classifier is trained with the features and the labels assigned by the pathologist. At the end of this process, a classification model is generated. The classification step applies the classification model to unlabeled test images. Each test image is partitioned into patches. The classification model is applied to each patch to predict the patch
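
    A minimal sketch of the classification step described above, under the assumption that per-patch intensity and texture features are already extracted: a random forest (one possible choice of classifier) is trained on synthetic features and pathologist-style labels and then applied to unlabeled patches.

```python
# Minimal sketch (not the authors' pipeline): training a classifier on
# per-patch intensity and texture features to flag patches with poor
# nucleus segmentation, then applying it to unlabeled patches.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Hypothetical feature table: one row per patch, columns such as mean
# intensity, intensity std, and a few texture statistics.
n_patches = 1200
X = rng.normal(size=(n_patches, 6))
# Synthetic pathologist labels: 1 = acceptable segmentation, 0 = poor.
logits = 1.5 * X[:, 0] - 1.0 * X[:, 3] + 0.5 * X[:, 5]
y = (logits + rng.normal(0, 1, n_patches) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=3)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

# Apply the trained model to new, unlabeled patches from a test image.
clf.fit(X, y)
new_patches = rng.normal(size=(10, 6))
print("predicted patch quality:", clf.predict(new_patches))
```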

  7. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and

  8. A RISK BASED METHODOLOGY TO ASSESS THE ENERGY EFFICIENCY IMPROVEMENTS IN TRADITIONALLY CONSTRUCTED BUILDINGS

    Directory of Open Access Journals (Sweden)

    D. Herrera

    2013-07-01

    Full Text Available In order to achieve the CO2 reduction targets set by the Scottish government, it will be necessary to improve the energy efficiency of existing buildings. Within the total Scottish building stock, historic and traditionally constructed buildings form an important proportion, in the order of 19 % (Curtis, 2010), and represent cultural, emotional and identity values that should be protected. However, retrofit interventions can be complex operations because of the several factors involved in the hygrothermal performance of traditional buildings. Moreover, all these factors interact with each other and therefore need to be analysed as a whole. Upgrading the envelope of traditional buildings may produce severe changes to moisture migration, leading to superficial or interstitial condensation and thus fabric decay and mould growth. Retrofit projects carried out in the past have failed because of misunderstanding, or a lack of expert prediction, of the potential consequences associated with altering the envelope. The evaluation of potential risks, prior to any alteration of the building physics intended to improve energy efficiency, is critical to avoid future damage to the wall's performance or to occupants' health and well-being. The aim of this PhD research project is to identify the most critical aspects related to improving the energy efficiency of traditional buildings and to develop a risk-based methodology that helps owners and practitioners during the decision-making process.

  9. Predicting losing and gaining river reaches in lowland New Zealand based on a statistical methodology

    Science.gov (United States)

    Yang, Jing; Zammit, Christian; Dudley, Bruce

    2017-04-01

    The phenomenon of losing and gaining river reaches normally takes place in lowlands, where there are often various, sometimes conflicting uses for water resources, e.g., agriculture, industry, recreation, and maintenance of ecosystem function. To better support water allocation decisions, it is crucial to understand the location and seasonal dynamics of these losses and gains. We present a statistical methodology to predict losing and gaining river reaches in New Zealand based on 1) information surveys with surface water and groundwater experts from regional government, 2) a collection of river/watershed characteristics, including climate, soil and hydrogeologic information, and 3) the random forests technique. The surveys on losing and gaining reaches were conducted face-to-face at 16 New Zealand regional government authorities, and climate, soil, river geometry, and hydrogeologic data from various sources were collected and compiled to represent river/watershed characteristics. The random forests technique was used to build the statistical relationship between river reach status (gain or loss) and river/watershed characteristics, and then to predict the status of Strahler order one river reaches without prior losing and gaining information. Results show that the model has a classification error of around 10% for "gain" and "loss". The results will assist further research and water allocation decisions in lowland New Zealand.
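
    A minimal sketch of the modelling step, with synthetic data and hypothetical river/watershed attributes standing in for the compiled characteristics: a random forest is fitted to surveyed reach status and then used to predict the status of unsurveyed reaches.

```python
# Minimal sketch (assumed feature names, synthetic data): a random forest
# relating surveyed reach status ("gain"/"loss") to river and watershed
# characteristics, then predicting status for unsurveyed reaches.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 800
surveyed = pd.DataFrame({
    "rainfall_mm": rng.uniform(400, 2500, n),
    "soil_drainage_class": rng.integers(1, 6, n),
    "aquifer_transmissivity": rng.lognormal(3.0, 1.0, n),
    "reach_slope": rng.uniform(0.0001, 0.05, n),
})
# Synthetic status: losing reaches more likely over transmissive aquifers.
p_loss = 1 / (1 + np.exp(-(np.log(surveyed["aquifer_transmissivity"]) - 3.0)))
surveyed["status"] = np.where(rng.uniform(size=n) < p_loss, "loss", "gain")

rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=4)
rf.fit(surveyed.drop(columns="status"), surveyed["status"])
print(f"out-of-bag classification error: {1 - rf.oob_score_:.2f}")

# Predict for reaches without survey information (here, new synthetic rows).
unsurveyed = surveyed.drop(columns="status").sample(5, random_state=4)
print(rf.predict(unsurveyed))
```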

  10. A new Inequity-in-Health Index based on Millennium Development Goals: methodology and validation.

    Science.gov (United States)

    Eslava-Schmalbach, J; Alfonso, H; Oliveros, H; Gaitán, H; Agudelo, C

    2008-02-01

    The objective was to develop a new Inequity-in-Health Index (IHI), taking inequity as "inequality of health outcomes," based on the Millennium Development Goals (MDG). Ecological study. Countries from around the world were included from United Nations, World Bank, and nonprofit organization databases. The reliability and validity of this bidimensional IHI were tested. Principal factor analysis (promax rotation) and principal component analysis were used. The IHI was constructed with six variables: underweight children, child mortality, death from malaria in children aged 0-4, death from malaria at all ages, births attended by skilled health personnel, and immunization against measles. The IHI had high internal consistency (Cronbach's alpha=0.8504), was reliable (Spearman>0.9, P=0.0000), and averaged 0.3033π around the world (range: 0π-0.5984π). The IHI had high correlation with the human development and poverty indexes, the health gap indicator, life expectancy at birth, the probability of dying before 40 years of age, and Gini coefficients (Spearman>0.7, P=0.0000). The IHI discriminated countries by income, region, indebtedness, and corruption level (Kruskal-Wallis, P<0.01) and had sensitivity to change (P=0.0000). The IHI is a bidimensional, valid and reliable index for monitoring the MDG. A new, reliable methodology for developing bidimensional indicators is shown, which could be used for constructing other indicators with their corresponding scores and graphs.

  11. Developing More Insights on Sustainable Consumption in China Based on Q Methodology

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-10-01

    Full Text Available As an important aspect of sustainable development, sustainable consumption has attracted great attention among Chinese politicians and academia, and Chinese governments have established policies that encourage sustainable consumption behaviors. However, unsustainable consumption behavior still remains predominant in China. This paper aims to classify consumers with similar traits, in terms of the characteristics of practicing sustainable consumption, into groups, so that their traits can be clearly understood and governments can establish targeted policies for different groups of consumers. Q methodology, generally used to reveal the subjectivity of the people involved in a situation, is applied in this paper to classify Chinese consumers based on Q sample design and data collection and analysis. Next, the traits of each group are analyzed in detail and comparison analyses are conducted to identify the common and differentiating factors among the three groups. The results show that Chinese consumers can be classified into three groups: sustainable consumers (Group 1), potential sustainable consumers (Group 2) and unsustainable consumers (Group 3), according to their values and attitudes towards sustainable consumption. Group 1 cares about the environment and has strong environmental values; they understand sustainable consumption and its functions. Group 2 needs more enlightenment and external stimuli to motivate them to consume sustainably. Group 3 needs to be informed about and educated on sustainable consumption to enable them to change their consumption behavior from unsustainable to sustainable. Suggestions and implications for encouraging each group of consumers to engage in sustainable consumption are also provided.

  12. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  13. Performance-based, cost- and time-effective PCB analytical methodology

    International Nuclear Information System (INIS)

    Alvarado, J. S.

    1998-01-01

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval

  14. Measuring resource inequalities. The concepts and methodology for an area-based Gini coefficient

    International Nuclear Information System (INIS)

    Druckman, A.; Jackson, T.

    2008-01-01

    Although inequalities in income and expenditure are relatively well researched, comparatively little attention has been paid, to date, to inequalities in resource use. This is clearly a shortcoming when it comes to developing informed policies for sustainable consumption and social justice. This paper describes an indicator of inequality in resource use called the AR-Gini. The AR-Gini is an area-based measure of resource inequality that estimates inequalities between neighbourhoods with regard to the consumption of specific consumer goods. It is also capable of estimating inequalities in the emissions resulting from resource use, such as carbon dioxide emissions from energy use, and solid waste arisings from material resource use. The indicator is designed to be used as a basis for broadening the discussion concerning 'food deserts' to inequalities in other types of resource use. By estimating the AR-Gini for a wide range of goods and services we aim to enhance our understanding of resource inequalities and their drivers, identify which resources have highest inequalities, and to explore trends in inequalities. The paper describes the concepts underlying the construction of the AR-Gini and its methodology. Its use is illustrated by pilot applications (specifically, men's and boys' clothing, carpets, refrigerators/freezers and clothes washer/driers). The results illustrate that different levels of inequality are associated with different commodities. The paper concludes with a brief discussion of some possible policy implications of the AR-Gini. (author)
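
    As a small illustration of the quantity underlying an area-based inequality indicator, the sketch below computes a plain Gini coefficient over hypothetical per-capita consumption values for a set of neighbourhoods; the full AR-Gini construction involves additional steps not reproduced here.

```python
# Minimal sketch (an illustration, not the AR-Gini procedure itself): a plain
# Gini coefficient computed over per-capita resource use of neighbourhoods.
import numpy as np

def gini(values):
    """Gini coefficient of a 1-D array of non-negative values
    (0 = perfect equality; values approaching 1 = extreme inequality)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    total = v.sum()
    # Standard rank-based formula for the Gini coefficient.
    return (2.0 * np.sum(np.arange(1, n + 1) * v) - (n + 1) * total) / (n * total)

# Hypothetical per-capita consumption of one commodity in 10 neighbourhoods.
consumption = np.array([1.2, 0.8, 0.9, 3.5, 0.7, 1.1, 0.6, 2.8, 1.0, 0.9])
print(f"area-based Gini: {gini(consumption):.3f}")
```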

  15. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations here mean simulations whose variety of scales and physical complexity are such that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V&V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  16. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

    Full Text Available In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection and inconclusive or difficult-to-interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping current methods used for assessing the effectiveness of DGBL. Results showed that, currently, comparison of results across studies, and thus looking at the effectiveness of DGBL on a more general level, is problematic due to diversity in, and suboptimality of, study designs. Variety in study design relates to three issues, namely the different activities implemented in the control groups, the different measures for assessing the effectiveness of DGBL and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements added to the game as part of the educational intervention (e.g., required reading, debriefing sessions), instructor influences, and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on study design impedes replication of studies and thus falsification of study results.

  17. Development of flaxseed fortified rice - corn flour blend based extruded product by response surface methodology.

    Science.gov (United States)

    Ganorkar, P M; Jain, R K

    2015-08-01

    Flaxseed has shown evidence of health benefits in humans. Response surface methodology (RSM) was employed to develop a flaxseed-fortified rice - corn flour blend based extruded product using a twin screw extruder. The effects of roasted flaxseed flour (RFF) fortification (15-25 %), feed moisture content (12-16 %, wb), extruder barrel temperature (120-140 °C) and screw speed (300-330 RPM) on the expansion ratio (ER), breaking strength (BS), bulk density (BD) and overall acceptability (OAA) score of extrudates were investigated using a central composite rotatable design (CCRD). Increased RFF level significantly decreased the ER and OAA score while increasing the BS and BD of extrudates (p < 0.05). The combination of RFF-fortified flour, 16 % moisture content (wb) of extruder feed, 120 °C extruder barrel temperature and 330 RPM screw speed gave an optimized product of high desirability, with corresponding responses of 3.08 ER, 0.53 kgf BS, 0.106 g cm⁻³ BD and 7.86 OAA.
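
    A minimal sketch of the response-surface step, using synthetic data rather than the study's measurements: a second-order polynomial is fitted by least squares to two coded factors (RFF level and feed moisture) against expansion ratio; the coefficients and noise level are assumptions.

```python
# Minimal sketch (synthetic data, assumed coefficients): fitting a second-order
# response-surface model, as used in RSM/CCRD, for expansion ratio as a
# function of two coded factors (RFF level and feed moisture).
import numpy as np

rng = np.random.default_rng(5)

# Coded factor levels of a small central composite-like design plus noise.
rff = rng.uniform(-1.0, 1.0, 30)        # coded roasted flaxseed flour level
moist = rng.uniform(-1.0, 1.0, 30)      # coded feed moisture content
er = 3.0 - 0.4 * rff + 0.2 * moist - 0.3 * rff**2 + rng.normal(0, 0.05, 30)

# Design matrix of the quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2.
X = np.column_stack([np.ones_like(rff), rff, moist, rff**2, moist**2, rff * moist])
coef, *_ = np.linalg.lstsq(X, er, rcond=None)
names = ["b0", "b_rff", "b_moist", "b_rff2", "b_moist2", "b_int"]
for name, c in zip(names, coef):
    print(f"{name:>9s} = {c:+.3f}")
```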

  18. METHODOLOGICAL BASES OF THE OPTIMIZATION OF ORGANIZATIONAL MANAGEMENT STRUCTURE AT IMPLEMENTING THE MAJOR CONSTRUCTION ENTERPRISE STRATEGY

    Directory of Open Access Journals (Sweden)

    Rodionova Svetlana Vladimirovna

    2015-09-01

    Full Text Available The planning and implementation of innovations at the micro level of management and at higher levels is a process of implementing a portfolio of innovative projects. Project management is aimed at a goal; therefore, defining the mission and aims of implementation is of primary importance. These are part of the notion of an enterprise's development strategy. Creating a strategy for large construction holding companies is complicated by the necessity to account for different factors affecting each business block and subsidiary company. The authors specify an algorithm for developing and implementing the activity strategy of a large construction enterprise. The particular importance of matching the organizational management structure to the implemented strategy is shown, and the innovative character of organizational structure change is justified. The authors offer methods to optimize the organizational management structure based on a communication approach using elements of graph theory. The proposed methodological provisions are tested on the example of the Russian JSC “RZhDstroy”.

  19. U.S. Natural Gas Storage Risk-Based Ranking Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Folga, Steve [Argonne National Lab. (ANL), Argonne, IL (United States); Portante, Edgar [Argonne National Lab. (ANL), Argonne, IL (United States); Shamsuddin, Shabbir [Argonne National Lab. (ANL), Argonne, IL (United States); Tompkins, Angeli [Argonne National Lab. (ANL), Argonne, IL (United States); Talaber, Leah [Argonne National Lab. (ANL), Argonne, IL (United States); McLamore, Mike [Argonne National Lab. (ANL), Argonne, IL (United States); Kavicky, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Conzelmann, Guenter [Argonne National Lab. (ANL), Argonne, IL (United States); Levin, Todd [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-10-01

    This report summarizes the methodology and models developed to assess the risk to energy delivery from the potential loss of underground gas storage (UGS) facilities located within the United States. The U.S. has a total of 418 existing storage fields, of which 390 are currently active. The models estimate the impacts of a disruption of each of the active UGS facilities on their owners/operators, including (1) local distribution companies (LDCs), (2) directly connected transporting pipelines and thus on the customers in downstream States, and (3) third-party entities and thus on contracted customers expecting the gas shipment. Impacts are measured across all natural gas customer classes. For the electric sector, impacts are quantified in terms of natural gas-fired electric generation capacity potentially affected from the loss of a UGS facility. For the purpose of calculating the overall supply risk, the overall consequence of the disruption of an UGS facility across all customer classes is expressed in terms of the number of expected equivalent residential customer outages per year, which combines the unit business interruption cost per customer class and the estimated number of affected natural gas customers with estimated probabilities of UGS disruptions. All models and analyses are based on publicly available data. The report presents a set of findings and recommendations in terms of data, further analyses, regulatory requirements and standards, and needs to improve gas/electric industry coordination for electric reliability.
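
    A minimal sketch of the risk aggregation described above, with entirely hypothetical facilities, probabilities, customer counts and cost weights: consequences across customer classes are converted to equivalent residential customers and multiplied by an annual disruption probability to rank facilities.

```python
# Minimal sketch (hypothetical numbers, not the report's model): ranking
# storage facilities by expected equivalent residential customer outages per
# year, i.e. consequence across customer classes weighted by disruption
# probability. Per-class interruption costs are normalised to the residential
# cost so every class is expressed in "equivalent residential customers".
facilities = {
    #            probability of    affected customers per class
    #            disruption / yr   (residential, commercial, electric-gen)
    "Field A": (0.002, (50_000, 4_000, 12)),
    "Field B": (0.010, (8_000, 900, 3)),
    "Field C": (0.005, (120_000, 10_000, 30)),
}
# Hypothetical unit interruption costs relative to a residential customer.
equiv_weight = (1.0, 25.0, 5_000.0)

def expected_outages(prob, counts):
    consequence = sum(w * c for w, c in zip(equiv_weight, counts))
    return prob * consequence

ranked = sorted(facilities.items(),
                key=lambda kv: expected_outages(*kv[1]), reverse=True)
for name, (prob, counts) in ranked:
    print(f"{name}: {expected_outages(prob, counts):,.0f} "
          "expected equivalent residential outages / yr")
```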

  20. Development of a skeletal multi-component fuel reaction mechanism based on decoupling methodology

    International Nuclear Information System (INIS)

    Mohan, Balaji; Tay, Kun Lin; Yang, Wenming; Chua, Kian Jon

    2015-01-01

    Highlights: • A compact multi-component skeletal reaction mechanism was developed. • A combined bio-diesel and PRF mechanism was proposed. • The mechanism consists of 68 species and 183 reactions. • Well validated against ignition delay times, flame speed and engine results. - Abstract: A new coupled bio-diesel surrogate and primary reference fuel (PRF) oxidation skeletal mechanism has been developed. The bio-diesel surrogate sub-mechanism consists of oxidation sub-mechanisms for the methyl decanoate (MD), methyl 9-decenoate (MD9D) and n-heptane fuel components. MD and MD9D are chosen to represent the saturated and unsaturated methyl esters, respectively, in bio-diesel fuels. A reduced iso-octane oxidation sub-mechanism is then added to the bio-diesel surrogate sub-mechanism, and all the sub-mechanisms are integrated with a reduced C2–C3 mechanism, a detailed H2/CO/C1 mechanism and a reduced NOx mechanism based on the decoupling methodology. The final mechanism consists of 68 species and 183 reactions. The mechanism was well validated against shock-tube ignition delay times, laminar flame speeds and 3D engine simulations.

  1. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    Directory of Open Access Journals (Sweden)

    Fernando Vanegas

    2018-01-01

    Full Text Available Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera) in vineyards. The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.

  2. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data.

    Science.gov (United States)

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-17

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.

  3. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    Science.gov (United States)

    Vanegas, Fernando; Weiss, John; Gonzalez, Felipe

    2018-01-01

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101

  4. Low-Cost Fault Tolerant Methodology for Real Time MPSoC Based Embedded System

    Directory of Open Access Journals (Sweden)

    Mohsin Amin

    2014-01-01

    Full Text Available We propose a design methodology for a fault-tolerant (FT) homogeneous MPSoC with additional design objectives that include low hardware overhead and high performance. We have implemented three different FT methodologies on MPSoCs and compared them against the defined constraints. The comparison of these FT methodologies is carried out by modelling their architectures in RTL VHDL on a Spartan-3 FPGA. The results obtained through simulations helped us to identify the most relevant scheme in terms of the given design constraints.

  5. Equity portfolio optimization: A DEA based methodology applied to the Zagreb Stock Exchange

    Directory of Open Access Journals (Sweden)

    Margareta Gardijan

    2015-10-01

    Full Text Available Most portfolio selection strategies focus solely on market data and implicitly assume that stock markets communicate all relevant information to all market stakeholders, and that these markets cannot be influenced by investor activities. However convenient, this is a limited approach, especially when applied to small and illiquid markets such as the Croatian market, where such assumptions are hardly realistic. Thus, there is a demand for including other sources of data, such as financial reports. This research poses the question of whether financial ratios, as criteria for stock selection, are of any use to Croatian investors. Financial and market data from selected publicly listed companies on the Croatian capital market are used. A two-stage portfolio selection strategy is applied, where the first stage involves selecting stocks based on their respective Data Envelopment Analysis (DEA) efficiency scores. DEA models are becoming popular in stock portfolio selection given that the methodology includes numerous models that provide great flexibility in selecting inputs and outputs, which in turn are considered as criteria for portfolio selection. Accordingly, there is much room for improvement of the currently proposed strategies for selecting portfolios. In the second stage, two portfolio-weighting strategies are applied, using equal proportions and score-weighting. To show whether these strategies create outstanding out-of-sample portfolios over time, time-dependent DEA Window Analysis is applied using a reference time of one year, and portfolio returns are compared with the market portfolio for each period. It is found that the financial data are a significant indicator of the future performance of a stock and that a DEA-based portfolio strategy outperforms the market return.
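
    A minimal sketch of the second stage only, assuming the DEA efficiency scores of the pre-selected stocks are already available (solving the DEA linear programs is omitted): an equally weighted and a score-weighted portfolio are formed and compared on hypothetical returns.

```python
# Minimal sketch of the weighting stage only: equal-weight vs. DEA-score-weight
# portfolios. Tickers, scores and returns are hypothetical placeholders.
import numpy as np

# Hypothetical DEA efficiency scores of the stocks selected in stage one.
tickers = ["STK1", "STK2", "STK3", "STK4", "STK5"]
scores = np.array([1.00, 0.86, 0.72, 0.95, 0.64])

equal_w = np.full(len(tickers), 1.0 / len(tickers))
score_w = scores / scores.sum()

# Hypothetical next-period returns, to compare the two weighting strategies.
returns = np.array([0.04, -0.01, 0.02, 0.05, -0.03])
print("equal-weight portfolio return :", float(equal_w @ returns))
print("score-weight portfolio return :", float(score_w @ returns))
```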

  6. Dental Students' Perceived Clinical Competence in Prosthodontics: Comparison of Traditional and Problem-Based Learning Methodologies.

    Science.gov (United States)

    Montero, Javier; Dib, Abraham; Guadilla, Yasmina; Flores, Javier; Santos, Juan Antonio; Aguilar, Rosa Anaya; Gómez-Polo, Cristina

    2018-02-01

    The aim of this study was to compare the perceived competence for treating prosthodontic patients of two samples of fourth-year dental students: those educated using traditional methodologies and those educated using problem-based learning (PBL). Two cohorts of fourth-year dental students at a dental school in Spain were surveyed: the traditional methods cohort (n=46) was comprised of all students in academic years 2012 and 2013, and the PBL cohort (n=57) was comprised of all students in academic years 2014 and 2015. Students in both cohorts reported the number of prosthodontic treatments they carried out per year and their perceived level of competence in performing such treatments. The results showed that the average number of treatments performed was similar for the two cohorts, except the number of metal-based removable partial dentures was significantly higher for students in the traditional (0.8±1.0) than the PBL (0.4±0.6) cohort. The level of perceived competence to treat complete denture patients for the combined cohorts was significantly higher (7.3±1.1) than that for partial acrylic dentures (6.7±1.5) and combined dentures (5.7±1.3). Students' clinical competence in prosthodontics mainly depended on number of treatments performed as the operator as well as the assistant. Students in the traditional methods cohort considered themselves to be significantly more competent at treating patients for removable partial and fixed prostheses (7.8±1.1 and 7.6±1.1, respectively) than did students in the PBL cohort (6.4±1.5 and 6.6±1.5, respectively). Overall, however, the study found that practical experiences were more important than the teaching method used to achieve students' perceived competence.

  7. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support.

    Science.gov (United States)

    Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret

    2015-06-11

    Little is known about how well or under what conditions health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time. The later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand the concept mapping quantitative results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with a shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue later-stage translation research questions required for population impact. Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence

  8. Methodological Bases for Ranking the European Union Countries in Terms of Macroeconomic Security

    Directory of Open Access Journals (Sweden)

    Tymoshenko Olena V.

    2015-11-01

    Full Text Available The fundamental contradictions of existing methodical approaches to assessing the level of state economic security have been substantiated, and proposals for introducing a unified assessment methodology, acceptable for use at the international level or for a specific cluster of countries, have been developed. Based on the conducted research, it has been found that there are no unified criteria for such a classification of countries. To determine the most significant coefficients and critical values of the indicators of economic security, it is appropriate to group the countries in terms of the level of economic development proposed by the UN Commission and the IMF. An analysis of the economic security level has been conducted for the member countries of the European Union as a separate cluster of countries, using macroeconomic security indicators as an example. Based on the evaluation, it has been found that the proposed list of indicators and their critical values is economically sound and built on the principles of adequacy, representativeness and comprehensiveness. In 2004 the most secure countries of the EU, corresponding to the macroeconomic security standards, were Austria, Denmark, Sweden and Finland; by 2014 the percentage of absolutely secure countries had decreased from 14.3 to 7.1%, with only Denmark and Sweden remaining in the ranking. During the analyzed period Bulgaria and Croatia moved into the risk zone, while Estonia, Lithuania, Latvia and Romania were in the danger zone. In 2014 Ukraine, in terms of its macroeconomic security, was in a critical state, which testified to serious structural and systemic imbalances in its development.

  9. Model-based Organization Manning, Strategy, and Structure Design via Team Optimal Design (TOD) Methodology

    National Research Council Canada - National Science Library

    Levchuk, Georgiy; Chopra, Kari; Paley, Michael; Levchuk, Yuri; Clark, David

    2005-01-01

    This paper describes a quantitative Team Optimal Design (TOD) methodology and its application to the design of optimized manning for E-10 Multi-sensor Command and Control Aircraft. The E-10 (USAF, 2002...

  10. A New Methodology of Multicriteria Decision-Making in Supplier Selection Based on Z-Numbers

    OpenAIRE

    Kang, Bingyi; Hu, Yong; Deng, Yong; Zhou, Deyun

    2016-01-01

    Supplier selection is a significant issue of multicriteria decision-making (MCDM), which has been heavily studied with classical fuzzy methodologies, but the reliability of the knowledge from domain experts is not efficiently taken into consideration. Z-number introduced by Zadeh has more power to describe the knowledge of human being with uncertain information considering both restraint and reliability. In this paper, a methodology for supplier selection using Z-numbers is proposed consideri...

  11. Integrated methodological frameworks for modelling agent-based advanced supply chain planning systems: A systematic literature review

    Directory of Open Access Journals (Sweden)

    Luis Antonio Santa-Eulalia

    2011-12-01

    Full Text Available Purpose: The objective of this paper is to provide a systematic literature review of recent developments in methodological frameworks for the modelling and simulation of agent-based advanced supply chain planning systems. Design/methodology/approach: A systematic literature review is provided to identify, select and make an analysis and a critical summary of all suitable studies in the area. It is organized into two blocks: the first one covers agent-based supply chain planning systems in general terms, while the second one narrows the previous search to identify those works explicitly containing methodological aspects. Findings: Among sixty suitable manuscripts identified in the primary literature search, only seven explicitly considered the methodological aspects. In addition, we noted that, in general, the notion of advanced supply chain planning is not considered unambiguously, that the social and individual aspects of the agent society are not taken into account in a clear manner in several studies, and that a significant part of the works are of a theoretical nature, with few real-scale industrial applications. An integrated framework covering all phases of the modelling and simulation process is still lacking in the literature visited. Research limitations/implications: The main research limitations are related to the period covered (last four years), the selected scientific databases, the selected language (i.e., English) and the use of only one assessment framework for the descriptive evaluation part. Practical implications: The identification of recent works in the domain and discussion concerning their limitations can help pave the way for new and innovative research towards a complete methodological framework for agent-based advanced supply chain planning systems. Originality/value: As there are no recent state-of-the-art reviews in the domain of methodological frameworks for agent-based supply chain planning, this paper contributes to

  12. Towards a common methodology to simulate tree mortality based on ring-width data

    Science.gov (United States)

    Cailleret, Maxime; Bigler, Christof; Bugmann, Harald; Davi, Hendrik; Minunno, Francesco; Peltoniemi, Mikko; Martínez-Vilalta, Jordi

    2015-04-01

    Individual mortality is a key process of population and community dynamics, especially for long-lived species such as trees. As rates of background mortality and of massive diebacks have accelerated during recent decades and are expected to continue rising with increasing temperature and drought, there is a growing demand for early warning signals that indicate that the likelihood of death is very high. Although physiological indicators have a high potential to predict tree mortality, their development requires intensive tree monitoring that cannot currently be done on a representative sample of a population or across several species. An easier approach is to use radial growth data such as tree ring-width measurements. During the last decades, an increasing number of studies have aimed to derive such growth-mortality functions. However, because these studies differ in the sampling strategy (number of dead and living trees), the type of growth explanatory variables (growth level, growth trend variables…), and the length of the time window (number of rings before death) used to calculate them, it is difficult to compare results among studies and to draw biological interpretations. We detailed a new methodology for assessing reliable tree-ring based growth-mortality relationships using binomial logistic regression models. As examples we used published tree-ring datasets from Abies alba growing in 13 different sites, and from Nothofagus dombeyi and Quercus petraea located in one single site. Our first approach, based on constant samplings, aims to (1) assess the dependency of growth-mortality relationships on the statistical sampling scheme used; (2) determine the best length of the time window used to calculate each growth variable; and (3) reveal the presence of intra-specific shifts in growth-mortality relationships. We also followed a Bayesian approach to build the best multi-variable logistic model considering
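
    The sketch below illustrates the kind of binomial logistic growth-mortality model the abstract describes; it is not the authors' code, and the ring-width data, window length and feature definitions are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): fitting a binomial logistic
# growth-mortality model from tree-ring widths. Data are synthetic; in a real
# study each row would correspond to one sampled (living or dead) tree.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def growth_features(ring_widths, window):
    """Growth level (mean) and growth trend (slope) over the last `window` rings."""
    recent = ring_widths[-window:]
    level = recent.mean()
    trend = np.polyfit(np.arange(window), recent, 1)[0]
    return level, trend

# Synthetic ring-width series: dead trees get a negative growth trend on average.
n_trees, n_years, window = 200, 50, 10
status = rng.integers(0, 2, n_trees)              # 1 = dead, 0 = living
series = [np.clip(2.0 + rng.normal(0, 0.3, n_years)
                  - status[i] * 0.03 * np.arange(n_years), 0.1, None)
          for i in range(n_trees)]

X = np.array([growth_features(s, window) for s in series])
y = status

model = LogisticRegression().fit(X, y)
p_death = model.predict_proba(X)[:, 1]            # probability of mortality per tree
print("coefficients (level, trend):", model.coef_[0])
print("mean predicted mortality probability:", p_death.mean().round(3))
```

    Varying `window` and refitting would reproduce, in miniature, the abstract's question of which time-window length gives the most reliable growth-mortality relationship.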

  13. Developing a methodology for the inverse estimation of root architectural parameters from field based sampling schemes

    Science.gov (United States)

    Morandage, Shehan; Schnepf, Andrea; Vanderborght, Jan; Javaux, Mathieu; Leitner, Daniel; Laloy, Eric; Vereecken, Harry

    2017-04-01

    Root traits are increasingly important in the breeding of new crop varieties. For example, longer and fewer lateral roots are suggested to improve drought resistance of wheat. Thus, detailed root architectural parameters are important. However, classical field sampling of roots only provides more aggregated information such as root length density (coring), root counts per area (trenches) or root arrival curves at certain depths (rhizotubes). We investigate the possibility of obtaining information about the root system architecture of plants from classical field-based root sampling schemes, based on sensitivity analysis and inverse parameter estimation. This methodology was developed based on a virtual experiment in which a root architectural model was used to simulate root system development in a field, parameterized for winter wheat. This information provided the ground truth, which is normally unknown in a real field experiment. The three sampling schemes (coring, trenching, and rhizotubes) were virtually applied and the aggregated information computed. The Morris OAT global sensitivity analysis method was then performed to determine the most sensitive parameters of the root architecture model for the three different sampling methods. The estimated means and standard deviations of the elementary effects of a total of 37 parameters were evaluated. Upper and lower bounds of the parameters were obtained from the literature and published data on winter wheat root architectural parameters. Root length density profiles from coring, arrival curve characteristics observed in rhizotubes, and root counts in grids of the trench profile method were evaluated statistically to investigate the influence of each parameter using five different error functions. Number of branches, insertion angle, inter-nodal distance, and elongation rates are the most sensitive parameters, and the parameter sensitivity varies slightly with depth. Most parameters and their interaction with the other parameters show
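
    For orientation, a schematic of the Morris one-at-a-time elementary-effects idea is sketched below in plain numpy; the `simulate` function, parameter names and bounds are stand-ins, not the actual root architecture model or its 37 parameters.

```python
# Schematic Morris-style OAT elementary effects (not the actual root model).
# `simulate` is a placeholder for the root architecture model returning an
# aggregated observable (e.g. an error measure against field sampling data).
import numpy as np

rng = np.random.default_rng(1)

param_names = ["n_branches", "insertion_angle", "internodal_distance", "elongation_rate"]
lower = np.array([2.0, 30.0, 0.5, 0.5])
upper = np.array([8.0, 90.0, 3.0, 3.0])

def simulate(p):
    # Placeholder response; a real study would run the root growth model here.
    return 0.8 * p[0] + 0.1 * p[1] + 2.0 * p[2] ** 2 + 0.5 * p[3]

def elementary_effects(n_trajectories=20, delta_frac=0.1):
    k = len(lower)
    effects = [[] for _ in range(k)]
    delta = delta_frac * (upper - lower)
    for _ in range(n_trajectories):
        base = lower + rng.random(k) * (upper - lower - delta)
        y0 = simulate(base)
        for i in range(k):                      # perturb one factor at a time
            perturbed = base.copy()
            perturbed[i] += delta[i]
            effects[i].append((simulate(perturbed) - y0) / delta[i])
    return effects

for name, e in zip(param_names, elementary_effects()):
    e = np.array(e)
    # mu* (mean absolute effect) ranks importance; sigma flags interaction/non-linearity
    print(f"{name:22s} mu*={np.abs(e).mean():7.3f}  sigma={e.std():7.3f}")
```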

  14. Application of a new methodology to evaluate DNB limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)
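
    The following toy Monte Carlo sketch illustrates the statistical idea behind a best-estimate DNBR limit; it is not the mini-RTDP implementation, the W-3 correlation or a subchannel calculation, and all distributions and values are hypothetical.

```python
# Illustrative Monte Carlo propagation for a statistical DNBR limit.
# dnbr() is a toy surrogate, NOT the W-3 correlation or a COBRA-based
# calculation; input distributions and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n_samples = 10_000

# Uncertain inputs sampled around best-estimate values (hypothetical sigmas).
power_frac   = rng.normal(1.00, 0.02, n_samples)   # core power fraction
inlet_temp_c = rng.normal(292.0, 2.0, n_samples)   # inlet temperature [degC]
flow_frac    = rng.normal(1.00, 0.03, n_samples)   # core flow fraction

def dnbr(power, t_in, flow):
    # Toy surrogate: DNBR decreases with power/temperature, increases with flow.
    return 2.0 * flow / power - 0.004 * (t_in - 292.0)

samples = dnbr(power_frac, inlet_temp_c, flow_frac)

# A 95/95-style statistical design limit, here simply taken as the empirical
# 5th percentile of the sampled DNBR population.
limit = np.percentile(samples, 5.0)
print(f"best-estimate DNBR:                    {dnbr(1.0, 292.0, 1.0):.3f}")
print(f"statistical design limit (5th pctile): {limit:.3f}")
```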

  15. Too Depleted to Try? Testing the Process Model of Ego Depletion in the Context of Unhealthy Snack Consumption.

    Science.gov (United States)

    Haynes, Ashleigh; Kemps, Eva; Moffitt, Robyn

    2016-11-01

    The process model proposes that the ego depletion effect is due to (a) an increase in motivation toward indulgence, and (b) a decrease in motivation to control behaviour following an initial act of self-control. In contrast, the reflective-impulsive model predicts that ego depletion results in behaviour that is more consistent with desires, and less consistent with motivations, rather than influencing the strength of desires and motivations. The current study sought to test these alternative accounts of the relationships between ego depletion, motivation, desire, and self-control. One hundred and fifty-six undergraduate women were randomised to complete a depleting e-crossing task or a non-depleting task, followed by a lab-based measure of snack intake, and self-report measures of motivation and desire strength. In partial support of the process model, ego depletion was related to higher intake, but only indirectly via the influence of lowered motivation. Motivation was more strongly predictive of intake for those in the non-depletion condition, providing partial support for the reflective-impulsive model. Ego depletion did not affect desire, nor did depletion moderate the effect of desire on intake, indicating that desire may be an appropriate target for reducing unhealthy behaviour across situations where self-control resources vary. © 2016 The International Association of Applied Psychology.

  16. Hsp90 depletion goes wild

    OpenAIRE

    Siegal, Mark L; Masel, Joanna

    2012-01-01

    Abstract Hsp90 reveals phenotypic variation in the laboratory, but is Hsp90 depletion important in the wild? Recent work from Chen and Wagner in BMC Evolutionary Biology has discovered a naturally occurring Drosophila allele that downregulates Hsp90, creating sensitivity to cryptic genetic variation. Laboratory studies suggest that the exact magnitude of Hsp90 downregulation is important. Extreme Hsp90 depletion might reactivate transposable elements and/or induce aneuploidy, in addition to r...

  17. TENCompetence tools and I*Teach methodology in action: development of an active web-based teachers’ community

    NARCIS (Netherlands)

    Nikolova, Nikolina; Stefanov, Krassen; Todorova, Cornelia; Stefanova, Eliza; Ilieva, Miroslava; Sligte, Henk; Hernández-Leo, Davinia

    2009-01-01

    Nikolova, N., Stefanov, K., Todorova, K., Stefanova, E., Ilieva, M., Sligte, H., & Hernández-Leo, D. (2010). TENCompetence tools and I*Teach methodology in action: development of an active web-based teachers’ community. In D. Griffiths, & R. Koper (Eds.), Rethinking Learning and Employment at a

  18. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    Full Text Available The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. Due to this fact, the necessity of developing a methodology for computer-based learning of probability theory and stochastic processes for pre-service engineers is underlined in the paper. The results of the experimental study analyzing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of probabilistic and stochastic skills of students and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expansion of the range of purposes for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of pre-service engineers is defined at these stages. For this purpose, the methodology of testing the students' learning motivation in the chosen specialty is presented in the paper. The increase in intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  19. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of the Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEP) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSF), which have a direct effect on human behavior and thus shape the HEP according to the specific environmental conditions and the personal characteristics of the individuals responsible for these actions. This PSF dependence raises a great problem of data availability, as it makes the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate PSFs for actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference tabulated data from the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient identification of potential improvements in plant safety conditions, operational procedures and local working conditions (author)
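
    A toy sketch of the general fuzzy-inference idea is given below (triangular membership functions for an aggregated PSF rating, rule weights, and a centroid-like defuzzification to an HEP). The membership functions, rules and HEP anchor values are illustrative placeholders, not the expert-elicited system of the paper.

```python
# Toy fuzzy-inference sketch for a human error probability (HEP) estimate.
# Membership functions, rules and HEP anchors are illustrative placeholders,
# not the ones elicited from experts in the paper.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def estimate_hep(psf_rating):
    """Map a 0-10 aggregated PSF rating (10 = worst conditions) to an HEP."""
    # Degree of membership of the PSF rating in three linguistic terms.
    good    = tri(psf_rating, -5.0, 0.0, 5.0)
    nominal = tri(psf_rating,  2.0, 5.0, 8.0)
    poor    = tri(psf_rating,  5.0, 10.0, 15.0)
    # Rule consequents: representative HEP for each term (hypothetical anchors).
    hep_anchor = np.array([1e-4, 1e-3, 1e-1])
    weights = np.array([good, nominal, poor])
    # Weighted-average (centroid-like) defuzzification, done in log space.
    return float(10 ** (np.log10(hep_anchor) @ weights / weights.sum()))

for rating in (1.0, 5.0, 9.0):
    print(f"PSF rating {rating:>4}: HEP ~ {estimate_hep(rating):.2e}")
```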

  20. Theoretical and methodological bases of studying the symbolization of social and political reality in transit societies

    Directory of Open Access Journals (Sweden)

    O. V. Slavina

    2014-10-01

    Full Text Available This article is an attempt to form a methodological foundation for exploring the process of the symbolic construction of reality in political systems in a state of democratic transition. From the author's point of view, such transit systems are distinguished by the phenomenal features of a transitional type of sign-symbolic context. The most significant of these are the confrontation between symbols of the old and the new, and the formation of public anxiety due to the violation of established values (significant symbols). The result of these processes is the emergence of conditions for an increased capacity for the perception of new symbols (re-symbolization), the transmigration of symbolic forms, and the appearance of spontaneous symbolic interactions in the community in the form of political protests, rallies, and panic. In this regard, it is necessary to understand the possibilities of productive management of the collective consciousness in the transit period in order to achieve the mental solidarity of a concrete society with democratic values. To perform this task, the author develops the appropriate tools, which are based on phenomenological theory, Schutz's theory of the constitution of multiple realities, the philosophy of symbolic forms of E. Cassirer, the theory of social construction of P. Berger and T. Luckmann, as well as Lotman's semiotic concept. It is concluded that in the collision of alternative symbolic projects of social order it is advisable to resort to controlled symbolization (the production of special symbolic codes of political legitimation). At the same time it is important to understand the mechanisms of auto-symbolization of the society (the changing of mass consciousness by virtue of the progressive development of the political culture of the people). Careless use of these technologies in countries with non-consolidated democracy may become a factor of destabilization and create conditions for authoritarian rollback.

  1. INCREASING THE AIRCRAFT AIRWORTHINESS MAINTENANCE EFFICIENCY BASED ON THE PROJECT MANAGEMENT METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Alexander Abramovich Itskovich

    2017-01-01

    Full Text Available The interrelation between the aircraft airworthiness maintenance process (AAMP) and the application of project management methodology is demonstrated. A project portfolio can be formed based on the strategic objectives. The projects with the highest priority are carried out, including those which strive to improve the efficiency of the AAMP. The proposed approach allows finding the priorities of specific projects included in the airline project portfolio. The project aimed at improving the efficiency of the AAMP of the AN-124-100 of "Volga-Dnepr Airlines" is presented as an example. The statistical analysis of failure data for the AN-124-100 fleet has demonstrated that wing components fail most frequently, especially spoiler sections, which are subjected to mass exfoliation of the honeycomb skin and need to be modified. One of the expected project results should be a reduction of the К1000 of the wing spoilers by not less than 40 % and, respectively, of the aircraft in total by not less than 4 %. The work is executed in full compliance with the standards of project management. The passport of the project is given, which contains all the necessary information about the project: its goals, outcomes, results, timelines, action plan, budget and participants. Special attention is paid to the risks of the project, the assessment of their probability and the actions for overcoming possible consequences. It is shown that the implementation of the project "Introduction of aircraft AN-124-100 spoilers technology modification" allows improving a number of production and technical efficiency indicators, with optimization of material, financial and organizational resources.

  2. Methodology of using CFD-based risk assessment in road tunnels

    Directory of Open Access Journals (Sweden)

    Vidmar Peter

    2007-01-01

    Full Text Available The definition of the deterministic approach in safety analyses comes from the need to understand the conditions that arise during a fire accident in a road tunnel. The key factor of tunnel operations during a fire is the ventilation, which during the initial phases of the fire strongly impacts the evacuation of people and later the access of intervention units to the tunnel. The paper presents the use of a computational fluid dynamics model in the tunnel safety assessment process. The model is validated by comparing results with experimental data and quantifying the differences. The set-up of the initial and boundary conditions and the requirement for grid density found during the validation tests are used to prepare three kinds of fire scenarios (20 MW, 50 MW, and 100 MW) with different ventilation conditions: natural, semi-transverse, full transverse, and longitudinal ventilation. The observed variables, soot density and temperature, are presented in one-minute time steps through the entire tunnel length. Comparing the obtained data in a table allows the analysis of the ventilation conditions for different heat releases from fires. The second step is to add additional criteria of human behaviour inside the tunnel (evacuation) and human resistance to the elevated gas concentrations and temperature. What comes out is a fully deterministic risk matrix based on the calculated data, where the risk is ranked on five levels, from the lowest to a very dangerous level. The deterministic risk matrix represents an alternative to a probabilistic safety assessment methodology, in that the fire risk is represented in detail and the computational fluid dynamics model results are physically correct.

  3. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprised of four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors with four on-the-job operating groups and one expert group who knows the accident sequences. Simulated results of team dynamic task performance, with reference to key plant parameter behavior, the team-specific organizational center of gravity and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-beneficial manner. Also, this model can be utilized as a systematic analysis tool for

  4. Generalized perturbation theory for LWR depletion analysis and core design applications

    International Nuclear Information System (INIS)

    White, J.R.; Frank, B.R.

    1986-01-01

    A comprehensive time-dependent perturbation theory formulation that includes macroscopic depletion, thermal-hydraulic and poison feedback effects, and a criticality reset mechanism is developed. The methodology is compatible with most current LWR design codes. This new development allows GTP/DTP methods to be used quantitatively in a variety of realistic LWR physics applications that were not possible prior to this work. A GTP-based optimization technique for incore fuel management analyses is addressed as a promising application of the new formulation

  5. Design-based research as a “smart” methodology for studying learning in the context of work

    DEFF Research Database (Denmark)

    Kolbæk, Ditte

    2017-01-01

    Although Design-based Research (DBR) was developed for investigating classroom training, this paper discusses methodological issues when DBR is employed for investigating learning in the context of work, as it is an authentic learning environment, a real-world setting for fostering learning...... and creating usable knowledge and knowing. The purpose of this paper is to provide new perspectives on DBR regarding how to conduct DBR for studying learning from experience in the context of work. The research question is: What to consider to make DBR a smart methodology for exploring learning from experience...

  6. ANL calculational methodologies for determining spent nuclear fuel source term

    International Nuclear Information System (INIS)

    McKnight, R. D.

    2000-01-01

    Over the last decade Argonne National Laboratory has developed reactor depletion methods and models to determine radionuclide inventories of irradiated EBR-II fuels. Predicted masses based on these calculational methodologies have been validated using available data from destructive measurements--first from measurements of lead EBR-II experimental test assemblies and later using data obtained from processing irradiated EBR-II fuel assemblies in the Fuel Conditioning Facility. Details of these generic methodologies are described herein. Validation results demonstrate these methods meet the FCF operations and material control and accountancy requirements

  7. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

    Full Text Available Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basis of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on-campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  8. Genetic Algorithm-Based Optimization Methodology of Bézier Curves to Generate a DCI Microscale-Model

    Directory of Open Access Journals (Sweden)

    Jesus A. Basurto-Hurtado

    2017-11-01

    Full Text Available The aim of this article is to develop a methodology capable of generating micro-scale models of Ductile Cast Irons, with the particular characteristic of preserving the smoothness of the graphite nodule contours that is lost through discretization errors when the contours are extracted using image processing. The proposed methodology uses image processing to extract the graphite nodule contours and a genetic algorithm-based optimization strategy to select the optimal degree of the Bézier curve that best approximates each graphite nodule contour. To validate the proposed methodology, a Finite Element Analysis (FEA) was carried out using models that were obtained through three methods: (a) using a fixed Bézier degree for all of the graphite nodule contours, (b) the present methodology, and (c) using a commercial software package. The results were compared using the relative error of the equivalent stresses computed by the FEA, with the results of the proposed methodology used as the reference. The present paper does not aim to define which models are correct and which are not. However, it has been shown that the errors generated in the discretization process should not be ignored when developing geometric models, since they can produce relative errors of up to 35.9% in the estimation of the mechanical behavior.
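
    A simplified sketch of the degree-selection step is shown below: least-squares Bézier fits of increasing degree to a synthetic nodule contour, keeping the degree with the smallest error. The full genetic algorithm (population, crossover, mutation) and the image-based contour extraction of the paper are omitted, and the contour is artificial.

```python
# Simplified sketch of the Bezier-degree selection the GA performs: fit curves
# of several degrees to a synthetic "nodule" contour by least squares and keep
# the degree with the smallest RMS error. Not the paper's GA implementation.
import numpy as np
from math import comb

def bernstein_matrix(t, degree):
    """Rows: parameter values t in [0,1]; columns: Bernstein basis polynomials."""
    return np.array([[comb(degree, k) * (ti ** k) * ((1 - ti) ** (degree - k))
                      for k in range(degree + 1)] for ti in t])

def fit_bezier(points, degree):
    """Least-squares control points and RMS fitting error for a sampled contour."""
    t = np.linspace(0.0, 1.0, len(points))
    B = bernstein_matrix(t, degree)
    ctrl, *_ = np.linalg.lstsq(B, points, rcond=None)
    rms = np.sqrt(np.mean(np.sum((B @ ctrl - points) ** 2, axis=1)))
    return ctrl, rms

# Synthetic contour: a slightly perturbed circle sampled at 100 points.
theta = np.linspace(0.0, 2.0 * np.pi, 100)
contour = np.column_stack([np.cos(theta) * (1 + 0.05 * np.sin(5 * theta)),
                           np.sin(theta) * (1 + 0.05 * np.sin(5 * theta))])

errors = {d: fit_bezier(contour, d)[1] for d in range(3, 16)}
best = min(errors, key=errors.get)
print("RMS error by degree:", {d: round(e, 4) for d, e in errors.items()})
print("selected degree:", best)
```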

  9. Process synthesis for natural products from plants based on PAT methodology

    DEFF Research Database (Denmark)

    Malwade, Chandrakant Ramkrishna; Qu, Haiyan; Rong, Ben-Guang

    2017-01-01

    Natural products are defined as secondary metabolites produced by plants and form a vast pool of compounds with unlimited chemical and functional diversity. Many of these secondary metabolites are high value added chemicals that are frequently used as ingredients in food, cosmetics, pharmaceuticals...... and other consumer products. Therefore, process technology towards industrial scale production of such high value chemicals from plants has significant importance. In this chapter, a process synthesis methodology for recovery of natural products from plants at conceptual level is discussed. The methodology...... (QbD) approach, has been included at various steps to obtain molecular level information of process streams and thereby, support the rational decision making. The formulated methodology has been used to isolate and purify artemisinin, an antimalarial drug, from dried leaves of the plant Artemisia...

  10. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  11. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests...... of incomplete data sets. This methodology presents a way to assess the performance of a WEC, being tested in real sea conditions, by evaluating its performance (in a first approach) separately for different wave conditions. These “different wave conditions“ are defined as zones and the range...... to the bi‐variate Hm0‐Te scatter diagram, the influence of other environmental parameters are still expected to be present in the representation of the performance by the inclusion of different data points for every sea state. The extent to which other environmental parameters or even device dependent...
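
    The sketch below illustrates the zone-based idea in its simplest form: binning sea-trial records into Hm0-Te zones and reporting mean produced power and occurrence per zone. The data, column names and bin edges are hypothetical placeholders, not the methodology's prescribed zoning.

```python
# Minimal sketch: bin sea-trial records into Hm0-Te zones and report the mean
# produced power and number of records per zone. Data, column names and bin
# edges are hypothetical placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 2000
trials = pd.DataFrame({
    "Hm0": rng.uniform(0.5, 4.5, n),                 # significant wave height [m]
    "Te":  rng.uniform(4.0, 12.0, n),                # energy period [s]
})
# Toy produced power [kW]; real values would come from the WEC logging system.
trials["power"] = 50.0 * trials["Hm0"] ** 2 * trials["Te"] * rng.uniform(0.6, 1.0, n)

hm0_bins = np.arange(0.0, 5.1, 1.0)
te_bins = np.arange(4.0, 13.0, 2.0)
trials["Hm0_zone"] = pd.cut(trials["Hm0"], hm0_bins)
trials["Te_zone"] = pd.cut(trials["Te"], te_bins)

zone_stats = (trials.groupby(["Hm0_zone", "Te_zone"], observed=True)["power"]
                    .agg(["mean", "count"]))
print(zone_stats)          # per-zone mean power [kW] and number of records
```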

  12. A New Methodology of Multicriteria Decision-Making in Supplier Selection Based on Z-Numbers

    Directory of Open Access Journals (Sweden)

    Bingyi Kang

    2016-01-01

    Full Text Available Supplier selection is a significant issue of multicriteria decision-making (MCDM), which has been heavily studied with classical fuzzy methodologies, but the reliability of the knowledge from domain experts is not efficiently taken into consideration. The Z-number introduced by Zadeh has more power to describe human knowledge with uncertain information, considering both restraint and reliability. In this paper, a methodology for supplier selection using Z-numbers is proposed considering information transformation. It includes two parts: one solves the issue of how to convert a Z-number to a classical fuzzy number according to the fuzzy expectation; the other solves the problem of how to get the optimal priority weight for supplier selection with a genetic algorithm (GA), which is an efficient and flexible method for calculating the priority weights of the judgement matrix. Finally, an example of supplier selection is used to illustrate the effectiveness of the proposed methodology.
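
    The first part of the methodology, converting a Z-number into a classical fuzzy number via the fuzzy expectation, is commonly realized by defuzzifying the reliability part and scaling the restriction part; a minimal sketch of that conversion under these assumptions (triangular numbers, illustrative values) is given below.

```python
# Sketch of the Z-number -> classical fuzzy number conversion step described in
# the abstract: the reliability part B is defuzzified to a weight alpha, and the
# restriction part A is scaled by sqrt(alpha). Triangular numbers are assumed;
# the example values are illustrative only.
import numpy as np

def centroid_triangular(tfn):
    """Defuzzified (centroid) value of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def z_to_fuzzy(A, B):
    """Convert Z = (A, B) into a regular triangular fuzzy number."""
    alpha = centroid_triangular(B)           # expectation of the reliability part
    return tuple(np.sqrt(alpha) * np.array(A))

# Example: supplier score "about 0.7" with reliability "very likely".
A = (0.6, 0.7, 0.8)          # restriction on the evaluated criterion
B = (0.7, 0.8, 0.9)          # reliability of that judgement
print("weighted fuzzy number:", tuple(round(x, 3) for x in z_to_fuzzy(A, B)))
```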

  13. [Acute tryptophan depletion in eating disorders].

    Science.gov (United States)

    Díaz-Marsa, M; Lozano, C; Herranz, A S; Asensio-Vegas, M J; Martín, O; Revert, L; Saiz-Ruiz, J; Carrasco, J L

    2006-01-01

    This work describes the rational bases justifying the use of the acute tryptophan depletion technique in eating disorders (ED) and the methods and design used in our studies. The tryptophan depletion technique has been described and used safely in previous studies and makes it possible to evaluate brain serotonin activity. It is therefore used in the investigation of hypotheses on serotonergic deficiency in eating disorders. Furthermore, given the relationship between dysfunctions of serotonin activity and impulsive symptoms, the technique may be useful in the biological differentiation of the different subtypes of ED, that is, restrictive and bulimic. 57 female patients with DSM-IV eating disorders and 20 female controls were investigated with the tryptophan depletion test. A tryptophan-free amino acid solution was administered orally to patients and controls after a two-day low-tryptophan diet. Free plasma tryptophan was measured at two and five hours following administration of the drink. Eating and emotional responses were measured with specific scales for five hours following the depletion. A study of the basic characteristics of personality and impulsivity traits was also done. The relationship of the response to the test with the different clinical subtypes and with the temperamental and impulsive characteristics of the patients was studied. In the global sample, the test was effective in considerably reducing plasma tryptophan from baseline levels (76%) within five hours. The test was well tolerated and no severe adverse effects were reported. Two patients withdrew from the test due to gastric intolerance. The tryptophan depletion test could be of value for studying the involvement of serotonin deficits in the symptomatology and pathophysiology of eating disorders.

  14. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology is comprised of Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal to noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm with the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal to noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
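
    A minimal sketch of EEMD decomposition and reconstruction from a selected IMF range is given below, using the third-party PyEMD package (assumed available as pip package EMD-signal); the simulated A-scan, noise level and chosen IMF indices are placeholders, not the probe frequencies, data lengths or selection rules of the paper.

```python
# Minimal EEMD reconstruction sketch using the third-party PyEMD package
# (pip install EMD-signal). The simulated A-scan, noise level and the chosen
# IMF range are placeholders, not the parameters used in the paper.
import numpy as np
from PyEMD import EEMD

fs = 25e6                                   # sampling rate [Hz] (assumed)
t = np.arange(0, 40e-6, 1 / fs)
# Toy ultrasonic A-scan: a 2 MHz defect echo buried in grain-scattering "noise".
echo = np.exp(-((t - 20e-6) ** 2) / (2 * (1e-6) ** 2)) * np.sin(2 * np.pi * 2e6 * t)
rng = np.random.default_rng(4)
signal = echo + 0.8 * rng.normal(size=t.size)

eemd = EEMD(trials=50)                      # ensemble of noise-assisted EMD runs
imfs = eemd.eemd(signal)                    # rows = intrinsic mode functions

# Reconstruct from a mid-range of IMFs where the narrow-band echo is expected;
# real indices would follow from the probe frequency and the sampling rate.
selected = imfs[2:5].sum(axis=0)
noise_floor = selected[: t.size // 4].std() # pre-echo region used as noise estimate
snr_db = 20 * np.log10(np.abs(selected).max() / noise_floor)
print(f"{imfs.shape[0]} IMFs extracted; reconstructed-signal SNR ~ {snr_db:.1f} dB")
```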

  15. Depleted uranium plasma reduction system study

    International Nuclear Information System (INIS)

    Rekemeyer, P.; Feizollahi, F.; Quapp, W.J.; Brown, B.W.

    1994-12-01

    A system life-cycle cost study was conducted of a preliminary design concept for a plasma reduction process for converting depleted uranium to uranium metal and anhydrous HF. The plasma-based process is expected to offer significant economic and environmental advantages over present technology. Depleted uranium is currently stored in the form of solid UF6, of which approximately 575,000 metric tons is stored at three locations in the U.S. The proposed system is preconceptual in nature, but includes all necessary processing equipment and facilities to perform the process. The study has identified a total processing cost of approximately $3.00/kg of UF6 processed. Based on the results of this study, the development of a laboratory-scale system (1 kg/h throughput of UF6) is warranted. Further scaling of the process to pilot scale will be determined after laboratory testing is complete

  16. Ego Depletion in Real-Time: An Examination of the Sequential-Task Paradigm

    Directory of Open Access Journals (Sweden)

    Madeleine M. Arber

    2017-09-01

    Full Text Available Research into self-control that is based on the sequential task methodology is currently at an impasse. The sequential task methodology involves completing a task that is designed to tax self-control resources, which in turn has carry-over effects on a second, unrelated task. The current impasse is in large part due to the lack of empirical research that tests explicit assumptions regarding the initial task. Five studies test one key, untested assumption underpinning strength (finite resource) models of self-regulation: performance will decline over time on a task that depletes self-regulatory resources. In the aftermath of high-profile replication failures using a popular letter-crossing task and subsequent criticisms of that task, the current studies examined whether depletion effects would occur in real time using letter-crossing tasks that did not invoke habit forming and breaking, and whether these effects were moderated by administration type (paper-and-pencil vs. computer administration). Sample makeup and sizes as well as response formats were also varied across the studies. The five studies yielded a clear and consistent pattern of increasing performance deficits (errors) as a function of time spent on task, with generally large effects, and in the fifth study the strength of negative transfer effects to a working memory task was related to individual differences in depletion. These results demonstrate that some form of depletion is occurring on letter-crossing tasks, though whether an internal regulatory resource reservoir or some other factor is changing across time remains an important question for future research.

  17. Crack Growth-Based Predictive Methodology for the Maintenance of the Structural Integrity of Repaired and Nonrepaired Aging Engine Stationary Components

    National Research Council Canada - National Science Library

    Barron, Michael

    1999-01-01

    .... Specifically, the FAA's goal was to develop "Crack Growth-Based Predictive Methodologies for the Maintenance of the Structural Integrity of Repaired and Nonrepaired Aging Engine Stationary Components...

  18. Towards a complete propagation of uncertainties in depletion calculations

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.S. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering; Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Zwermann, W.; Gallner, L.; Puente-Espel, Federico; Velkov, K.; Hannstein, V. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Cabellos, O. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering

    2013-07-01

    Propagation of nuclear data uncertainties to calculated values is of interest for design purposes and library evaluation. XSUSA, developed at GRS, propagates cross section uncertainties to nuclear calculations. In depletion simulations, fission yields and decay data are also involved and are a possible source of uncertainty that should be taken into account. We have developed tools to generate varied fission yield and decay libraries and to propagate uncertainties through depletion in order to complete the XSUSA uncertainty assessment capabilities. A generic test to probe the methodology is defined and discussed. (orig.)
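
    The random-sampling idea can be illustrated in miniature as below: sample a fission yield and a decay constant from assumed uncertainties and propagate each sample through a one-nuclide build-up/decay balance. This is a toy illustration only, not XSUSA, not a real depletion solver, and the nuclide data are hypothetical.

```python
# Toy illustration of random sampling of fission-yield and decay-data
# uncertainties through a one-nuclide build-up/decay balance (not XSUSA and not
# a real depletion code); nuclide data and uncertainties are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
n_samples = 5000

fission_rate = 1.0e18          # fissions per second in the fuel region (assumed)
t_irr = 3.0 * 365 * 24 * 3600  # 3 years of irradiation [s]

# Nominal data and relative 1-sigma uncertainties (hypothetical values).
yield_nom, yield_rel_unc = 0.061, 0.02       # cumulative fission yield
lam_nom, lam_rel_unc = 7.3e-10, 0.05         # decay constant [1/s]

y = rng.normal(yield_nom, yield_rel_unc * yield_nom, n_samples)
lam = rng.normal(lam_nom, lam_rel_unc * lam_nom, n_samples)

# Analytic solution of dN/dt = y*F - lambda*N with N(0) = 0.
N_eoi = y * fission_rate / lam * (1.0 - np.exp(-lam * t_irr))

print(f"mean end-of-irradiation inventory: {N_eoi.mean():.3e} atoms")
print(f"relative standard deviation:       {N_eoi.std() / N_eoi.mean():.2%}")
```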

  19. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  20. Methodological Flaws in Corpus-Based Studies on Malaysian ESL Textbooks

    Science.gov (United States)

    Zarifi, Abdolvahed; Mukundan, Jayakaran; Rezvani Kalajahi, Seyed Ali

    2014-01-01

    With the increasing interest among the pedagogy researchers in the use of corpus linguistics methodologies to study textbooks, there has emerged a similar enthusiasm among the materials developers to draw on empirical findings in the development of the state-of-the-art curricula and syllabi. In order for these research findings to have their…

  1. Web-Based Collaborative Writing in L2 Contexts: Methodological Insights from Text Mining

    Science.gov (United States)

    Yim, Soobin; Warschauer, Mark

    2017-01-01

    The increasingly widespread use of social software (e.g., Wikis, Google Docs) in second language (L2) settings has brought a renewed attention to collaborative writing. Although the current methodological approaches to examining collaborative writing are valuable to understand L2 students' interactional patterns or perceived experiences, they can…

  2. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    Science.gov (United States)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
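
    A simplified numerical sketch of the threshold-based FP/FN idea is shown below; it is not the SLS methodology or its reliability data. With assumed Gaussian sensor behavior under nominal and abort conditions, the FP probability is the chance of exceeding the threshold when no abort condition exists, and the FN probability is the chance of remaining below it when one does.

```python
# Simplified sketch of FP/FN probabilities for a threshold-based abort trigger.
# Sensor distributions, the threshold and the SDQ-related rates are assumed
# values for illustration, not SLS data.
from scipy.stats import norm

threshold = 105.0                      # abort threshold on the monitored parameter

nominal = norm(loc=100.0, scale=2.0)   # sensor reading, no abort condition present
abort   = norm(loc=115.0, scale=4.0)   # sensor reading, abort condition present

p_fp = nominal.sf(threshold)           # trigger fires although condition is absent
p_fn = abort.cdf(threshold)            # trigger silent although condition is present

# Crude illustration of an SDQ effect: qualifying out suspect data removes most
# corrupt samples that would otherwise fire spuriously (assumed rates).
p_corrupt, p_corrupt_exceeds, p_sdq_catches = 1e-3, 0.5, 0.9
p_fp_with_sdq = p_fp + p_corrupt * p_corrupt_exceeds * (1.0 - p_sdq_catches)

print(f"P(false positive)          = {p_fp:.2e}")
print(f"P(false negative)          = {p_fn:.2e}")
print(f"P(false positive with SDQ) = {p_fp_with_sdq:.2e}")
```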

  3. Improved Methodology of Weather Window Prediction for Offshore Operations Based on Probabilities of Operation Failure

    Directory of Open Access Journals (Sweden)

    Tomas Gintautas

    2017-05-01

    Full Text Available The offshore wind industry is building and planning new wind farms further offshore due to increasing demand for sustainable energy production and the fact that prime resource locations closer to shore are already occupied. Costs of operation and maintenance, transport and installation of offshore wind turbines already contribute significantly to the cost of produced electricity and will continue to increase, due to moving further offshore, if the current techniques of predicting offshore wind farm accessibility stay the same. The majority of offshore operations are carried out by specialized ships that must be hired for the duration of the operation. Therefore, offshore wind farm accessibility and costs of offshore activities are primarily driven by the expected number of operational hours offshore and waiting times for weather windows suitable for offshore operations. Having more reliable weather window estimates would result in better wind farm accessibility predictions and, as a consequence, potentially reduce the cost of offshore wind energy. This paper presents an updated methodology of weather window prediction that uses physical offshore vessel and equipment responses to establish the expected probabilities of operation failure, which, in turn, can be compared to the maximum allowable probability of failure to obtain weather windows suitable for operation. Two case studies were performed to evaluate the feasibility of the improved methodology, and the results indicated that it produced consistent and improved results. In fact, the updated methodology predicts 57% and 47% more operational hours during the test period when compared to the standard alpha-factor and the original methodologies.
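
    A schematic sketch of that idea, under stated assumptions, is given below: convert a forecast sea-state series into a probability of operation failure via an assumed vessel-response model and flag hours where that probability stays below an allowable limit. The response model, limits and forecast data are placeholders, not those of the paper.

```python
# Schematic sketch of flagging weather windows from an (assumed) probability of
# operation failure rather than a fixed wave-height limit. The response model,
# allowable failure probability and forecast series are placeholders.
from math import erf
import numpy as np

rng = np.random.default_rng(6)
hours = 240
hs_forecast = 1.5 + 0.8 * np.sin(np.arange(hours) / 18.0) + rng.normal(0, 0.2, hours)

def probability_of_failure(hs, response_limit=1.2, response_cov=0.25):
    """P(vessel/equipment response exceeds its limit) for a given Hs [m]."""
    # Assumed lognormal response whose median is proportional to Hs.
    median_response = 0.6 * hs
    sigma = np.sqrt(np.log(1.0 + response_cov ** 2))
    z = (np.log(response_limit) - np.log(median_response)) / sigma
    return 1.0 - 0.5 * (1.0 + erf(z / np.sqrt(2.0)))   # exceedance probability

p_fail = np.array([probability_of_failure(h) for h in hs_forecast])
allowable = 1e-2                                  # maximum allowed failure probability
operable = p_fail < allowable

duration = 6                                      # operation needs 6 contiguous hours
windows = [t for t in range(hours - duration + 1) if operable[t:t + duration].all()]
print(f"operable hours: {operable.sum()} of {hours}")
print(f"number of {duration}-hour weather windows: {len(windows)}")
```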

  4. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    Science.gov (United States)

    2010-06-01

    … organizations fail to properly implement and properly resource Security Information and Event Management (SIEM) capabilities [32] [37]. Several … motivate the development of a distributed log event correlation methodology. Background literature in the areas of log management, event correlation and …

  5. RANS Based Methodology for Predicting the Influence of Leading Edge Erosion on Airfoil Performance

    Energy Technology Data Exchange (ETDEWEB)

    Langel, Christopher M. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Chow, Raymond C. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; van Dam, C. P. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Wind Energy Technologies Dept.

    2017-10-01

    The impact of surface roughness on flows over aerodynamically designed surfaces is of interest in a number of different fields. It has long been known that surface roughness will likely accelerate the laminar-turbulent transition process by creating additional disturbances in the boundary layer. However, there are very few tools available to predict the effects surface roughness will have on boundary layer flow. There are numerous implications of the premature appearance of a turbulent boundary layer. Increases in local skin friction, boundary layer thickness, and turbulent mixing can impact global flow properties, compounding the effects of surface roughness. With this motivation, an investigation into the effects of surface roughness on boundary layer transition has been conducted. The effort involved both an extensive experimental campaign and the development of a high fidelity roughness model implemented in a RANS solver. Vast amounts of experimental data were generated at the Texas A&M Oran W. Nicks Low Speed Wind Tunnel for the calibration and validation of the roughness model described in this work, as well as future efforts. The present work focuses on the development of the computational model, including a description of the calibration process. The primary methodology presented introduces a scalar field variable and associated transport equation that interacts with a correlation-based transition model. The additional equation allows for non-local effects of surface roughness to be accounted for downstream of rough wall sections while maintaining a "local" formulation. The scalar field is determined through a boundary condition function that has been calibrated to flat plate cases with sand grain roughness. The model was initially tested on a NACA 0012 airfoil with roughness strips applied to the leading edge. Further calibration of the roughness model was performed using results from the companion experimental study on a NACA 633-418 airfoil

  6. A Methodology for Calculating EGS Electricity Generation Potential Based on the Gringarten Model for Heat Extraction From Fractured Rock

    Energy Technology Data Exchange (ETDEWEB)

    Augustine, Chad

    2017-05-01

    Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications on reservoir design are discussed.
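
    A back-of-envelope sketch of the volumetric calculation underlying such estimates is given below; it is not the Gringarten fracture-flow solution itself, and the rock properties, recovery factor and conversion efficiency are assumed values for illustration.

```python
# Back-of-envelope sketch of volumetric EGS electricity potential for 1 km^3 of
# rock (not the Gringarten fracture-flow solution); property values, recovery
# factor and conversion efficiency are assumed.
rho_rock = 2700.0        # rock density [kg/m^3]
c_rock = 1000.0          # rock specific heat [J/(kg K)]
volume = 1.0e9           # reservoir volume [m^3] (1 km^3)

T_rock = 225.0           # initial rock temperature [degC]
T_reject = 80.0          # abandonment / reinjection temperature [degC] (assumed)
recovery = 0.4           # thermal recovery factor implied by engineered fractures
efficiency = 0.12        # thermal-to-electric conversion efficiency (assumed)
lifetime_s = 30 * 365.25 * 24 * 3600.0   # 30-year project life

heat_in_place = rho_rock * c_rock * volume * (T_rock - T_reject)   # [J]
electric_energy = heat_in_place * recovery * efficiency            # [J]
avg_power_mwe = electric_energy / lifetime_s / 1e6                 # [MW_e]

print(f"recoverable heat:        {heat_in_place * recovery / 1e18:.2f} EJ")
print(f"average electric output: {avg_power_mwe:.1f} MW_e per km^3 over 30 years")
```

    Raising or lowering the assumed recovery factor in this sketch shows directly why recovery factors well above 5% lead to the order-of-magnitude larger potentials reported in the abstract.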

  7. The IAEA collaborating centre for neutron activation based methodologies of research reactors

    International Nuclear Information System (INIS)

    Bode, P.

    2010-01-01

    The Reactor Institute Delft of the Delft University of Technology houses the Netherlands' only academic nuclear research reactor, with associated instrumentation and laboratories, for scientific education and research with ionizing radiation. The Institute's swimming pool type research reactor reached first criticality in 1963 and is currently operated at 2 MW thermal power on a 100 h/week basis. The reactor is equipped with neutron mirror guides serving ultra-modern neutron beam physics instruments and with a very bright positron facility. Fully automated gamma-ray spectrometry systems are used by the laboratory for neutron activation analysis, providing large scale services under an ISO/IEC 17025:2005 compliant management system, being (since 1993) the first accredited laboratory of its kind in the world. For several years already, this laboratory has been sustainable by rendering these services to both the public and the private sector. The prime user of the Institute's facilities is the scientific Research Department of Radiation, Radionuclide and Reactors of the Faculty of Applied Sciences, housed inside the building. All reactor facilities are also made available for use by, or for services to, external clients (industry, government, the private sector, and other (international) research institutes and universities). The Reactor Institute Delft was inaugurated in May 2009 as a new IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors. The collaboration involves education, research and development in (I) production of reactor-produced, no-carrier-added radioisotopes of high specific activity via neutron activation; (II) neutron activation analysis with emphasis on automation as well as analysis of large samples, and radiotracer techniques; and, as a cross-cutting activity, (III) quality assurance and management in research and application of research reactor based techniques and in research reactor operations. This collaboration will

  8. Applying distance-to-target weighing methodology to evaluate the environmental performance of bio-based energy, fuels, and materials

    International Nuclear Information System (INIS)

    Weiss, Martin; Patel, Martin; Heilmeier, Hermann; Bringezu, Stefan

    2007-01-01

    The enhanced use of biomass for the production of energy, fuels, and materials is one of the key strategies towards sustainable production and consumption. Various life cycle assessment (LCA) studies demonstrate the great potential of bio-based products to reduce both the consumption of non-renewable energy resources and greenhouse gas emissions. However, the production of biomass requires agricultural land and is often associated with adverse environmental effects such as eutrophication of surface and ground water. Decision making in favor of or against bio-based and conventional fossil product alternatives therefore often requires weighing of environmental impacts. In this article, we apply distance-to-target weighing methodology to aggregate LCA results obtained in four different environmental impact categories (i.e., non-renewable energy consumption, global warming potential, eutrophication potential, and acidification potential) to one environmental index. We include 45 bio- and fossil-based product pairs in our analysis, which we conduct for Germany. The resulting environmental indices for all product pairs analyzed range from -19.7 to +0.2 with negative values indicating overall environmental benefits of bio-based products. Except for three options of packaging materials made from wheat and cornstarch, all bio-based products (including energy, fuels, and materials) score better than their fossil counterparts. Comparing the median values for the three options of biomass utilization reveals that bio-energy (-1.2) and bio-materials (-1.0) offer significantly higher environmental benefits than bio-fuels (-0.3). The results of this study reflect, however, subjective value judgments due to the weighing methodology applied. Given the uncertainties and controversies associated not only with distance-to-target methodologies in particular but also with weighing approaches in general, the authors strongly recommend using weighing for decision finding only as a
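
    The aggregation step described here can be summarized in a minimal sketch: category results are multiplied by distance-to-target weights (current level divided by target level) and summed into one index. The impact values, current levels and targets below are illustrative placeholders, not the study's data for Germany.

```python
# Minimal sketch of distance-to-target aggregation of LCA results into one
# environmental index. Impact values, current levels and policy targets are
# illustrative placeholders, not data from the study.

# Net impact of the bio-based option minus its fossil counterpart, per category
# (negative = the bio-based product performs better), in normalized units.
impact_delta = {
    "non_renewable_energy": -1.8,
    "global_warming":       -1.2,
    "eutrophication":       +0.6,
    "acidification":        +0.1,
}

# Distance-to-target weights: current level divided by target level; categories
# far from their policy target receive a larger weight.
current_level = {"non_renewable_energy": 100.0, "global_warming": 100.0,
                 "eutrophication": 80.0, "acidification": 60.0}
target_level  = {"non_renewable_energy": 70.0, "global_warming": 60.0,
                 "eutrophication": 50.0, "acidification": 45.0}

weights = {k: current_level[k] / target_level[k] for k in impact_delta}
index = sum(weights[k] * impact_delta[k] for k in impact_delta)

for k in impact_delta:
    print(f"{k:22s} weight = {weights[k]:.2f}")
print(f"aggregated index: {index:+.2f} (negative favours the bio-based option)")
```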

  9. Hsp90 depletion goes wild

    Directory of Open Access Journals (Sweden)

    Siegal Mark L

    2012-02-01

    Full Text Available Abstract Hsp90 reveals phenotypic variation in the laboratory, but is Hsp90 depletion important in the wild? Recent work from Chen and Wagner in BMC Evolutionary Biology has discovered a naturally occurring Drosophila allele that downregulates Hsp90, creating sensitivity to cryptic genetic variation. Laboratory studies suggest that the exact magnitude of Hsp90 downregulation is important. Extreme Hsp90 depletion might reactivate transposable elements and/or induce aneuploidy, in addition to revealing cryptic genetic variation. See research article http://www.biomedcentral.com/1471-2148/12/25

  10. A methodology for secure recovery of spacecrafts based on a trusted hardware platform

    Science.gov (United States)

    Juliato, Marcio; Gebotys, Catherine

    2017-02-01

    This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are examined in a wide variety of contexts and shown to be infeasible, independently of the computational power of attackers. Experimental results show that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimal implementation area, power consumption and bandwidth.

  11. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting the business requirements and following the business strategy.

  12. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    Science.gov (United States)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-02-01

    The failure of tractor components and their replacement has become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change, co-simulated with software tools. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for material reduction and removal of rib material to shift the center of gravity and the centroid, using system C for mixed-level simulation and faster topological changes. The design process in system C can be compiled and executed with the TURBO C7 software. The modified component is developed in proE and analyzed in ANSYS. The topologically changed component, with a slot of 120 × 4.75 × 32.5 mm at the center, showed greater effectiveness than the original component.

  13. Application of knowledge tools in training, based on problems’ solving: methodology and it support

    OpenAIRE

    D. V. Kudryavtsev; S. A. Kostousov

    2017-01-01

    The development of information accessibility in the 21st century necessitates the production of one's own knowledge in the learning process, and not just the transfer of information. The computer should be used as a universal tool for working with knowledge, that is, for studying the world, obtaining information, organizing and structuring one's own knowledge and presenting it to other people. The aim of the work is to develop a methodology for the use of tools for working with knowledge in ...

  14. Health and environmental impact of depleted uranium

    International Nuclear Information System (INIS)

    Furitsu, Katsumi

    2010-01-01

    Depleted uranium (DU) is 'nuclear waste' produced from the enrichment process; it is mostly made up of 238U and is depleted in the fissionable isotope 235U compared to natural uranium (NU). Depleted uranium has about 60% of the radioactivity of natural uranium. Depleted uranium and natural uranium are identical in terms of chemical toxicity. Uranium's high density gives depleted uranium shells increased range and penetrative power. This density, combined with uranium's pyrophoric nature, results in a high-energy kinetic weapon that can punch and burn through armour plating. Striking a hard target, depleted uranium munitions create extremely high temperatures. The uranium immediately burns and vaporizes into an aerosol, which is easily diffused in the environment. People can inhale the micro-particles of uranium oxide in such an aerosol and absorb them mainly through the lung. Depleted uranium has both radiological toxicity and chemical toxicity, and a possible synergistic effect of the two kinds of toxicity has also been pointed out. Animal and cellular studies have reported carcinogenic, neurotoxic, immunotoxic and other effects of depleted uranium, including damage to the reproductive system and the foetus. In addition, health effects of micro/nano-particles similar in size to the depleted uranium aerosols produced by uranium weapons have been reported. Aerosolized DU dust can easily spread over the battlefield and into civilian areas, sometimes even crossing international borders. Therefore, not only military personnel but also civilians can be exposed. The contamination continues after the cessation of hostilities. Taking these aspects into account, the DU weapon is illegal under international humanitarian laws and is considered one of the inhumane weapons of 'indiscriminate destruction'. The international society is now discussing the prohibition of DU weapons based on the 'precautionary principle'. The 1991 Gulf War is reportedly the first

  15. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available Uniaxial compressive strength (UCS) describes a material's ability to withstand axially directed compressive forces and is considered one of the most important mechanical properties of rock materials. However, the UCS test is expensive, very time-consuming to perform in the laboratory and requires high-quality core samples of regular geometry. Empirical equations have therefore been proposed for predicting UCS as a function of rocks' index properties. A methodology based on the analytic hierarchy process and multiple regression analysis was used (as opposed to traditional linear regression methods) on data sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine and onyx samples were selected from morphological environments with regard to their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and the index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provides an appropriate alternative for the quantitative estimation of UCS and avoids the need for tedious and time-consuming laboratory testing.
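
    A minimal sketch of the regression step described above, assuming NumPy: UCS is fitted against a few rock index properties by least squares. The index properties, sample values and resulting coefficients are hypothetical placeholders, not the study's data.

        import numpy as np

        # columns: porosity (%), P-wave velocity (km/s), point-load index (MPa)
        X = np.array([[2.1, 5.2, 4.8],
                      [5.4, 4.1, 3.2],
                      [9.8, 3.3, 2.1],
                      [1.2, 5.9, 5.6],
                      [7.0, 3.8, 2.6]])
        ucs = np.array([110.0, 74.0, 41.0, 128.0, 58.0])   # measured UCS (MPa)

        A = np.hstack([np.ones((X.shape[0], 1)), X])        # add intercept column
        coeffs, *_ = np.linalg.lstsq(A, ucs, rcond=None)    # UCS = b0 + b1*x1 + b2*x2 + b3*x3
        predicted = A @ coeffs
        print(coeffs)
        print(np.corrcoef(ucs, predicted)[0, 1])            # measured-vs-predicted correlation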



  16. A multicriteria-based methodology for site prioritisation in sediment management.

    Science.gov (United States)

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selections of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas, according to their need for sediment management, provides a great opportunity for prioritisation, a first step in an integrated methodology that finally aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allows ranking of these management units, according to their need for remediation. This proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
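
    A hedged sketch of the ranking step only: a simple weighted-sum multicriteria score per management unit. The paper's actual MCA methods, criteria and weights are not reproduced; everything below is illustrative.

        # Weighted-sum MCA sketch: rank management units by need for remediation.
        criteria_weights = {"sediment_quality": 0.50, "social": 0.25, "economic": 0.25}

        units = {
            "unit_A": {"sediment_quality": 0.9, "social": 0.4, "economic": 0.6},
            "unit_B": {"sediment_quality": 0.3, "social": 0.8, "economic": 0.7},
            "unit_C": {"sediment_quality": 0.7, "social": 0.6, "economic": 0.2},
        }

        def priority(scores):
            # higher aggregate score = higher priority for sediment management
            return sum(weight * scores[criterion] for criterion, weight in criteria_weights.items())

        ranking = sorted(units, key=lambda unit: priority(units[unit]), reverse=True)
        print(ranking)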

  17. Design-based research as a “smart” methodology for studying learning in the context of work

    DEFF Research Database (Denmark)

    Kolbæk, Ditte

    Although Design-based Research (DBR) was developed for investigating classroom training, this paper discusses methodological issues when DBR is employed for investigating learning in the context of work, as it is an authentic learning environment, a real-world setting for fostering learning...... and creating usable knowledge and knowing. The purpose of this paper is to provide new perspectives on DBR regarding how to conduct DBR for studying learning from experience in the context of work. The research question is: What should be considered to make DBR a smart methodology for exploring learning from experience...... in the context of work? The exploration of DBR is based on a literature review and experience with DBR in the context of work....

  18. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    Science.gov (United States)

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. Evaluating the reliability of available data is therefore of significant importance when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time of various opportunities for improvement. In this context, a new methodology based on Multi-Criteria Decision Analysis (MCDA) techniques has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but can easily be tailored to other environmental compartments (soil, air, sediments). Copyright © 2015 Elsevier B.V. All rights reserved.

  19. EMD-Based Methodology for the Identification of a High-Speed Train Running in a Gear Operating State

    Directory of Open Access Journals (Sweden)

    Alejandro Bustos

    2018-03-01

    Full Text Available Efficient maintenance is a key consideration in railway transport systems, especially in high-speed trains, in order to avoid accidents with catastrophic consequences. In this sense, having a method that allows for the early detection of defects in critical elements, such as the bogie mechanical components, is crucial for increasing the availability of rolling stock and reducing maintenance costs. The main contribution of this work is the proposal of a methodology that, based on classical signal processing techniques, provides a set of parameters for the fast identification of the operating state of a critical mechanical system. With this methodology, the vibratory behaviour of a very complex mechanical system is characterised through variable inputs, which allows for the detection of possible changes in the mechanical elements. This methodology is applied to a real high-speed train in commercial service, with the aim of studying the vibratory behaviour of the train (specifically, the bogie) before and after a maintenance operation. The results obtained with this methodology demonstrate the usefulness of the new procedure and reveal reductions of between 15% and 45% in the spectral power of selected Intrinsic Mode Functions (IMFs) after the maintenance operation.

  20. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues.

    Science.gov (United States)

    Lago, M A; Rúperez, M J; Martínez-Martínez, F; Martínez-Sanchis, S; Bakic, P R; Monserrat, C

    2015-11-30

    This paper presents a novel methodology for estimating in vivo the elastic constants of a constitutive model proposed to characterize the mechanical behavior of breast tissues. An iterative search algorithm based on genetic heuristics was constructed to estimate these parameters in vivo using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can easily be extended to characterize the real biomechanical behavior of breast tissues, which represents a great novelty in the field of breast behavior simulation for applications such as surgical planning, surgical guidance or cancer diagnosis. This reveals the impact and relevance of the presented work.

  1. EMD-Based Methodology for the Identification of a High-Speed Train Running in a Gear Operating State.

    Science.gov (United States)

    Bustos, Alejandro; Rubio, Higinio; Castejón, Cristina; García-Prada, Juan Carlos

    2018-03-06

    Efficient maintenance is a key consideration in railway transport systems, especially in high-speed trains, in order to avoid accidents with catastrophic consequences. In this sense, having a method that allows for the early detection of defects in critical elements, such as the bogie mechanical components, is crucial for increasing the availability of rolling stock and reducing maintenance costs. The main contribution of this work is the proposal of a methodology that, based on classical signal processing techniques, provides a set of parameters for the fast identification of the operating state of a critical mechanical system. With this methodology, the vibratory behaviour of a very complex mechanical system is characterised through variable inputs, which allows for the detection of possible changes in the mechanical elements. This methodology is applied to a real high-speed train in commercial service, with the aim of studying the vibratory behaviour of the train (specifically, the bogie) before and after a maintenance operation. The results obtained with this methodology demonstrate the usefulness of the new procedure and reveal reductions of between 15% and 45% in the spectral power of selected Intrinsic Mode Functions (IMFs) after the maintenance operation.
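
    A sketch of the core before/after comparison, assuming the third-party PyEMD package (EMD-signal) is available: decompose a vibration signal into Intrinsic Mode Functions and compare per-IMF spectral power before and after maintenance. The synthetic signals stand in for measured bogie accelerations.

        import numpy as np
        from PyEMD import EMD   # assumed dependency: pip install EMD-signal

        def imf_spectral_power(signal):
            imfs = EMD()(signal)                               # empirical mode decomposition
            return (np.abs(np.fft.rfft(imfs, axis=1)) ** 2).sum(axis=1)

        fs = 1000.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        before = np.sin(2 * np.pi * 35 * t) + 0.6 * np.sin(2 * np.pi * 180 * t)
        after = np.sin(2 * np.pi * 35 * t) + 0.3 * np.sin(2 * np.pi * 180 * t)

        p_before, p_after = imf_spectral_power(before), imf_spectral_power(after)
        n = min(len(p_before), len(p_after))
        print(1.0 - p_after[:n] / p_before[:n])                # fractional power reduction per IMF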

  2. Twenty years of Internet-based research at SCiP: A discussion of surviving concepts and new methodologies.

    Science.gov (United States)

    Wolfe, Christopher R

    2017-10-01

    This discussion of the symposium 20 Years of Internet-Based Research at SCiP: Surviving Concepts, New Methodologies compares the issues faced by the pioneering Internet-based psychology researchers who presented at the first symposia on the topic, at the 1996 annual meeting of the Society for Computers in Psychology, to the issues facing researchers today. New methodologies unavailable in the early days of Web-based psychological research are discussed, with an emphasis on mobile computing with smartphones that capitalizes on capabilities such as touch screens and gyro sensors. A persistent issue spanning the decades has been the challenge of conducting scientific research with consumer-grade electronics. In the 1996 symposia on Internet-based research, four advantages were identified: easy access to a geographically unlimited subject population, including subjects from very specific and previously inaccessible target populations; bringing the experiment to the subject; high statistical power through large sample size; and reduced cost. In retrospect, it appears that Internet-based research has largely lived up to this early promise, with the possible exception of sample size, since the public demand for controlled psychology experiments has not always been greater than the supply offered by researchers. There are many reasons for optimism about the future of Internet-based research. However, unless courses and textbooks on psychological research methods begin to give Web-based research the attention it deserves, the future of Internet-based psychological research will remain in doubt.

  3. An Indicator Based Assessment Methodology Proposal for the Identification of Domestic Systemically Important Banks within the Turkish Banking Sector

    OpenAIRE

    Ozge ULKUTAS SACCI; Guven SAYILGAN

    2014-01-01

    This study aims to identify domestic systemically important banks (D-SIBs) operating within the Turkish Banking Sector. In this regard, adopting an indicator-based assessment methodology together with a cluster analysis, banks in the sample are classified in terms of their degree of systemic importance using publicly available year-end data for 2012. The study has shown that a total of 7 banks with the highest systemic importance clustered away from the remaining 21 banks in th...
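
    An illustrative sketch of the clustering step, assuming scikit-learn: banks described by indicator-based systemic-importance scores are split into two clusters, and the high-score cluster is read as the D-SIB group. The bank scores below are hypothetical.

        import numpy as np
        from sklearn.cluster import KMeans

        scores = np.array([          # rows: banks; cols: size, interconnectedness, substitutability, complexity
            [0.95, 0.90, 0.85, 0.80],
            [0.90, 0.85, 0.80, 0.75],
            [0.20, 0.15, 0.25, 0.10],
            [0.15, 0.10, 0.20, 0.05],
            [0.10, 0.12, 0.15, 0.08],
        ])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
        print(labels)                # banks sharing the high-score cluster are the D-SIB candidates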

  4. Development of the point-depletion code DEPTH

    Energy Technology Data Exchange (ETDEWEB)

    She, Ding [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Wang, Kan, E-mail: wangkan@mail.tsinghua.edu.cn [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yu, Ganglin [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2013-05-15

    Highlights: ► The DEPTH code has been developed for large-scale depletion systems. ► DEPTH uses a data library that is convenient to couple with MC codes. ► TTA and matrix exponential methods are implemented and compared. ► DEPTH is able to calculate integral quantities based on the matrix inverse. ► Code-to-code comparisons prove the accuracy and efficiency of DEPTH. -- Abstract: Burnup analysis is an important aspect of reactor physics, which is generally done by coupling transport calculations and point-depletion calculations. DEPTH is a newly developed point-depletion code for handling large burnup depletion systems and detailed depletion chains. For better coupling with Monte Carlo transport codes, DEPTH uses data libraries based on the combination of ORIGEN-2 and ORIGEN-S and allows users to assign problem-dependent libraries for each depletion step. DEPTH implements various algorithms for treating stiff depletion systems, including the Transmutation Trajectory Analysis (TTA), the Chebyshev Rational Approximation Method (CRAM), the Quadrature-based Rational Approximation Method (QRAM) and the Laguerre Polynomial Approximation Method (LPAM). Three different modes are supported by DEPTH to execute decay, constant-flux and constant-power calculations. In addition to obtaining the instantaneous quantities of radioactivity, decay heat and reaction rates, DEPTH is able to calculate integral quantities with a time-integrated solver. Through calculations compared with ORIGEN-2, the validity of DEPTH in point-depletion calculations is proved. The accuracy and efficiency of the depletion algorithms are also discussed. In addition, an actual pin-cell burnup case is calculated to illustrate the DEPTH code performance in coupling with the RMC Monte Carlo code.
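
    A generic point-depletion sketch (not DEPTH itself): the stiff system dN/dt = A*N is advanced over one step with a matrix exponential, with SciPy's expm standing in for specialised solvers such as CRAM or TTA. The three-nuclide chain and its rates are illustrative only.

        import numpy as np
        from scipy.linalg import expm

        lam1, lam2 = 1.0e-5, 3.0e-6        # decay constants (1/s)
        sigphi1 = 2.0e-9                    # capture rate sigma*phi for nuclide 1 (1/s)

        A = np.array([                      # chain: nuclide 1 -> 2 -> 3 (stable)
            [-(lam1 + sigphi1),  0.0,   0.0],
            [  lam1 + sigphi1,  -lam2,  0.0],
            [  0.0,              lam2,  0.0],
        ])

        N0 = np.array([1.0e24, 0.0, 0.0])   # initial nuclide densities (atoms)
        dt = 30.0 * 24.0 * 3600.0           # one 30-day depletion step
        print(expm(A * dt) @ N0)            # nuclide densities at the end of the step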

  5. A new methodology for estimating rainfall aggressiveness risk based on daily rainfall records for multi-decennial periods.

    Science.gov (United States)

    García-Barrón, Leoncio; Morales, Julia; Sousa, Arturo

    2018-02-15

    The temporal irregularity of rainfall, characteristic of a Mediterranean climate, corresponds to the irregularity of environmental effects on soil. We used aggressiveness as an indicator to quantify the potential environmental impact of rainfall. However, quantifying rainfall aggressiveness is constrained by the lack of the sub-hourly frequency records on which intensity models are based. Volume models, on the other hand, are characterized by a lack of precision in the treatment of heavy rainfall events because they are based on monthly series. Therefore, in this study, we propose a new methodology for estimating rainfall aggressiveness risk. A new synthesis parameter is defined, based on a reformulation of the Modified Fournier and Oliver's Precipitation Concentration indices using daily data. The weighting of both indices for calculating the aggressiveness risk is established by multiple regression with respect to the local erosion R factor estimated in recent decades. We conclude that the proposed methodology overcomes the previously mentioned limitations of the traditional intensity and volume models and provides accurate information; it is therefore appropriate for determining potential rainfall impact over long time periods. Specifically, we applied this methodology to the daily rainfall time series from the San Fernando Observatory (1870-2010) in southwest Europe. An interannual aggressiveness risk series was generated, which allowed analysis of its evolution and determination of its temporal variability. The results imply that environmental management can use data from long-term historical series as a reference for decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
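
    For orientation, a sketch of the two classical indices the new parameter builds on, computed here from monthly totals; the paper's daily-data reformulation and regression-based weighting are not reproduced.

        def modified_fournier_index(monthly_precip):
            annual = sum(monthly_precip)
            return sum(p ** 2 for p in monthly_precip) / annual

        def precipitation_concentration_index(monthly_precip):   # Oliver's PCI
            annual = sum(monthly_precip)
            return 100.0 * sum(p ** 2 for p in monthly_precip) / annual ** 2

        monthly = [85, 70, 60, 55, 30, 10, 2, 5, 25, 70, 95, 90]  # illustrative monthly totals (mm)
        print(modified_fournier_index(monthly), precipitation_concentration_index(monthly))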

  6. Impact of mineral resource depletion

    CSIR Research Space (South Africa)

    Brent, AC

    2006-09-01

    Full Text Available In a letter to the editor, the authors comment on BA Steen's article on "Abiotic Resource Depletion: different perceptions of the problem with mineral deposits" published in the special issue of the International Journal of Life Cycle Assessment...

  7. Ligand and structure-based methodologies for the prediction of the activity of G protein-coupled receptor ligands

    Science.gov (United States)

    Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.

    2009-11-01

    Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.

  8. It is chloride depletion alkalosis, not contraction alkalosis.

    Science.gov (United States)

    Luke, Robert G; Galla, John H

    2012-02-01

    Maintenance of metabolic alkalosis generated by chloride depletion is often attributed to volume contraction. In balance and clearance studies in rats and humans, we showed that chloride repletion in the face of persisting alkali loading, volume contraction, and potassium and sodium depletion completely corrects alkalosis by a renal mechanism. Nephron segment studies strongly suggest the corrective response is orchestrated in the collecting duct, which has several transporters integral to acid-base regulation, the most important of which is pendrin, a luminal Cl⁻/HCO₃⁻ exchanger. Chloride depletion alkalosis should replace the notion of contraction alkalosis.

  9. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  10. A problem-based approach to teaching research methodology to medical graduates in Iran

    Directory of Open Access Journals (Sweden)

    Mehrdad Jalalian Hosseini

    2009-08-01

    Full Text Available Physicians are reticent to participate in research projects for a variety of reasons. Facilitating the active involvement of doctors in research projects is a high priority for the Iranian Blood Transfusion Organization (IBTO). A one-month training course on research methodology was conducted for a group of physicians in Mashhad, in northeast Iran. The participants were divided into ten groups. They prepared a research proposal under the guidance of a workshop leader. The quality of the research proposals, which were prepared by all participants, went beyond our expectations. All of the research proposals were relevant to blood safety. In this brief report we describe our approach.

  11. Near-IR laser cleaning of Cu-based artefacts: a comprehensive study of the methodology standardization

    DEFF Research Database (Denmark)

    Hrnjic, Mahir

    2015-01-01

    In this study, laser cleaning was performed with near-IR lasers on artificially aged copper specimens and on two copper coins from Bubastis (Egypt) in order to remove the patinas in a totally non-invasive way. Different irradiances and numbers of passes were utilised and compared. Treated surface...... Laser cleaning, as a conservation technique, is a selective, precise and minimally intrusive method of removing corrosion product layers. Nevertheless, in order to optimise this method as a standard conservation technique, it is still necessary to define different laser cleaning methodologies...

  12. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): the generalised linear spatial model....... Conditioned on an underlying and unobserved Gaussian process, the observations at the measured locations follow a generalised linear model. Concerning inference, Markov chain Monte Carlo methods are used. The study of these models is the main topic of the thesis. Construction of priors, and the use of flat...... contains functions for inference in generalised linear spatial models.    ...

  13. PCR-based methodologies for detection and characterization of Listeria monocytogenes and Listeria ivanovii in foods and environmental sources

    Directory of Open Access Journals (Sweden)

    Jin-Qiang Chen

    2017-06-01

    Full Text Available Listeria monocytogenes is an important foodborne pathogen responsible for listeriosis, a fatal disease. It is widely distributed in various foods and environmental sources. In this review, we focus on PCR-based technologies, including conventional PCR, qPCR and droplet digital PCR (ddPCR). Specifically, we describe (a) conventional PCR and mono-, duplex- and multiplex-qPCR methodologies; (b) the development and applications of HlyA-, Iap-, PrfA- and SsrA-gene-based conventional and qPCR assays, as well as PCR assays targeting newly identified gene targets, for the specific detection of L. monocytogenes; the differentiation of viable from dead L. monocytogenes by qPCR in conjunction with propidium monoazide pretreatment; PCR-based serotype identification of L. monocytogenes isolates; PCR-based detection of L. ivanovii, which infects ruminants, and differentiation of L. monocytogenes from other Listeria species; and sigB-gene-based PCR identification of Listeria spp.; (c) applications of ddPCR in the detection of L. monocytogenes; and (d) applications of qPCR assays in the detection and subtyping of L. monocytogenes in milk and dairy products; meats, meat products and meat-processing environments; and seafood, seafood products and processing environments. Our goal is to provide a relatively comprehensive overview of the PCR-based methodologies available for the detection, characterization and subtyping of various strains of L. monocytogenes in foods and environmental sources.

  14. A METHODOLOGY BASED ON AN ECOLOGICAL ECONOMY APPROACH FOR THE INTEGRATING MANAGEMENT OF THE SULPHUROUS WATER IN AN OIL REFINERY

    Directory of Open Access Journals (Sweden)

    Gabriel Orlando Lobelles Sardiñas

    2016-10-01

    Full Text Available Despite the current highly stringent international standards regulating contaminating emissions to the environment, the oil refinery of Cienfuegos still generates liquid and gaseous emissions that contaminate the environment. The construction of new units as part of the refinery's expansion leads to an increase in these emissions due to the lack of technologies for the reutilization of sulphurous water. The objective of this paper is to propose a methodology for the integral management of the sulphurous residual water in the oil refining process, including the evaluation and selection of the most feasible technological variant to minimize the sulphur contamination of water and the resulting emissions during the process. The methodology is based on ecological economy tools, allowing a comprehensive evaluation of six technological variants at the refinery of Cienfuegos. Life Cycle Assessment (ACV by its Spanish acronym) was applied by means of the SimaPro 7.1 software and evaluated through the Eco Speed method to minimize the possible uncertainty. An economic evaluation was performed, taking into account the external costs for a more comprehensive analysis and enabling, along with the ecological indicators, the selection of the best technological variant. With the implementation of the chosen variant (V5), 98.27% of the process water was recovered, sulphur recovery increased from 94 to 99.8%, and emissions were reduced from 12,200 to 120 mg/Nm3 as SO2.

  15. A Methodology for Mapping Meanings in Text-Based Sustainability Communication

    Directory of Open Access Journals (Sweden)

    Mark Brown

    2013-06-01

    Full Text Available In moving society towards more sustainable forms of consumption and production, social learning must play an important role. Making the assumption that it occurs as a consequence of changes in understanding, this article presents a methodology for mapping meanings in sustainability communication texts. The methodology uses techniques from corpus linguistics and framing theory. Two large databases of text were constructed by copying material from the websites of two different groups of social actors, (i) environmental NGOs and (ii) British green business, and saving it as .txt files. The findings on individual words show that the NGOs and business use them very differently. Focusing on words expressing concern for the natural environment, it is proposed that the two actors also conceptualize their concern differently. Green business's cognitive system of concern has two well-developed frames: good intentions and risk management. However, three frames (concern for the natural environment, perception of the damage, and responsibility) are light on detail. In contrast, within the NGOs' system of concern, the frames of concern for the natural environment, perception of the damage and responsibility contain words making detailed representations.

  16. Preparation of guides, on mechanics of fluids, for the physics teaching based on the investigatory methodology

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Loreto Mora; Buzzo, Ricardo; Martinez-Mardones, Javier; Romero, Angel [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso Av. Brasil 2950, Valparaiso (Chile)], E-mail: jmartine@ucv.cl

    2008-11-01

    The challenges of the present educational reform emphasize the professional character of the teacher in the planning of classes, the execution of activities and the evaluation of learning. A set of planned activities is no more a science class than a pile of bricks is a house; that is, if the students' preconceptions, the daily realities they face and the expectations they bring to the science class are not taken into account, the proposed objectives cannot be achieved. The well-known investigatory method, applied to science education, approaches the conceptual contents through easily reproduced practical activities that are effectively guided towards the topics to be taught. The OPPS guides (Operation Primary Physics Science) of Louisiana University are excellent examples of the application of this methodology, both in the material intended for the students and in the material guiding the learning activities (called Leader of the class). This international experience, within the framework of the Plans and Programs of the Ministry of Education of Chile (MINEDUC), is the main axis of this work, which considers the preparation of guides with this methodology, addressing the contents of a unit of the common physics plan for the third grade of high school.

  17. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    International Nuclear Information System (INIS)

    Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.

    1984-01-01

    Since 1982 a joint study has been under way at the CEC, with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different countries of the E.C.

  18. A New Methodology Based on Imbalanced Classification for Predicting Outliers in Electricity Demand Time Series

    Directory of Open Access Journals (Sweden)

    Francisco Javier Duque-Pintor

    2016-09-01

    Full Text Available The occurrence of outliers in real-world phenomena is quite common. If these anomalous data are not properly treated, unreliable models can be generated. Many approaches in the literature are focused on a posteriori detection of outliers. However, a new methodology to predict the occurrence of such data a priori is proposed here. Thus, the main goal of this work is to predict the occurrence of outliers in time series by using, for the first time, imbalanced classification techniques. In this sense, the problem of forecasting outlying data has been transformed into a binary classification problem, in which the positive class represents the occurrence of outliers. Given that the number of outliers is much lower than the number of common values, the resultant classification problem is imbalanced. To create training and test sets, robust statistical methods have been used to detect outliers in both sets. Once the outliers have been detected, the instances of the dataset are labeled accordingly. Namely, if any of the samples composing the next instance are detected as outliers, the label is set to one. As a case study, the methodology has been tested on electricity demand time series from the Spanish electricity market, in which most of the outliers were properly forecast.
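
    A sketch of the labelling-plus-imbalanced-classification idea, assuming scikit-learn: an instance is labelled positive if its next value is an outlier under a robust interquartile-range rule, and a classifier with class weighting offsets the imbalance. The data below are synthetic, not the Spanish market series.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        series = rng.normal(100.0, 5.0, 2000)
        series[rng.choice(2000, 20, replace=False)] += 60.0          # inject rare outliers

        q1, q3 = np.percentile(series, [25, 75])
        is_outlier = (series < q1 - 3 * (q3 - q1)) | (series > q3 + 3 * (q3 - q1))

        window = 24
        X = np.array([series[i - window:i] for i in range(window, len(series))])
        y = is_outlier[window:].astype(int)                          # positive class = outlier ahead

        clf = RandomForestClassifier(class_weight="balanced", random_state=0).fit(X, y)
        print(y.mean(), clf.score(X, y))                             # imbalance ratio and in-sample accuracy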

  19. Application of a methodology based on the Theory of Constraints in the sector of tourism services

    Directory of Open Access Journals (Sweden)

    Reyner Pérez Campdesuñer

    2017-04-01

    Full Text Available Purpose: The objective of the research was to implement the theory of constraints under the operating conditions of a hotel, which differ from those of the traditional processes to which this method has been applied because of the great heterogeneity of resources needed to meet customer demand. Design/methodology/approach: To achieve this purpose, a method of generating conversion equations was developed that expresses all the resources of the organization under study as a function of the number of customers to serve, facilitating comparison between the different resources and the demand estimated through traditional forecasting techniques; these features were integrated into the classical methodology of the theory of constraints. Findings: The application of the tools designed for hospitality organizations demonstrated the applicability of the theory of constraints to entities operating under conditions different from the usual ones, produced a set of conversion equations for the different resources that facilitates comparison with demand and, consequently, improved the levels of efficiency and effectiveness of the organization. Originality/value: The originality of the research lies in the application of the theory of constraints under conditions very different from the usual ones, covering 100% of the processes and resources in hospitality organizations.

  20. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    Luz, Andre Ferreira da

    2009-01-01

    In this work, a Particle Swarm Optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology, which allows the use of flexible intervals between maintenance interventions instead of the usual fixed periods, allows better adaptation of the schedule to the failure rates of components under aging. At the same time, because of this flexibility, the planning of preventive maintenance becomes a more difficult task. Motivated by the fact that PSO has proved to be very competitive compared to other optimization tools, this work investigates the use of PSO as an alternative optimization tool. Considering that PSO works in a real and continuous space, it is a challenge to use it for discrete optimization, in which the schedule may comprise a variable number of maintenance interventions. The PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, the costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves, the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding the optimum preventive maintenance policies for the HPIS. (author)
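
    A minimal PSO sketch in the spirit of the record above: particles move in a continuous space of per-component maintenance intervals, minimising a toy cost that trades failure risk against preventive-maintenance cost. The cost model and all constants are illustrative, not the HPIS case data.

        import numpy as np

        def cost(intervals):                                   # intervals in days, one per component
            failure_risk = np.sum(np.exp(intervals / 200.0))   # longer interval -> higher failure risk
            pm_cost = np.sum(365.0 / intervals)                # shorter interval -> more PM actions
            return failure_risk + pm_cost

        rng = np.random.default_rng(1)
        n_particles, dim = 30, 3
        pos = rng.uniform(30.0, 365.0, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(200):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 30.0, 365.0)
            vals = np.array([cost(p) for p in pos])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()

        print(gbest, cost(gbest))                              # best intervals found and their cost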

  1. Depletion of the Complex Multiple Aquifer System of Jordan

    Science.gov (United States)

    Rödiger, T.; Siebert, C.; Geyer, S.; Merz, R.

    2017-12-01

    In many countries worldwide, water scarcity poses a significant risk to the environment and the socio-economy. Particularly in countries where the available water resources are strongly limited by climatic conditions, an accurate determination of the available water resources is a high priority, especially when the water supply predominantly relies on groundwater resources and their recharge. If groundwater abstraction exceeds the natural groundwater recharge in heavily used well-field areas, overexploitation or persistent groundwater depletion occurs. This is the case in the Kingdom of Jordan, where a multi-layer aquifer complex forms the eastern subsurface catchment of the Dead Sea basin. Since the beginning of the industrial and agricultural development of the country, dramatically falling groundwater levels, the disappearance of springs and saltwater intrusions from deeper aquifers have been documented nationwide. The total water budget is influenced by (i) a strong climatic gradient from hyperarid to semiarid and (ii) intense anthropogenic abstraction. For this multi-layered aquifer system we developed a methodology to evaluate groundwater depletion by linking a hydrological model and a numerical flow model, including estimates of groundwater abstraction. Here, we define groundwater depletion as the rate of groundwater abstraction in excess of the natural recharge rate. On this basis, we calculated groundwater depletion ranging from 0% in the eastern Hamad basin to around 40% in the central part of Jordan and to extreme values of 100% in the Azraq and Disi basins.
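
    A sketch of the depletion measure as defined in this record: abstraction in excess of natural recharge. Normalising the excess by abstraction (an assumption made here) reproduces the quoted 0-100% range; the basin figures below are illustrative, not the study's data.

        def depletion_fraction(abstraction, recharge):
            # abstraction and recharge in the same volume units (e.g. MCM/yr)
            if abstraction <= 0.0:
                return 0.0
            return max(0.0, abstraction - recharge) / abstraction

        basins = {"basin_low": (40.0, 55.0), "basin_mid": (120.0, 72.0), "basin_high": (60.0, 0.0)}
        for name, (abstraction, recharge) in basins.items():
            print(name, round(100.0 * depletion_fraction(abstraction, recharge), 1), "%")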

  2. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Science.gov (United States)

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction, such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology, the unified modeling language (UML), has been used for this development. This research has demonstrated the feasibility of developing agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place.

  3. A Comparative Depletion Analysis using MCNP6 and REBUS-3 for Advanced SFR Burner Core

    Energy Technology Data Exchange (ETDEWEB)

    You, Wu Seung; Hong, Ser Gi [Kyung Hee University, Yongin (Korea, Republic of)

    2016-05-15

    In this paper, we evaluated the accuracy of fast reactor design codes by comparing MCNP6-based Monte Carlo simulation with REBUS-3-based nodal transport calculations for the initial cycle of an advanced uranium-free fueled SFR burner core having large heterogeneities. It was shown that the nodal diffusion calculation in REBUS-3 gave a large difference of 2132 pcm in the initial k-effective value when compared with the MCNP6 depletion calculation using a heterogeneous model. Validation of the code system for fast reactor design is an important research topic. In our previous studies, depletion analysis and evaluation of physics parameters of fast reactor cores were done with the REBUS-3 and DIF3D codes, respectively. In particular, the depletion analysis was done with lumped fission products. However, the accuracy of these calculation methodologies needs to be verified using Monte Carlo neutron transport calculations coupled with an explicit treatment of fission products. In this study, the accuracy of the fast reactor design codes and procedures was evaluated using the MCNP6 code and VARIANT nodal transport calculations for the initial cycle of an advanced sodium-cooled burner core loaded with uranium-free fuels. It was concluded that the REBUS-3 nodal diffusion option cannot be used to accurately perform the depletion calculations, and that the VARIANT nodal transport or VARIANT SP3 options are required for this kind of heterogeneous burner core loaded with uranium-free fuel. The control rod worths obtained with the nodal diffusion and transport options were estimated with discrepancies of less than 12%, while these methods gave large discrepancies of 12.2% and 16.9%, respectively, for the sodium void worth at BOC. It is considered that these large discrepancies in sodium void worth result from the inaccurate treatment of spectrum change in the multi-group cross sections.

  4. Methodologic quality assessment of red blood cell transfusion guidelines and the evidence base of more restrictive transfusion thresholds.

    Science.gov (United States)

    Van Remoortel, Hans; De Buck, Emmy; Dieltjens, Tessa; Pauwels, Nele S; Compernolle, Veerle; Vandekerckhove, Philippe

    2016-02-01

    Recent literature suggests that more restrictive red blood cell (RBC) transfusion practices are equivalent to or better than more liberal transfusion practices. The methodologic quality of guidelines recommending more restrictive transfusion thresholds, and of their underlying scientific evidence, is unclear. Therefore, we aimed to evaluate the quality of the development process of RBC transfusion guidelines and to investigate the underlying evidence of guidelines recommending a more restrictive hemoglobin (Hb) threshold. Via systematic literature screening of relevant databases (NGC, GIN, Medline, and Embase), RBC transfusion guidelines recommending a more restrictive Hb threshold were identified, and their methodologic quality was evaluated by scoring the rigor of development domain (AGREE II checklist). The level of evidence served as a reference for the quality of the underlying evidence. The methodologic quality of 13 RBC transfusion guidelines was variable (18%-72%) but highest for those developed by Advancing Transfusion and Cellular Therapies Worldwide (72%), the Task Force of Advanced Bleeding Care in Trauma (70%), and the Dutch Institute for Healthcare Improvement (61%). An Hb level of less than 7 g/dL (intensive care unit patients) or less than 8 g/dL (postoperative patients) were the only thresholds based on high-quality evidence. Only four of 32 recommendations had a high-quality evidence base. Methodologic quality should be guaranteed in future RBC transfusion guideline development to ensure that the best available evidence is captured when recommending restrictive transfusion strategies. More high-quality trials are needed to provide a stronger scientific basis for RBC transfusion guidelines that recommend more restrictive transfusion thresholds. © 2015 AABB.

  5. On Competitiveness of Nearest-Neighbor-Based Music Classification: A Methodological Critique

    DEFF Research Database (Denmark)

    Pálmason, Haukur; Jónsson, Björn Thór; Amsaleg, Laurent

    2017-01-01

    The traditional role of nearest-neighbor classification in music classification research is that of a straw man opponent for the learning approach of the hour. Recent work in high-dimensional indexing has shown that approximate nearest-neighbor algorithms are extremely scalable, yielding results...... of reasonable quality from billions of high-dimensional features. With such efficient large-scale classifiers, the traditional music classification methodology of aggregating and compressing the audio features is incorrect; instead the approximate nearest-neighbor classifier should be given an extensive data...... collection to work with. We present a case study, using a well-known MIR classification benchmark with well-known music features, which shows that a simple nearest-neighbor classifier performs very competitively when given ample data. In this position paper, we therefore argue that nearest...
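
    A sketch of the position argued above, assuming scikit-learn: a plain k-nearest-neighbour classifier is trained on many unaggregated frame-level feature vectors and a track is labelled by majority vote over its frames. Features and classes are synthetic stand-ins for real audio features such as MFCCs.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        X_train = np.vstack([rng.normal(loc=g, scale=1.0, size=(500, 20)) for g in range(4)])
        y_train = np.repeat(np.arange(4), 500)                  # four "genres"

        clf = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)
        X_test = rng.normal(loc=2.0, scale=1.0, size=(10, 20))  # frames from one unknown track
        frame_votes = clf.predict(X_test)
        print(np.bincount(frame_votes).argmax())                # majority vote over the track's frames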

  6. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud

    2010-10-04

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 pC/N, respectively. The d33 value is in agreement with that obtained from the Berlincourt method, which gave a d33 value of 500 pC/N. In addition, the d31 value is in agreement with the value obtained from the optical method, which gave a d31 value of 223 pC/V. These results suggest that the proposed method is a viable way to quickly estimate piezoelectric coefficients of bulk unclamped samples. © 2010 The Electrochemical Society.

  7. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Karandikar, R.G.; Khaparde, S.A.; Kulkarni, S.V. [Electrical Engineering Department, Indian Institute of Technology Bombay, Mumbai 400 076 (India)

    2007-12-15

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30-bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is demonstrated to determine the retail electricity price for the end users. The factor Risk Adjusted Recovery on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting a proposal for electricity tariffs to the regulatory authority. (author)

  8. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    International Nuclear Information System (INIS)

    Karandikar, R.G.; Khaparde, S.A.; Kulkarni, S.V.

    2007-01-01

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30-bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is demonstrated to determine the retail electricity price for the end users. The factor Risk Adjusted Recovery on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting a proposal for electricity tariffs to the regulatory authority. (author)
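
    A sketch of the two building blocks named in this record: CAPM gives the required (hurdle) return on the retailer's invested capital, and RAROC expresses the risk-adjusted performance of a tariff against the capital placed at risk. All figures below are illustrative.

        def capm_required_return(risk_free, beta, market_return):
            return risk_free + beta * (market_return - risk_free)

        def raroc(revenue, cost, expected_loss, capital_at_risk):
            return (revenue - cost - expected_loss) / capital_at_risk

        hurdle = capm_required_return(risk_free=0.06, beta=1.2, market_return=0.12)
        measure = raroc(revenue=1.00e6, cost=0.88e6, expected_loss=0.03e6, capital_at_risk=0.45e6)
        print(hurdle, measure, measure > hurdle)   # the tariff is adequate if RAROC clears the CAPM hurdle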

  9. Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology

    Science.gov (United States)

    Kumar, Amit; Soota, Tarun; Kumar, Jitendra

    2018-03-01

    Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with the Grey relational analysis method has been proposed and used to optimise the machining parameters of WEDM. A face-centred cubic design is used for conducting experiments on high speed steel (HSS) M2 grade workpiece material. The regression model of significant factors such as pulse-on time, pulse-off time, peak current, and wire feed is considered for optimising the response variables: material removal rate (MRR), surface roughness and kerf width. The optimal condition of the machining parameters was obtained using the Grey relational grade. ANOVA is applied to determine the significance of the input parameters for optimising the Grey relational grade.
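
    The grey relational step described above can be sketched as follows; the response data, the normalization directions, and the distinguishing coefficient ζ = 0.5 are illustrative assumptions, not the paper's measurements.

```python
import numpy as np

# Hedged sketch of Grey relational analysis: three responses per experimental run
# (MRR: larger-the-better; surface roughness and kerf width: smaller-the-better).
# The data below are made-up placeholders, not the paper's experiments.
responses = np.array([
    [12.1, 2.8, 0.32],
    [15.4, 3.1, 0.35],
    [10.7, 2.4, 0.30],
    [14.2, 2.9, 0.33],
])
larger_better = [True, False, False]

def normalize(col, larger):
    lo, hi = col.min(), col.max()
    return (col - lo) / (hi - lo) if larger else (hi - col) / (hi - lo)

norm = np.column_stack([normalize(responses[:, j], larger_better[j])
                        for j in range(responses.shape[1])])

zeta = 0.5                        # distinguishing coefficient
delta = 1.0 - norm                # deviation from the ideal sequence (all ones)
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grg = grc.mean(axis=1)            # grey relational grade per experimental run

print("Grey relational grades:", np.round(grg, 3))
print("Best run (to be cross-checked with ANOVA):", int(grg.argmax()) + 1)
```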

  10. The Case of Value Based Communication—Epistemological and Methodological Reflections from a System Theoretical Perspective

    Directory of Open Access Journals (Sweden)

    Victoria von Groddeck

    2010-09-01

    The aim of this paper is to reflect the epistemological and methodological aspects of an empirical research study which analyzes the phenomenon of increased value communication within business organizations from a system theoretical perspective in the tradition of Niklas LUHMANN. Drawing on the theoretical term of observation it shows how a research perspective can be developed which opens up the scope for an empirical analysis of communication practices. This analysis focuses on the reconstruction of these practices by first understanding how these practices stabilize themselves and second by contrasting different practices to educe an understanding of different forms of observation of the relevant phenomenon and of the functions of these forms. Thus, this approach combines system theoretical epistemology, analytical research strategies, such as form and functional analysis, and qualitative research methods, such as narrative interviews, participant observation and document analysis. URN: urn:nbn:de:0114-fqs1003177

  11. [Customer satisfaction in home care: methodological issues based on a survey carried out in Lazio].

    Science.gov (United States)

    Pasquarella, A; Marceca, M; Casagrande, S; Gentile, D; Zeppilli, D; Buonaiuto, N; Cozzolino, M; Guasticchi, G

    2007-01-01

    Home care customer satisfaction has, until now, rarely been evaluated. After reviewing the main Italian regional surveys on this issue, the article presents a customer satisfaction survey carried out in the district of Civitavecchia (Local Health Unit 'Rome F'), Lazio, regarding 30 home care beneficiaries. The methodological aspects emerging from the survey focus on: the advantages and disadvantages of quantitative and qualitative approaches (possibly combined with each other); the main eligibility criteria for the people selected for interviewing, whether patients or caregivers; and the conditions that maximize answer reliability, including the training of interviewers. The authors highlight the opportunity of using this kind of survey, integrated with other tools within a systemic vision, to promote management changes in response to the problems identified, aimed at total quality management.

  12. An interpretable fuzzy rule-based classification methodology for medical diagnosis.

    Science.gov (United States)

    Gadaras, Ioannis; Mikhailov, Ludmil

    2009-09-01

    The aim of this paper is to present a novel fuzzy classification framework for the automatic extraction of fuzzy rules from labeled numerical data, for the development of efficient medical diagnosis systems. The proposed methodology focuses on the accuracy and interpretability of the generated knowledge that is produced by an iterative, flexible and meaningful input partitioning mechanism. The generated hierarchical fuzzy rule structure is composed of linguistic, multiple-consequent fuzzy rules that considerably affect the model comprehensibility. The performance of the proposed method is tested on three medical pattern classification problems and the obtained results are compared against other existing methods. It is shown that the proposed variable input partitioning leads to a flexible decision-making framework and fairly accurate results with a small number of rules and a simple, fast and robust training process.

  13. Improved Methodology of Weather Window Prediction for Offshore Operations Based on Probabilities of Operation Failure

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    2017-01-01

    The offshore wind industry is building and planning new wind farms further offshore due to increasing demand on sustainable energy production and already occupied prime resource locations closer to shore. Costs of operation and maintenance, transport and installation of offshore wind turbines...... already contribute significantly to the cost of produced electricity and will continue to increase, due to moving further offshore, if the current techniques of predicting offshore wind farm accessibility are to stay the same. The majority of offshore operations are carried out by specialized ships...... window estimates would result in better wind farm accessibility predictions and, as a consequence, potentially reduce the cost of offshore wind energy. This paper presents an updated methodology of weather window prediction that uses physical offshore vessel and equipment responses to establish...

  14. Delirium diagnosis methodology used in research: a survey-based study.

    Science.gov (United States)

    Neufeld, Karin J; Nelliot, Archana; Inouye, Sharon K; Ely, E Wesley; Bienvenu, O Joseph; Lee, Hochang Benjamin; Needham, Dale M

    2014-12-01

    To describe methodology used to diagnose delirium in research studies evaluating delirium detection tools. The authors used a survey to address reference rater methodology for delirium diagnosis, including rater characteristics, sources of patient information, and diagnostic process, completed via web or telephone interview according to respondent preference. Participants were authors of 39 studies included in three recent systematic reviews of delirium detection instruments in hospitalized patients. Authors from 85% (N = 33) of the 39 eligible studies responded to the survey. The median number of raters per study was 2.5 (interquartile range: 2-3); 79% were physicians. The raters' median duration of clinical experience with delirium diagnosis was 7 years (interquartile range: 4-10), with 5% having no prior clinical experience. Inter-rater reliability was evaluated in 70% of studies. Cognitive tests and delirium detection tools were used in the delirium reference rating process in 61% (N = 21) and 45% (N = 15) of studies, respectively, with 33% (N = 11) using both and 27% (N = 9) using neither. When patients were too drowsy or declined to participate in delirium evaluation, 70% of studies (N = 23) used all available information for delirium diagnosis, whereas 15% excluded such patients. Significant variability exists in reference standard methods for delirium diagnosis in published research. Increasing standardization by documenting inter-rater reliability, using standardized cognitive and delirium detection tools, incorporating diagnostic expert consensus panels, and using all available information in patients declining or unable to participate with formal testing may help advance delirium research by increasing consistency of case detection and improving generalizability of research results. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  15. Methodology for Assessing the Quality of Agribusiness Activity Based on the Environmentally Responsible Approach

    Directory of Open Access Journals (Sweden)

    Anna Antonovna Anfinogentova

    2017-06-01

    The article is devoted to the research and development of quality evaluation methods for agro-industrial enterprise activity in the regional economy with the use of an ecological approach. The hypothesis of the study is that the activity of economic entities (including agribusiness) must be assessed not only in the context of economic efficiency and effectiveness, but also in the context of environmental ethics and environmental aggression. As the initial data, we have used the indicators of economic statistics of Russian agrarian-oriented regions, as well as the data received from management reporting on a sample of enterprises of three regions (the Belgorod and Moscow regions, Krasnodar Territory). The article offers an economic and mathematical approach for measuring the level of environmental responsibility of agro-industrial enterprises based on the basic formula of the Mandelbrot set and the Hurst statistical indicator. Our scientific contribution is the development of a modified methodology for assessing the quality of the activity of agro-industrial enterprises using a parameter characterizing the level of environmental ethics and environmental aggression of these entities. The main result of the study is the approbation of the method, which has shown its practical applicability and relative coherence with certain indicators of regional ecological statistics. The proposed method is characterized by the integration of different mathematical approaches and serves as an adaptive assessment tool that can be used to assess the quality of the activity of both agro-industrial enterprises and enterprises of other industries and fields of the economy. In further work, the authors plan to develop methodological approaches to the assessment of the quality of agro-industrial products. At the same time, the main attention will be paid to the ecological and social components of quality.
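
    The Hurst indicator mentioned above can be estimated, for instance, by rescaled-range (R/S) analysis; the sketch below is a generic estimator applied to synthetic data and is not the authors' modified Mandelbrot-based methodology.

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by simple rescaled-range (R/S) analysis."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    sizes, rs_values = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()               # range
            s = chunk.std(ddof=1)                   # standard deviation
            if s > 0:
                rs_per_chunk.append(r / s)
        if rs_per_chunk:
            sizes.append(size)
            rs_values.append(np.mean(rs_per_chunk))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)  # H is the log-log slope
    return slope

rng = np.random.default_rng(0)
print("H of white noise (expected near 0.5):", round(hurst_rs(rng.normal(size=2048)), 2))
```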

  16. A methodology based on openEHR archetypes and software agents for developing e-health applications reusing legacy systems.

    Science.gov (United States)

    Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco

    2016-10-01

    In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at anytime and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to the systems. This methodology was applied in the design of an agent-based system, which was used in a realistic healthcare scenario in which a medical staff meeting to prepare a cardiac surgery has been supported. We conducted experiments with this system in a distributed environment composed by three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil). We evaluated this system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of this system in particular, and with respect to task delegation to software agents in general. The case study also showed that a software agent-based interface and a tools-based alternative must be provided to the end users, which should allow them to perform the tasks themselves or to delegate these tasks to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers. The proposed methodology allows designers to build communication systems for the message exchange among heterogeneous healthcare systems, and to shift from systems that rely on informal communication of actors to

  17. ICP-MS/MS-Based Ionomics: A Validated Methodology to Investigate the Biological Variability of the Human Ionome.

    Science.gov (United States)

    Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge

    2017-05-05

    We here describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of minimal quantity of serum samples. We report the outcomes of the validation procedure including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze human serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis revealing gender and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper to zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status including ceruloplasmin, ferritin and inorganic phosphate. Our data supports a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "Omics" approaches to study molecular relationships between the exposome and the ionome in relation with nutrition and health.
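
    As a minimal sketch of the data-mining step described above (principal component analysis of a subjects-by-elements concentration matrix), the example below uses scikit-learn on synthetic data; the matrix dimensions and preprocessing choices are assumptions, not the study's cohort or pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hedged sketch of the ionome data-mining step: PCA on a subjects x elements
# concentration matrix. The data here are synthetic, not the study's cohort.
rng = np.random.default_rng(1)
n_subjects, n_elements = 120, 29
concentrations = rng.lognormal(mean=0.0, sigma=0.3, size=(n_subjects, n_elements))

X = StandardScaler().fit_transform(np.log(concentrations))  # log-transform, then autoscale
pca = PCA(n_components=5)
scores = pca.fit_transform(X)

print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
# The scores could then be inspected for gender- or age-related structure,
# e.g. by plotting PC1 vs PC2 coloured by the metadata of interest.
```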

  18. Comparative Analysis of VERA Depletion Problems

    International Nuclear Information System (INIS)

    Park, Jinsu; Kim, Wonkyeong; Choi, Sooyoung; Lee, Hyunsuk; Lee, Deokjung

    2016-01-01

    Each code has its own solver for depletion, which can produce different depletion calculation results. In order to produce reference solutions for depletion calculation comparison, sensitivity studies should first be performed for each depletion solver. The sensitivity tests for burnup interval, number of depletion zones, and recoverable energy per fission (Q-value) were performed in this paper. For the comparison of depletion calculation results, usually the multiplication factors are compared as a function of burnup. In this study, new comparison methods have been introduced by using the number densities of isotopes or elements, and the cumulative flux instead of burnup. In this paper, optimum depletion calculation options are determined through the sensitivity study of the burnup intervals and the number of depletion intra-zones. Because depletion using the CRAM solver performs well for large burnup intervals, a smaller number of burnup steps can be used to produce converged solutions. It was noted that the depletion intra-zone sensitivity is only pin-type dependent. One and ten depletion intra-zones are required for the normal UO2 pin and the gadolinia rod, respectively, to obtain the reference solutions. When the optimized depletion calculation options are used, the differences in Q-values are found to be a main cause of the differences in solutions. In this paper, new comparison methods were introduced for consistent code-to-code comparisons even when different kappa libraries were used in the depletion calculations.
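
    One of the comparison ideas above, comparing nuclide number densities on a common cumulative-flux grid rather than at matching burnup points, can be sketched as follows; the arrays are illustrative placeholders standing in for two codes' outputs, not VERA benchmark results.

```python
import numpy as np

# Hedged sketch: compare one nuclide's number density from two depletion codes
# as a function of cumulative flux instead of burnup. All arrays below are
# made-up placeholders for code outputs.
flux_a = np.array([0.0, 1.0e21, 2.0e21, 4.0e21, 8.0e21])    # n/cm^2, code A steps
nd_a   = np.array([2.2e-4, 1.9e-4, 1.6e-4, 1.2e-4, 7.0e-5])  # atoms/(barn*cm)

flux_b = np.array([0.0, 1.5e21, 3.0e21, 6.0e21, 8.0e21])    # code B, different steps
nd_b   = np.array([2.2e-4, 1.75e-4, 1.45e-4, 9.5e-5, 7.3e-5])

grid = np.linspace(0.0, 8.0e21, 9)                  # common cumulative-flux grid
nd_a_i = np.interp(grid, flux_a, nd_a)
nd_b_i = np.interp(grid, flux_b, nd_b)
rel_diff = 100.0 * (nd_b_i - nd_a_i) / nd_a_i       # percent difference vs code A

for phi, d in zip(grid, rel_diff):
    print(f"cumulative flux {phi:.2e} n/cm^2  diff {d:+6.2f} %")
```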

  19. Comparative Analysis of VERA Depletion Problems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinsu; Kim, Wonkyeong; Choi, Sooyoung; Lee, Hyunsuk; Lee, Deokjung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2016-10-15

    Each code has its own solver for depletion, which can produce different depletion calculation results. In order to produce reference solutions for depletion calculation comparison, sensitivity studies should first be performed for each depletion solver. The sensitivity tests for burnup interval, number of depletion zones, and recoverable energy per fission (Q-value) were performed in this paper. For the comparison of depletion calculation results, usually the multiplication factors are compared as a function of burnup. In this study, new comparison methods have been introduced by using the number densities of isotopes or elements, and the cumulative flux instead of burnup. In this paper, optimum depletion calculation options are determined through the sensitivity study of the burnup intervals and the number of depletion intra-zones. Because depletion using the CRAM solver performs well for large burnup intervals, a smaller number of burnup steps can be used to produce converged solutions. It was noted that the depletion intra-zone sensitivity is only pin-type dependent. One and ten depletion intra-zones are required for the normal UO2 pin and the gadolinia rod, respectively, to obtain the reference solutions. When the optimized depletion calculation options are used, the differences in Q-values are found to be a main cause of the differences in solutions. In this paper, new comparison methods were introduced for consistent code-to-code comparisons even when different kappa libraries were used in the depletion calculations.

  20. Methodology for developing evidence-based clinical imaging guidelines: Joint recommendations by Korea society of radiology and national evidence-based healthcare collaborating agency

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sol Ji; Jo, Ae Jeong; Choi, Jin A [Div. for Healthcare Technology Assessment Research, National Evidence-Based Healthcare Collaborating Agency, Seoul (Korea, Republic of); and others

    2017-01-15

    This paper is a summary of the methodology, including the protocol, used to develop evidence-based clinical imaging guidelines (CIGs) in Korea, led by the Korean Society of Radiology and the National Evidence-based Healthcare Collaborating Agency. This is the first protocol to reflect the process of developing diagnostic guidelines in Korea. The development protocol is largely divided into the following sections: set-up, process of adaptation, and finalization. The working group is composed of clinical imaging experts, and the developmental committee is composed of multidisciplinary experts to validate the methodology. The Korean CIGs will continue to develop based on this protocol, and these guidelines will act as decision-support tools for clinicians as well as help reduce medical radiation exposure.

  1. Methodology for qualification of wood-based ash according to REACH - prestudy

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeblom, Rolf (Tekedo AB, Nykoeping (Sweden)); Tivegaard, Anna-Maria (SSAB Merox AB, Oxeloesund (Sweden))

    2010-02-15

    The new European Union framework directive on waste is to be implemented during the year 2010. According to this directive, much of what today is regarded as waste will instead be assessed as by-products and in many cases fall under the new European Union regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). REACH applies in conjunction with the new European Union regulation CLP (Classification, Labelling and Packaging of substances and mixtures). There are introductory periods for both of these regulations, and in the case of CLP this regards the transition from the present and previous rules under the dangerous substances and dangerous preparations directives (DSD and DPD, respectively). Similarly, the new framework directive on waste supersedes the previous directive and some other statements. There is a connection between the directives on waste and the rules for classification and labelling in that the classification of waste (in the categories hazardous and non-hazardous) builds on (but is not identical to) the rules for labelling. Similarly, the national Swedish rules for acceptance of recycled material (waste) for use in geotechnical constructions relate to the provisions in REACH on assessment of chemical safety in that both request that the risk be assessed to be small, and that the same or similar methodologies can be applied to verify this. There is a 'reference alternative' in REACH that implies substantial testing prior to registration. Registration is the key to use of a substance, whether a substance is used as such, in a mixture, or is to be released from an article. However, REACH as well as CLP contain a number of provisions for using literature data, data on similar chemicals, etc., in order to avoid unnecessary testing. This especially applies to testing on humans and vertebrate animals. Vaermeforsk, through its Programme on Environmentally Friendly Use of Non-Coal Ashes, has developed methodologies and

  2. A methodology for obtaining the control rods patterns in a BWR using systems based on ants colonies

    International Nuclear Information System (INIS)

    Ortiz S, J.J.; Requena R, I.

    2003-01-01

    In this work, the AZCATL-PBC system, based on an ant colony technique for the search of control rod patterns for the reactors of the Laguna Verde Nuclear Power Plant (CNLV), is presented. The technique was applied to a transition cycle and to an equilibrium cycle. For both cycles, the k-eff values obtained with a Haling calculation were compared with those of the control rod pattern proposed by AZCATL-PBC at a fixed burnup. It was found that the methodology is able to extend the cycle length with respect to the Haling prediction while keeping the reactor safe. (Author)
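
    A minimal ant-colony sketch of the kind of search described above is given below; the notch positions, pheromone parameters, and the surrogate fitness (standing in for the core-simulator evaluation of k-eff and thermal limits that the real AZCATL-PBC system would perform) are all illustrative assumptions.

```python
import random

# Hedged sketch of an ant-colony search over control rod patterns.
# Each of n_rods control rods takes one of the discrete positions below;
# 'fitness' is a made-up surrogate for the neutronics evaluation.
positions = [0, 4, 8, 12, 24, 48]           # hypothetical notch positions
n_rods, n_ants, n_iter, rho = 5, 20, 50, 0.1
target = [8, 12, 4, 24, 8]                  # hidden optimum for the toy fitness

def fitness(pattern):
    return -sum((p - t) ** 2 for p, t in zip(pattern, target))

pheromone = [[1.0] * len(positions) for _ in range(n_rods)]   # per rod, per position
best, best_fit = None, float("-inf")

for _ in range(n_iter):
    for _ in range(n_ants):
        pattern = [random.choices(positions, weights=pheromone[r])[0]
                   for r in range(n_rods)]
        f = fitness(pattern)
        if f > best_fit:
            best, best_fit = pattern, f
    # evaporate, then reinforce the choices of the best-so-far pattern
    for r in range(n_rods):
        pheromone[r] = [(1.0 - rho) * tau for tau in pheromone[r]]
        pheromone[r][positions.index(best[r])] += 1.0

print("best pattern found:", best, "fitness:", best_fit)
```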

  3. Prevalence of cluster headache in the Republic of Georgia: results of a population-based study and methodological considerations

    DEFF Research Database (Denmark)

    Katsarava, Z; Dzagnidze, A; Kukava, M

    2009-01-01

    We present a study of the general-population prevalence of cluster headache in the Republic of Georgia and discuss the advantages and challenges of different methodological approaches. In a community-based survey, specially trained medical residents visited 500 adjacent households in the capital...... city, Tbilisi, and 300 households in the eastern rural area of Kakheti. They interviewed all (n = 1145) biologically unrelated adult occupants using a previously validated questionnaire. The household response rates were 92% in Tbilisi and 100% in Kakheti. The survey identified 32 persons...... approach, which has an obvious advantage of high-quality data collection, but is very demanding of manpower and time....

  4. Stability of Circulating Blood-Based MicroRNAs - Pre-Analytic Methodological Considerations

    DEFF Research Database (Denmark)

    Glinge, Charlotte; Clauss, Sebastian; Boddum, Kim

    2017-01-01

    BACKGROUND AND AIM: The potential of microRNAs (miRNA) as non-invasive diagnostic, prognostic, and predictive biomarkers, as well as therapeutic targets, has recently been recognized. Previous studies have highlighted the importance of consistency in the methodology used, but to our knowledge...... was evaluated by measuring expression changes of miR-1, miR-21 and miR-29b at different conditions: varying processing time of whole blood (up to 72 hours (h)), long-term storage (9 months at -80°C), physical disturbance (1 and 8 h), as well as in a series of freeze/thaw cycles (1 and 4 times). RESULTS...... = 4) freeze-thaw cycles resulted in a significant reduction of miRNA concentration both in plasma and serum samples. CONCLUSION: This study highlights the importance of proper and systematic sample collection and preparation when measuring circulating miRNAs, e.g., in context of clinical trials. We...

  5. Vibrational Study and Force Field of the Citric Acid Dimer Based on the SQM Methodology

    Directory of Open Access Journals (Sweden)

    Laura Cecilia Bichara

    2011-01-01

    We have carried out a structural and vibrational theoretical study of the citric acid dimer. The Density Functional Theory (DFT) method with the B3LYP/6-31G∗ and B3LYP/6-311++G∗∗ methods has been used to study its structure and vibrational properties. Then, in order to get a good assignment of the IR and Raman spectra in the solid phase of the dimer, the best possible fit between the calculated and recorded frequencies was carried out and the force fields were scaled using the Scaled Quantum Mechanical Force Field (SQMFF) methodology. An assignment of the observed spectral features is proposed. A band of medium intensity at 1242 cm−1, together with a group of weak bands previously not assigned to the monomer, was in this case assigned to the dimer. Furthermore, the analysis of the Natural Bond Orbitals (NBOs) and the topological properties of the electronic charge density employing Bader's Atoms in Molecules theory (AIM) for the dimer were carried out to study the charge transfer interactions of the compound.

  6. OPTIMIZATION OF POTASSIUM NITRATE BASED SOLID PROPELLANT GRAINS FORMULATION USING RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Oladipupo Olaosebikan Ogunleye

    2015-08-01

    This study was designed to evaluate the effect of propellant formulation and geometry on the internal ballistic performance of solid propellant grains using core, bates, rod and tubular, and end-burn geometries. Response Surface Methodology (RSM) was used to analyze and optimize the effect of sucrose, potassium nitrate and carbon on the chamber pressure, temperature, thrust and specific impulse of the solid propellant grains through a Central Composite Design (CCD) of the experiment. An increase in potassium nitrate increased the specific impulse while an increase in sucrose and carbon decreased the specific impulse. The coefficients of determination (R2) for the models of chamber pressure, temperature, thrust and specific impulse in terms of composition and geometry were 0.9737, 0.9984, 0.9745 and 0.9589, respectively. The optimum specific impulse of 127.89 s, pressure (462201 Pa), temperature (1618.3 K) and thrust (834.83 N) were obtained using 0.584 kg of sucrose, 1.364 kg of potassium nitrate and 0.052 kg of carbon, as well as the bates geometry. There was no significant difference between the calculated and experimental ballistic properties at p < 0.05. The bates grain geometry is more efficient for minimizing the oscillatory pressure in the combustion chamber.
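
    The response-surface step described above can be sketched generically as below: fit a second-order model to central-composite-style design points and locate the optimum. The two coded factors, the synthetic "specific impulse" response, and the grid search are illustrative assumptions, not the paper's three-factor propellant data.

```python
import numpy as np

# Hedged sketch: fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to design points and find the optimum on a grid. Data are synthetic.
rng = np.random.default_rng(2)
# coded factor levels of a small face-centred central composite design (2 factors)
x1 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0, 0, 0])
true = lambda a, b: 120 + 5*a - 3*b - 4*a**2 - 2*b**2 + 1.5*a*b
y = true(x1, x2) + rng.normal(0, 0.5, size=x1.size)    # e.g. specific impulse, s

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

g1, g2 = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(-1, 1, 201))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     g1.ravel()**2, g2.ravel()**2, (g1 * g2).ravel()])
pred = G @ coef
k = pred.argmax()
print("fitted coefficients:", np.round(coef, 2))
print("predicted optimum at coded (x1, x2) =",
      (round(float(g1.ravel()[k]), 2), round(float(g2.ravel()[k]), 2)))
```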

  7. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega

    2017-01-01

    A dynamic model to analyze the complexity associated with manufacturing systems and to improve the performance of the process through the Six Sigma philosophy is proposed. The research focuses on the implementation of the system dynamics tool to support each of the phases of the DMAIC methodology. In the first phase, define, the problem is articulated by collecting data, selecting the variables, and representing them in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and a Forrester diagram is built to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was done in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, allowing the validation of the proposed approach.

  8. Study on an ISO 15926 based data modeling methodology for nuclear power industry

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, Yang Ho; Park, Byeong Ho; Park, Seong Chan; Kim, Eun Kee [KEPCO E-C, Yongin (Korea, Republic of)

    2014-10-15

    The scope is therefore data integration and data to support the whole life of a plant. This representation is specified by a generic, conceptual Data Model (DM) that is independent of any particular application, but that is able to record data from the applications used in plant design, fabrication and operation. The data model is designed to be used in conjunction with Reference Data (RD): standard instances of the DM that represent information common to a number of users, plants, or both. This paper introduces a high level description of the structure of ISO 15926 and how this can be adapted to the nuclear power plant industry in particular. This paper introduces the ISO 15926 methodology and how to extend the existing RDL for the nuclear power industry. As the ISO 15926 representation is independent of applications, interfaces to existing or future applications have to be developed. Such interfaces are provided by Templates that take input from external sources and 'lift' it into an ISO 15926 repository, and/or 'lower' the data into other applications. This is a similar process to the process defined by W3C. Data exchange can be done using e.g. XML messages, but the modelling is independent of the technology used for the exchange.

  9. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, industrial use as well as use for public and municipal authorities and street lightning) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We both visualize the results of the analysis and we perform cluster and outlier analysis using the Anselin local Moran's I statistic as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
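
    As a minimal illustration of the cluster/outlier step named above, the sketch below computes Anselin's local Moran's I with plain NumPy for a handful of synthetic areas; the demand values and the neighbour matrix are assumptions, not the Greek prefecture data, and a full analysis would also include significance testing (e.g. by permutation) and the Getis-Ord Gi* statistic.

```python
import numpy as np

# Hedged sketch: local Moran's I for synthetic "prefectures".
# demand: electricity demand per area; W: a made-up symmetric neighbour matrix.
demand = np.array([10.0, 12.0, 11.0, 30.0, 29.0, 31.0, 9.0, 10.5])
W = np.array([
    [0, 1, 1, 0, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0, 0, 1],
    [1, 1, 0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 1, 1, 0, 0],
    [0, 0, 0, 1, 0, 1, 0, 0],
    [0, 0, 0, 1, 1, 0, 0, 0],
    [1, 0, 0, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0, 0, 1, 0],
], dtype=float)
W = W / W.sum(axis=1, keepdims=True)          # row-standardize the weights

z = demand - demand.mean()
m2 = (z ** 2).sum() / len(z)
local_i = (z / m2) * (W @ z)                  # I_i = (z_i / m2) * sum_j w_ij z_j

for idx, value in enumerate(local_i):
    kind = "similar to neighbours (cluster)" if value > 0 else "dissimilar (outlier)"
    print(f"area {idx}: local I = {value:+.2f}  {kind}")
```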

  10. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    Science.gov (United States)

    Kautzmann, Frank N., III

    1988-01-01

    Expert Systems which support knowledge representation by qualitative modeling techniques experience problems, when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the process of systems engineering, design and configuration, and diagnosis and fault management. A study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert system. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  11. Visual methodologies and participatory action research: Performing women's community-based health promotion in post-Katrina New Orleans.

    Science.gov (United States)

    Lykes, M Brinton; Scheib, Holly

    2016-01-01

    Recovery from disaster and displacement involves multiple challenges including accompanying survivors, documenting effects, and rethreading community. This paper demonstrates how African-American and Latina community health promoters and white university-based researchers engaged visual methodologies and participatory action research (photoPAR) as resources in cross-community praxis in the wake of Hurricane Katrina and the flooding of New Orleans. Visual techniques, including but not limited to photonarratives, facilitated the health promoters': (1) care for themselves and each other as survivors of and responders to the post-disaster context; (2) critical interrogation of New Orleans' entrenched pre- and post-Katrina structural racism as contributing to the racialised effects of and responses to Katrina; and (3) meaning-making and performances of women's community-based, cross-community health promotion within this post-disaster context. This feminist antiracist participatory action research project demonstrates how visual methodologies contributed to the co-researchers' cross-community self- and other caring, critical bifocality, and collaborative construction of a contextually and culturally responsive model for women's community-based health promotion post 'unnatural disaster'. Selected limitations as well as the potential for future cross-community antiracist feminist photoPAR in post-disaster contexts are discussed.

  12. Methodological specifics of the study of micro HPP based on internal combustion engines with air cooling and cogeneration

    Science.gov (United States)

    Shchinnikov, P. A.; Tomilov, V. G.; Sinelnikov, D. S.

    2017-01-01

    The article considers some aspects of the research methodology for micro heat power plants based on internal combustion engines with air cooling and cogeneration, drawing on energy balance equations and the laws of heat transfer. The research is conducted for such a setup based on a Hitachi internal combustion engine with 2.4 kW capacity. It has shown the efficiency of using cogeneration in the form of a useful heat flow from the air cooling the cylinder head, with its further heating by utilizing the heat of the flue gases in an additional plate heat exchanger. It has been shown that cogeneration can reduce fuel costs by a factor of 3-10 compared with heat guns, depending on the duration of the setup's use.

  13. Brain-Based Learning and Classroom Practice: A Study Investigating Instructional Methodologies of Urban School Teachers

    Science.gov (United States)

    Morris, Lajuana Trezette

    2010-01-01

    The purpose of this study was to examine the implementation of brain-based instructional strategies by teachers serving at Title I elementary, middle, and high schools within the Memphis City School District. This study was designed to determine: (a) the extent to which Title I teachers applied brain-based strategies, (b) the differences in…

  14. Technology-Enhanced Problem-Based Learning Methodology in Geographically Dispersed Learners of Tshwane University of Technology

    Directory of Open Access Journals (Sweden)

    Sibitse M. Tlhapane

    2010-03-01

    Improving teaching and learning methodologies is not just a wish but rather a goal striven for by most educational institutions globally. To attain this, the Adelaide Tambo School of Nursing Science implemented a Technology-enhanced Problem-Based Learning methodology in the programme B Tech Occupational Nursing in 2006. This is a two-year post-basic nursing programme. The students are geographically dispersed and the curriculum design is the typically student-centred outcomes-based education. The research question posed by this paper is: How does technology-enhanced problem-based learning enhance student-centred learning, thinking skills, social skills and social space for learners? To answer the above question, a case study with both qualitative and quantitative data was utilised. The participants consisted of all students registered for the subject Occupational Health level 4. The sample group was chosen from willing participants from the Pretoria, eMalahleni and Polokwane learning sites, using the snowball method. This method was seen as appropriate due to the timing of the study. Data were collected using a questionnaire with both open- and closed-ended questions. An analysis of the students' end-of-year examination was also done, including a comparison of performances by students on technology-enhanced problem-based learning and those on problem-based learning only. The findings revealed that with Technology-enhanced Problem-Based Learning (PBL), students' critical thinking, problem solving, and social skills improved and that social space was enhanced. This was supported by improved grades of students on Technology-enhanced PBL as compared to those on PBL only.

  15. Towards a cognitive robotics methodology for reward-based decision-making: dynamical systems modelling of the Iowa Gambling Task

    Science.gov (United States)

    Lowe, Robert; Ziemke, Tom

    2010-09-01

    The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.
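
    As a minimal illustration of the reward-based learning ideas discussed above (not the SMH or reversal-learning models themselves), the sketch below runs a simple delta-rule learner with softmax choice on an Iowa-Gambling-Task-like four-deck payoff scheme; the payoffs, learning rate, and temperature are illustrative assumptions.

```python
import math
import random

# Hedged sketch: delta-rule deck valuation with softmax choice on an
# IGT-like payoff scheme. All numbers are illustrative assumptions.
random.seed(3)

def payoff(deck):
    # Decks 0-1: large wins, occasional large losses (bad long-run value).
    # Decks 2-3: small wins, small losses (good long-run value).
    if deck < 2:
        return 100 - (1250 if random.random() < 0.1 else 0)
    return 50 - (50 if random.random() < 0.5 else 0)

alpha, temperature, n_trials = 0.1, 0.02, 200
value = [0.0, 0.0, 0.0, 0.0]
choices = []

for _ in range(n_trials):
    weights = [math.exp(temperature * v) for v in value]
    deck = random.choices(range(4), weights=weights)[0]
    reward = payoff(deck)
    value[deck] += alpha * (reward - value[deck])   # delta-rule update
    choices.append(deck)

good_share = sum(c >= 2 for c in choices[-50:]) / 50
print("learned deck values:", [round(v, 1) for v in value])
print("share of 'good' deck choices in last 50 trials:", good_share)
```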

  16. Methodology for heritage conservation in Belgium based on multi-temporal interferometry

    Science.gov (United States)

    Bejarano-Urrego, L.; Verstrynge, E.; Shimoni, M.; Lopez, J.; Walstra, J.; Declercq, P.-Y.; Derauw, D.; Hayen, R.; Van Balen, K.

    2017-09-01

    Soil differential settlements that cause structural damage to heritage buildings are precipitating cultural and economic value losses. Adequate damage assessment as well as protection and preservation of the built patrimony are priorities at national and local levels, so they require advanced integration and analysis of environmental, architectural and historical parameters. The GEPATAR project (GEotechnical and Patrimonial Archives Toolbox for ARchitectural conservation in Belgium) aims to create an online interactive geo-information tool that allows the user to view and to be informed about the Belgian heritage buildings at risk due to differential soil settlements. Multi-temporal interferometry techniques (MTI) have been proven to be a powerful technique for analyzing earth surface deformation patterns through time series of Synthetic Aperture Radar (SAR) images. These techniques allow to measure ground movements over wide areas at high precision and relatively low cost. In this project, Persistent Scatterer Synthetic Aperture Radar Interferometry (PS-InSAR) and Multidimensional Small Baseline Subsets (MSBAS) are used to measure and monitor the temporal evolution of surface deformations across Belgium. This information is integrated with the Belgian heritage data by means of an interactive toolbox in a GIS environment in order to identify the level of risk. At country scale, the toolbox includes ground deformation hazard maps, geological information, location of patrimony buildings and land use; while at local scale, it includes settlement rates, photographic and historical surveys as well as architectural and geotechnical information. Some case studies are investigated by means of on-site monitoring techniques and stability analysis to evaluate the applied approaches. This paper presents a description of the methodology being implemented in the project together with the case study of the Saint Vincent's church which is located on a former colliery zone. For

  17. New insights into survival trend analyses in cancer population-based studies: the SUDCAN methodology.

    Science.gov (United States)

    Uhry, Zoé; Bossard, Nadine; Remontet, Laurent; Iwaz, Jean; Roche, Laurent

    2017-01-01

    The main objective of the SUDCAN study was to compare, for 15 cancer sites, the trends in net survival and excess mortality rates from cancer 5 years after diagnosis between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland). The data were extracted from the EUROCARE-5 database. The study period ranged from 6 (Portugal, 2000-2005) to 18 years (Switzerland, 1989-2007). Trend analyses were carried out separately for each country and cancer site; the number of cases ranged from 1500 to 104 000 cases. We developed an original flexible excess rate modelling strategy that accounts for the continuous effects of age, year of diagnosis, time since diagnosis and their interactions. Nineteen models were constructed; they differed in the modelling of the effect of the year of diagnosis in terms of linearity, proportionality and interaction with age. The final model was chosen according to the Akaike Information Criterion. The fit was assessed graphically by comparing model estimates versus nonparametric (Pohar-Perme) net survival estimates. Out of the 90 analyses carried out, the effect of the year of diagnosis on the excess mortality rate depended on age in 61 and was nonproportional in 64; it was nonlinear in 27 out of the 75 analyses where this effect was considered. The model fit was overall satisfactory. We analysed successfully 15 cancer sites in six countries. The refined methodology proved necessary for detailed trend analyses. It is hoped that three-dimensional parametric modelling will be used more widely in net survival trend studies as it has major advantages over stratified analyses.

  18. An accelerometry-based methodology for assessment of real-world bilateral upper extremity activity.

    Directory of Open Access Journals (Sweden)

    Ryan R Bailey

    The use of both upper extremities (UE) is necessary for the completion of many everyday tasks. Few clinical assessments measure the abilities of the UEs to work together; rather, they assess unilateral function and compare it between affected and unaffected UEs. Furthermore, clinical assessments are unable to measure function that occurs in the real world, outside the clinic. This study examines the validity of an innovative approach to assess real-world bilateral UE activity using accelerometry. Seventy-four neurologically intact adults completed ten tasks (donning/doffing shoes, grooming, stacking boxes, cutting playdough, folding towels, writing, unilateral sorting, bilateral sorting, unilateral typing, and bilateral typing) while wearing accelerometers on both wrists. Two variables, the Bilateral Magnitude and Magnitude Ratio, were derived from accelerometry data to distinguish between high- and low-intensity tasks, and between bilateral and unilateral tasks. Estimated energy expenditure and time spent in simultaneous UE activity for each task were also calculated. The Bilateral Magnitude distinguished between high- and low-intensity tasks, and the Magnitude Ratio distinguished between unilateral and bilateral UE tasks. The Bilateral Magnitude was strongly correlated with estimated energy expenditure (ρ = 0.74, p<0.02), and the Magnitude Ratio was strongly correlated with time spent in simultaneous UE activity (ρ = 0.93, p<0.01) across tasks. These results demonstrate the face validity and construct validity of this methodology for quantifying bilateral UE activity during the performance of everyday tasks in a laboratory setting; it can now be used to assess bilateral UE activity in real-world environments.
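
    The two derived variables named above are commonly computed from per-second wrist activity counts roughly as in the sketch below (the sum of the two wrists' magnitudes, and the natural log of their ratio); the exact definitions should be checked against the paper, and the counts here are synthetic.

```python
import numpy as np

# Hedged sketch of the two accelerometry-derived variables, computed per second
# from wrist activity counts. Definitions are one plausible reading of the
# abstract; the counts below are synthetic.
rng = np.random.default_rng(4)
dominant = rng.poisson(lam=80, size=60).astype(float)      # counts/s, dominant wrist
nondominant = rng.poisson(lam=60, size=60).astype(float)   # counts/s, non-dominant wrist

eps = 1e-6  # avoid log(0) and division by zero during idle seconds
bilateral_magnitude = dominant + nondominant
magnitude_ratio = np.log((nondominant + eps) / (dominant + eps))

print("median Bilateral Magnitude:", float(np.median(bilateral_magnitude)))
print("median Magnitude Ratio   :", round(float(np.median(magnitude_ratio)), 2))
# Ratio near 0 suggests roughly bilateral use; strongly negative values
# suggest mostly dominant-arm use.
```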

  19. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs.

    Science.gov (United States)

    McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F

    2015-01-01

    Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
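
    The knowledge-base construction step described above can be sketched as follows; "link frequency" is taken here as the co-occurrence count of a problem and a medication across encounters and "link ratio" as that count divided by the medication's total count, which is one plausible reading of the statistics named in the abstract, applied to made-up data with an illustrative threshold.

```python
from collections import Counter
from itertools import product

# Hedged sketch: link frequency / link ratio over co-entered problems and
# medications, then thresholding to form the knowledge base. Data are made up.
encounters = [
    {"problems": {"hypertension"}, "medications": {"lisinopril"}},
    {"problems": {"hypertension", "diabetes"}, "medications": {"lisinopril", "metformin"}},
    {"problems": {"diabetes"}, "medications": {"metformin"}},
    {"problems": {"hypertension"}, "medications": {"metformin"}},
]

link_freq = Counter()
med_freq = Counter()
for enc in encounters:
    for med in enc["medications"]:
        med_freq[med] += 1
    for prob, med in product(enc["problems"], enc["medications"]):
        link_freq[(prob, med)] += 1

threshold = 0.6   # illustrative cutoff; the paper derives its cutoff from clinician review
knowledge_base = {pair: link_freq[pair] / med_freq[pair[1]]
                  for pair in link_freq
                  if link_freq[pair] / med_freq[pair[1]] >= threshold}

for (prob, med), ratio in sorted(knowledge_base.items()):
    print(f"{med} -> {prob}: link ratio {ratio:.2f}")
```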

  20. A gradient-based method for segmenting FDG-PET images: methodology and validation

    International Nuclear Information System (INIS)

    Geets, Xavier; Lee, John A.; Gregoire, Vincent; Bol, Anne; Lonneux, Max

    2007-01-01

    A new gradient-based method for segmenting FDG-PET images is described and validated. The proposed method relies on the watershed transform and hierarchical cluster analysis. To allow a better estimation of the gradient intensity, iteratively reconstructed images were first denoised and deblurred with an edge-preserving filter and a constrained iterative deconvolution algorithm. Validation was first performed on computer-generated 3D phantoms containing spheres, then on a real cylindrical Lucite phantom containing spheres of different volumes ranging from 2.1 to 92.9 ml. Moreover, laryngeal tumours from seven patients were segmented on PET images acquired before laryngectomy by the gradient-based method and the thresholding method based on the source-to-background ratio developed by Daisne (Radiother Oncol 2003;69:247-50). For the spheres, the calculated volumes and radii were compared with the known values; for laryngeal tumours, the volumes were compared with the macroscopic specimens. Volume mismatches were also analysed. On computer-generated phantoms, the deconvolution algorithm decreased the mis-estimate of volumes and radii. For the Lucite phantom, the gradient-based method led to a slight underestimation of sphere volumes (by 10-20%), corresponding to negligible radius differences (0.5-1.1 mm); for laryngeal tumours, the segmented volumes by the gradient-based method agreed with those delineated on the macroscopic specimens, whereas the threshold-based method overestimated the true volume by 68% (p = 0.014). Lastly, macroscopic laryngeal specimens were totally encompassed by neither the threshold-based nor the gradient-based volumes. The gradient-based segmentation method applied on denoised and deblurred images proved to be more accurate than the source-to-background ratio method. (orig.)
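
    A generic version of the pipeline described above (denoise, take the gradient magnitude, then run a marker-driven watershed) can be sketched with scikit-image as below; the synthetic 2-D "uptake" image and markers are assumptions, and the paper's edge-preserving deconvolution and hierarchical cluster analysis steps are omitted.

```python
import numpy as np
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

# Hedged sketch of a gradient-based segmentation pipeline on a synthetic image.
rng = np.random.default_rng(5)
yy, xx = np.mgrid[0:128, 0:128]
image = np.exp(-((yy - 64) ** 2 + (xx - 64) ** 2) / (2 * 12.0 ** 2))  # one hot "lesion"
image += 0.05 * rng.normal(size=image.shape)                          # noise

smoothed = gaussian(image, sigma=2)   # simple denoising (the paper uses an edge-preserving filter)
gradient = sobel(smoothed)            # gradient-intensity image

markers = np.zeros_like(image, dtype=int)
markers[64, 64] = 1                   # seed inside the lesion
markers[5, 5] = 2                     # seed in the background
labels = watershed(gradient, markers)

lesion_area_px = int((labels == 1).sum())
print("segmented lesion area (pixels):", lesion_area_px)
```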

  1. Mental health community based funding: Ohio's experience in revising its funding allocation methodology.

    Science.gov (United States)

    Seiber, Eric E; Sweeney, Helen Anne; Partridge, Jamie; Dembe, Allard E; Jones, Holly

    2012-10-01

    Over the past 20 years, states have increasingly moved away from centrally financed, state-operated facilities to financing models built around community-based service delivery mechanisms. This paper identifies four important broad factors to consider when developing a funding formula to allocate state funding for community mental health services to local boards in an equitable manner, based on local community need: (1) funding factors used by other states; (2) state specific legislative requirements; (3) data availability; and (4) local variation of factors in the funding formula. These considerations are illustrated with the recent experience of Ohio using available evidence and data sources to develop a new community-based allocation formula. We discuss opportunities for implementing changes in formula based mental health funding related to Medicaid expansions for low income adults scheduled to go into effect under the new Patient Protection and Affordable Care Act.

  2. Associations - Communities - Residents. Building together a citizen-based project of renewable energies - Methodological guide

    International Nuclear Information System (INIS)

    Ramard, Dominique; Fleury, Laurianne; Peyret, Albert; Ghesquiere, Christine; Kauber, Markus; Jourdain, Pierre

    2012-11-01

    This guide first outlines the challenges and stakes of citizen-based renewable energies: the example of a necessary energy transition in Brittany, the interest of local production of renewable energies, examples in other European countries, and the emergence of a citizen-based energy movement in France. The second part presents the four main phases of such a project (diagnosis, development, construction, and exploitation), the main issues to be addressed, and the main steps of a citizen-based renewable energy project (technical, legal and financial, and citizen-related aspects during the different phases). The third part describes how to elaborate a citizen-based project: by addressing the project dimensions, by defining a legal specification, by performing a provisional business model, by choosing an appropriate legal structure, by creating a project company, and by mobilizing local actors. The last part addresses how to finance the project: by building up own funds, by asking banks for support, and by citizen participation in investment.

  3. Redesigning Acquisition Processes: A New Methodology Based on the Flow of Knowledge and Information

    National Research Council Canada - National Science Library

    Kock, Ned F; Murphy, Frederic

    2001-01-01

    Current business process redesign practices, in the defense sector as well as in business in general, are based on several assumptions inherited from Taylor's scientific management method, including...

  4. Design of Economic Evaluations of Mindfulness-Based Interventions: Ten Methodological Questions of Which to Be Mindful.

    Science.gov (United States)

    Edwards, Rhiannon Tudor; Bryning, Lucy; Crane, Rebecca

    Mindfulness-based interventions (MBIs) are being increasingly applied in a variety of settings. A growing body of evidence to support the effectiveness of these interventions exists and there are a few published cost-effectiveness studies. With limited resources available within public sectors (health care, social care, and education), it is necessary to build in concurrent economic evaluations alongside trials in order to inform service commissioning and policy. If future research studies are well-designed, they have strong potential to investigate the economic impact of MBIs. The particular challenge to the health economist is how best to capture the ways that MBIs help people adjust to or build resilience to difficult life circumstances, and to disseminate effectively to enable policy makers to judge the value of the contribution that MBIs can make within the context of the limited resourcing of public services. In anticipation of more research worldwide evaluating MBIs in various settings, this article suggests ten health economics methodological design questions that researchers may want to consider prior to conducting MBI research. These questions draw on both published standards of good methodological practice in economic evaluation of medical interventions, and on the authors' knowledge and experience of mindfulness-based practice. We argue that it is helpful to view MBIs as both complex interventions and as public health prevention initiatives. Our suggestions for well-designed economic evaluations of MBIs in health and other settings, mirror current thinking on the challenges and opportunities of public health economics.

  5. A GIS-based methodology for drought vulnerability modelling: application at the region of el Hodna, central Algeria

    Directory of Open Access Journals (Sweden)

    Meriem Boultif

    2017-06-01

    Boultif, M. and Benmessaoud, H. 2017. A GIS-based methodology for drought vulnerability modelling: application at the region of el Hodna, central Algeria. Lebanese Science Journal, 18(1): 53-72. Desert covers 80% of the Algerian territory, while the remaining area is covered by Mediterranean forests and arid-climate steppe that are characterized by severe vulnerability to different stresses such as drought, especially with the increase of nefarious human impact and the overuse of natural resources. The objective of this study is to analyse and assess drought vulnerability in the area of El Hodna in central Algeria. The methodology was based on the use of GIS tools and multi-criteria analysis (the Analytical Hierarchy Process) to develop a vulnerability mapping model. The results showed that 35.67% of the study area was very vulnerable, 32.77% was in a fragile situation, 19.72% was potentially vulnerable, and only 11.83% of the surface was not affected. The drought-vulnerability map provides a basis from which it will be possible to prevent and prepare for a drought response.
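
    The Analytical Hierarchy Process step named above can be sketched generically as below: derive criteria weights from a pairwise comparison matrix via the principal eigenvector, check consistency, and combine the weighted criteria. The four criteria and the judgement matrix are illustrative assumptions, not the study's actual factors.

```python
import numpy as np

# Hedged sketch of the AHP weighting step plus a weighted-overlay example.
criteria = ["precipitation", "vegetation cover", "soil", "land use"]
A = np.array([
    [1.0, 3.0, 5.0, 4.0],
    [1/3, 1.0, 3.0, 2.0],
    [1/5, 1/3, 1.0, 1/2],
    [1/4, 1/2, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # principal-eigenvector weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = 0.90                                    # Saaty random index for n = 4
print("weights:", dict(zip(criteria, np.round(weights, 3))))
print("consistency ratio:", round(ci / ri, 3), "(commonly accepted if < 0.10)")

# A weighted overlay of normalized criterion values then gives the
# drought-vulnerability score; here for one raster cell:
cell = np.array([0.8, 0.4, 0.6, 0.5])
print("vulnerability score for this cell:", round(float(weights @ cell), 3))
```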

  6. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
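
    As a minimal illustration of an elastic-rebound (renewal-model) conditional probability of the kind discussed above, the sketch below uses a lognormal recurrence distribution for simplicity (a Brownian passage time model is also common); the recurrence interval, aperiodicity, elapsed time, and forecast window are illustrative assumptions, not the paper's along-fault averages.

```python
import math
from scipy.stats import lognorm

# Hedged sketch: P(rupture in the next dT | quiet for t_elapsed) under a
# lognormal renewal model. All numbers are illustrative assumptions.
mean_ri = 150.0        # mean recurrence interval, years
aperiodicity = 0.5     # coefficient of variation
t_elapsed = 120.0      # years since the last event
dT = 30.0              # forecast window, years

# Lognormal parameters matching the requested mean and coefficient of variation.
sigma = math.sqrt(math.log(1.0 + aperiodicity ** 2))
mu = math.log(mean_ri) - 0.5 * sigma ** 2
dist = lognorm(s=sigma, scale=math.exp(mu))

survival_now = dist.sf(t_elapsed)
prob = (dist.sf(t_elapsed) - dist.sf(t_elapsed + dT)) / survival_now
print(f"Conditional probability of rupture in the next {dT:.0f} yr: {prob:.1%}")
```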

  7. A proposal for transmission pricing methodology in Thailand based on electricity tracing and long-run average incremental cost

    International Nuclear Information System (INIS)

    Limpasuwan, T.; Bialek, J.W.; Ongsakul, W.; Limmeechokchai, B.

    2004-01-01

    Although there is no universally accepted methodology for restructuring of the electricity supply industry, the transformations often involve separation of generation and transmission. Such separation results in a need for a transmission service charge to be levied on the system users. The National Energy Policy Office (NEPO) of Thailand has commissioned PricewaterhouseCoopers (PwC) to propose a transmission service charge that is to be used during the market reform for the transmission business unit of the Electricity Generating Authority of Thailand (EGAT). Although the PwC transmission use-of-system (TUOS) charge, based on the long-run average incremental cost (LRAIC) and average transmission loss, can satisfy the financial requirements, the charge allocations are not economically efficient since they do not provide any locational signal that could reflect the costs imposed on the system by locating a system user in a particular geographical location. This paper describes the TUOS methodology suggested by PwC and makes a comparison with a transmission pricing method based on a combination of electricity tracing and LRAIC. The results indicate that, with electricity tracing, the charge allocations are improved in terms of fairness, as the charge reflects the geographical location and system conditions.
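
    To make the tracing idea concrete, the sketch below applies the upstream-looking proportional-sharing principle to a tiny lossless example network: it builds the upstream distribution matrix from nodal through-flows and allocates each line flow to the generators that supply it. The four-bus data are invented for illustration, and the sketch omits losses and the LRAIC charging step of the paper.

```python
import numpy as np

# Invented lossless 4-bus example.
generation = np.array([200.0, 100.0, 0.0, 0.0])        # MW injected at each bus
line_flows = {(0, 2): 120.0, (0, 3): 80.0,             # MW on line (from, to)
              (1, 2): 30.0, (1, 3): 70.0}

n = len(generation)
through = generation.copy()                            # nodal through-flows
for (i, j), flow in line_flows.items():
    through[j] += flow

# Upstream distribution matrix built from the proportional-sharing principle.
A = np.eye(n)
for (i, j), flow in line_flows.items():
    A[j, i] = -flow / through[i]

inv_A = np.linalg.inv(A)

# Share of each generator in every line flow (basis for locational charges).
for (i, j), flow in line_flows.items():
    shares = (flow / through[i]) * inv_A[i, :] * generation
    print(f"line {i}->{j}: " +
          ", ".join(f"G{k}: {s:.1f} MW" for k, s in enumerate(shares) if s > 0))
```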

  8. A stale challenge to the philosophy of science: commentary on "Is psychology based on a methodological error?" by Michael Schwarz.

    Science.gov (United States)

    Ruck, Nora; Slunecko, Thomas

    2010-06-01

    In his article "Is psychology based on a methodological error?", Michael Schwarz offers, on a quite convincing empirical basis, a methodological critique of one of mainstream psychology's key test-theoretical axioms, namely that personality variables are in principle normally distributed. It is characteristic of this paper--and at first seems to be a strength of it--that the author positions his critique within a frame of philosophy of science, placing himself in the tradition of Karl Popper's critical rationalism. When scrutinizing Schwarz's arguments, however, we find his critique profound only as an immanent critique of test-theoretical axioms. We raise doubts about Schwarz's alleged 'challenge' to the philosophy of science, because the author does not seem to be in touch with the state of the art of contemporary philosophy of science. Above all, we question the universalist undercurrent that Schwarz's 'bio-psycho-social model' of human judgment boils down to. In contrast to such a position, we close our commentary with a plea for a context- and culture-sensitive philosophy of science.

  9. Uranium, depleted uranium, biological effects

    International Nuclear Information System (INIS)

    2001-01-01

    Physicists, chemists and biologists at the CEA are developing scientific programs on the properties and uses of ionizing radiation. Since the CEA was created in 1945, a great deal of research has been carried out on the properties of natural, enriched and depleted uranium in cooperation with university laboratories and CNRS. There is a great deal of available data about uranium; thousands of analyses have been published in international reviews over more than 40 years. This presentation on uranium is a very brief summary of all these studies. (author)

  10. Setting the property tax base: Some methodological issues, practice in Serbia and the results

    Directory of Open Access Journals (Sweden)

    Vasiljević Dušan

    2016-01-01

    Full Text Available Since the beginning of decentralization of responsibility for administration of the property tax in Serbia in 2006, this tax has significantly gained in importance for the budgets of local governments. While the first steps towards decentralization of the property tax were aimed at broadening the tax base, the latest amendments to the Law on Property Tax take significant steps towards a modern model of determining the tax base, which is a particularly complex and important element of administering this tax from the standpoint of tax equity and revenue generation. In this paper we discuss different approaches to determining the property tax base and their implications. We then review the current legal framework for determining the property tax base in Serbia. We find positive developments in terms of narrowing the application of the dual model of determining the tax base, which treated physical and legal persons differently. In the quantitative segment of this research, we use a robust database of the local government ordinances setting the key elements of the property tax in 138 of the 145 local government units in the country in 2014 and 2015, and analyse the coefficient of variation of the average prices of different categories of immovable property across zones. We then compare the values determined for 2015 with those set for 2014. Based on that research, we find relatively good performance of the current system of taxation of residential buildings, but also identify impermissible inconsistencies in determining the property tax base in the case of land (agricultural, forest and construction land), which undermines the principle of tax equity and limits the revenue potential of the property tax.
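
    As a toy illustration of the quantitative step described above (not the authors' dataset), the snippet below computes the coefficient of variation of the average zone prices reported by a handful of hypothetical municipalities, per property category, and compares two years.

```python
import numpy as np

# Hypothetical average prices (per m2) set by five municipalities for one zone.
prices = {
    "residential": {2014: [950, 1010, 880, 990, 1020],
                    2015: [960, 1005, 900, 995, 1030]},
    "construction land": {2014: [55, 140, 30, 210, 95],
                          2015: [60, 150, 25, 230, 90]},
}

def coefficient_of_variation(values):
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

for category, by_year in prices.items():
    cv_2014 = coefficient_of_variation(by_year[2014])
    cv_2015 = coefficient_of_variation(by_year[2015])
    print(f"{category}: CV 2014 = {cv_2014:.2f}, CV 2015 = {cv_2015:.2f}")

# A much larger CV for land than for residential buildings would mirror the
# kind of inconsistency the study reports.
```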

  11. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001) such as: individual covariates, Bayesian methods, and multi–state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi–state models. Until recently, the only goodness–of–fit procedures for multistate models were ad hoc, and non optimal, involving use of standard tests for single state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  12. Restoring depleted resources: Efficacy and mechanisms of change of an internet-based unguided recovery training for better sleep and psychological detachment from work.

    Science.gov (United States)

    Ebert, David D; Berking, Matthias; Thiart, Hanne; Riper, Heleen; Laferton, Johannes A C; Cuijpers, Pim; Sieland, Bernhard; Lehr, Dirk

    2015-12-01

    This randomized controlled trial evaluated the efficacy of an Internet-based intervention, which aimed to improve recovery from work-related strain in teachers with sleeping problems and work-related rumination. In addition, mechanisms of change were investigated. A sample of 128 teachers with elevated symptoms of insomnia (Insomnia Severity Index [ISI] ≥ 15) and work-related rumination (Cognitive Irritation Scale ≥ 15) was assigned to either an Internet-based recovery training (intervention condition [IC]) or to a waitlist control condition (CC). The IC consisted of 6 Internet-based sessions that aimed to promote healthy restorative behavior. Self-report data were assessed at baseline and again after 8 weeks. Additionally, a sleep diary was used starting 1 week before baseline and ending 1 week after postassessment. The primary outcome was insomnia severity. Secondary outcomes included perseverative cognitions (i.e., work-related rumination and worrying), a range of recovery measures and depression. An extended 6-month follow-up was assessed in the IC only. A serial multiple mediator analysis was carried out to investigate mechanisms of change. IC participants displayed a significantly greater reduction in insomnia severity (d = 1.37, 95% confidence interval: 0.99-1.77) than did participants of the CC. The IC was also superior with regard to changes in all investigated secondary outcomes. Effects were maintained until a naturalistic 6-month follow-up. Effects on insomnia severity were mediated by both a reduction in perseverative cognitions and sleep effort. Additionally, a greater increase in the number of recovery activities per week was found to be associated with lower perseverative cognitions, which in turn led to a greater reduction in insomnia severity. This study provides evidence for the efficacy of an unguided, Internet-based occupational recovery training and provides first evidence for a number of assumed mechanisms of change.

  13. Achieving process intensification from the application of a phenomena-based synthesis, design and intensification methodology

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Lutze, Philip; Woodley, John

    for the manufacture of methyl acetate by replacing with one single reactive distillation column the multi-step process which consisted of one reactor, extractive distillation, liquid-liquid separation and azeotropic distillation. However, except for reactive distillation and dividing wall columns, the implementation... of PI still faces challenges [2] because the identification and design of intensified processes is not simple [3]. Lutze et al. [3] have developed a systematic PI synthesis/design method at the unit operations (Unit-Ops) level, where the search space is based on a knowledge-base of existing PI equipment... Siirola [4] has proposed a task-based approach known as the means-ends analysis. A limitation of the means-ends analysis is that it becomes difficult to apply if too many corrective tasks must be identified and replaced and if too many alternatives must be considered. From the above PI methods

  14. The assessment of emotional intelligence: a comparison of performance-based and self-report methodologies.

    Science.gov (United States)

    Goldenberg, Irina; Matheson, Kimberly; Mantler, Janet

    2006-02-01

    We assessed the patterns of convergent validity for the Mayer-Salovey-Caruso Emotional Intelligence Test (Mayer, Salovey, & Caruso, 2002), a performance-based measure of emotional intelligence (EI) that entails presenting problems thought to have correct responses, and a self-report measure of EI (Schutte et al., 1998). The relations between EI and demographic characteristics of a diverse community sample (N = 223) concurred with previous research. However, the performance-based and self-report scales were not related to one another. Only self-reported EI scores showed a consistent pattern of relations with self-reported coping styles and depressive affect, whereas the performance-based measure demonstrated stronger relations with age, education, and receiving psychotherapy. We discuss implications for the validity of these measures and their utility.

  15. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds

    DEFF Research Database (Denmark)

    Munk, Patrick; Dalhoff Andersen, Vibe; de Knegt, Leonardo

    2016-01-01

    Objectives Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read...... on known antimicrobial consumption in 10 Danish integrated slaughter pig herds. In addition, we evaluated whether fresh or manure floor samples constitute suitable proxies for intestinal sampling, using cfu counting, qPCR and metagenomic shotgun sequencing. Results Metagenomic read-mapping outperformed...... cultivation-based techniques in terms of predicting expected tetracycline resistance based on antimicrobial consumption. Our metagenomic approach had sufficient resolution to detect antimicrobial-induced changes to individual resistance gene abundances. Pen floor manure samples were found to represent rectal...
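
    As a minimal, hypothetical sketch of the read-mapping quantification step (not the authors' pipeline), the snippet below turns mapped read counts per resistance gene into length-normalized abundances scaled per million bacterial reads, so that gene-level abundances can be compared across samples and summed per antimicrobial class. Gene names, lengths and counts are invented.

```python
# Hypothetical mapped-read counts and reference gene lengths (bp) for one sample.
mapped_reads = {"tet(M)": 1520, "tet(W)": 980, "erm(B)": 310}
gene_length = {"tet(M)": 1920, "tet(W)": 1920, "erm(B)": 738}
total_bacterial_reads = 18_500_000   # denominator used for normalization

def length_normalized_abundance(counts, lengths, total_reads):
    """Reads per kilobase of reference gene per million bacterial reads."""
    return {
        gene: counts[gene] / (lengths[gene] / 1_000) / (total_reads / 1_000_000)
        for gene in counts
    }

abundances = length_normalized_abundance(mapped_reads, gene_length,
                                          total_bacterial_reads)
for gene, value in sorted(abundances.items(), key=lambda kv: -kv[1]):
    print(f"{gene}: {value:.2f}")

# Summing the tetracycline-gene abundances per herd would give the quantity the
# study correlates with recorded tetracycline consumption.
```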

  16. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components
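
    To illustrate the kind of relationship a find-rate approach rests on (purely schematic; the report's actual equations, regions and parameters are not reproduced here), the sketch below models discoveries as an exponentially declining yield per new-field wildcat well and integrates it to project remaining discoverable resource.

```python
import numpy as np

# Schematic find-rate model: discoveries per wildcat well decline exponentially
# with cumulative wells drilled.  All numbers are invented.
f0 = 2.5e6        # initial find rate, barrels per wildcat well
decline = 4.0e-5  # decline constant per cumulative wildcat well
wells_drilled = 30_000        # historical cumulative wildcats
future_wells = 20_000         # wells assumed in the projection period

def cumulative_discoveries(wells):
    """Integral of f0 * exp(-decline * w) dw from 0 to `wells`."""
    return f0 / decline * (1.0 - np.exp(-decline * wells))

found_to_date = cumulative_discoveries(wells_drilled)
found_future = cumulative_discoveries(wells_drilled + future_wells) - found_to_date
ultimate = f0 / decline       # asymptotic discoverable resource

print(f"found to date: {found_to_date / 1e9:.1f} Gbbl")
print(f"projected new-field discoveries: {found_future / 1e9:.1f} Gbbl")
print(f"remaining after projection: {(ultimate - found_to_date - found_future) / 1e9:.1f} Gbbl")
```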

  17. Application of Microprocessor-Based Equipment in Nuclear Power Plants - Technical Basis for a Qualification Methodology

    International Nuclear Information System (INIS)

    Korsah, K.

    2001-01-01

    This document (1) summarizes the most significant findings of the "Qualification of Advanced Instrumentation and Control (I and C) Systems" program initiated by the Nuclear Regulatory Commission (NRC); (2) documents a comparative analysis of U.S. and European qualification standards; and (3) provides recommendations for enhancing regulatory guidance for environmental qualification of microprocessor-based safety-related systems. Safety-related I and C system upgrades of present-day nuclear power plants, as well as I and C systems of Advanced Light-Water Reactors (ALWRs), are expected to make increasing use of microprocessor-based technology. The Nuclear Regulatory Commission (NRC) recognized that the use of such technology may pose environmental qualification challenges different from current, analog-based I and C systems. Hence, it initiated the "Qualification of Advanced Instrumentation and Control Systems" program. The objectives of this confirmatory research project are to (1) identify any unique environmental-stress-related failure modes posed by digital technologies and their potential impact on the safety systems and (2) develop the technical basis for regulatory guidance using these findings. Previous findings from this study have been documented in several technical reports. This final report in the series documents a comparative analysis of two environmental qualification standards--Institute of Electrical and Electronics Engineers (IEEE) Std 323-1983 and International Electrotechnical Commission (IEC) 60780 (1998)--and provides recommendations for environmental qualification of microprocessor-based systems based on this analysis as well as on the findings documented in the previous reports. The two standards were chosen for this analysis because IEEE 323 is the standard used in the U.S. for the qualification of safety-related equipment in nuclear power plants, and IEC 60780 is its European counterpart. In addition, the IEC document was published in

  18. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We tested a new teaching and learning technology, a Computer Class Role Playing Game (RPG), to carry out educational activities in classrooms through an interactive game. This approach is new: some experience with educational games exists, but it is mainly individual rather than class-based. Playing together as a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we trialled the games in several classes of different grade levels, collecting specific questionnaires from teachers and pupils. Results: Experimental results were outstanding: the RPG activity exceeded by 50% the overall satisfaction reported for traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of the RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers on the efficacy of this new teaching methodology and the results achieved. Using a methodology closer to the students' point of view improves the innovation and creative capacities of learners, and supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game that projects onto a wall in the classroom an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game using their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being IN the adventure.

  19. A FEM based methodology to simulate multiple crack propagation in friction stir welds

    DEFF Research Database (Denmark)

    Lepore, Marcello; Carlone, Pierpaolo; Berto, Filippo

    2017-01-01

    In this work a numerical procedure, based on a finite element approach, is proposed to simulate multiple three-dimensional crack propagation in a welded structure. Cracks are introduced in a friction stir welded AA2024-T3 butt joint, affected by a process-induced residual stress scenario...

  20. Genome-based prediction of common diseases: Methodological considerations for future research

    NARCIS (Netherlands)

    A.C.J.W. Janssens (Cécile); P. Tikka-Kleemola (Päivi)

    2009-01-01

    textabstractThe translation of emerging genomic knowledge into public health and clinical care is one of the major challenges for the coming decades. At the moment, genome-based prediction of common diseases, such as type 2 diabetes, coronary heart disease and cancer, is still not informative. Our

  1. A Methodology for Integrating Tools in a Web-Based Environment

    National Research Council Canada - National Science Library

    Arslan, Musa

    2000-01-01

    ...." The Internet and the World Wide Web are getting more important and bigger than ever. Because of the increase in the importance of the Internet and the Web, migrating old applications and tools to a web-based environment is becoming more important...

  2. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    Science.gov (United States)

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  3. Methodology for Design of Web-Based Laparoscopy e-Training System

    Science.gov (United States)

    Borissova, Daniela; Mustakerov, Ivan

    2011-01-01

    The Web-based e-learning can benefit from the modern multimedia tools combined with network capabilities to overcome traditional education. The objective of this paper is focused on e-training system development to improve performance of theoretical knowledge and providing ample opportunity for practical attainment of manual skills in virtual…

  4. An Investigation of Image Fusion Algorithms using a Visual Performance-based Image Evaluation Methodology

    Science.gov (United States)

    2009-02-01

    Stathaki, Tania (2006). Adaptive image fusion using ICA bases. Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing. Li, H.; Manjunath, B. S.; Mitra, S. K. (1995). Multisensor image fusion using the wavelet transform. Graphical Models and Image Processing.

  5. Effectiveness and Accountability of the Inquiry-Based Methodology in Middle School Science

    Science.gov (United States)

    Hardin, Cade

    2009-01-01

    When teaching science, the time allowed for students to make discoveries on their own through the inquiry method directly conflicts with the mandated targets of a broad spectrum of curricula. Research shows that using an inquiry-based approach can encourage student motivation and increase academic achievement (Wolf & Fraser, 2008, Bryant, 2006,…

  6. A Museum in a Book: Teaching Culture through Decolonizing, Arts-Based Methodologies

    Science.gov (United States)

    Chappell, Sharon Verner; Chappell, Drew

    2011-01-01

    This paper explores the positivist, museum-based, and touristic constructions of indigenous cultures in the Americas, as represented in the DK "Eyewitness" series, and then overturns these constructions using an artist book created by the authors. In our analysis of the nonfiction series, we identified three trajectories: cataloguing, consignment…

  7. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    Science.gov (United States)

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that the advantages of Web-based surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)…

  8. The methodology of the pedagogical tests in computer science based on the integrated model

    Directory of Open Access Journals (Sweden)

    Дмитрий Владимирович Шойтов

    2011-09-01

    Full Text Available The article deals with the creation of pedagogical tests based on an integrated model. It describes an algorithm for the systematic design of test tasks used to assemble students' tests, evaluated by how successfully students master the course material.

  9. [Impact of diffusion of the methodology of evidence-based nursing through Facebook].

    Science.gov (United States)

    Santillán García, Azucena

    2013-05-01

    To evaluate the impact of disseminating the contents of the blog "Evidence-Based Nursing" through Facebook. Cross-sectional study carried out via a web link to an online survey (previously tested) sent to a population of 2132 Facebook profiles that had a "friendship" with the profile studied. The survey had 8 items, and a descriptive analysis of the variables was conducted using SPSS 19. 75.9% of the sample has a personal Facebook profile and 94.1% of cases are interested in evidence-based practice. 55.6% of the sample knows the blog; 46.5% answered that they read it occasionally, compared with 17.1% who read it regularly and 35.7% who say they do not read it. Of all readers, 75.75% say they have improved their knowledge of evidence-based practice after reading it. 88% said they did not follow the blog by any other means or social network; when they did, the most used were Google Reader, Twitter and Networked Blogs. Reading the contents of this blog improves the knowledge of evidence-based practice among the "friends" of the social profile, as they themselves report. The social profile thus appears adequate as a dissemination tool, although the functioning of social networks needs to be investigated in greater depth.

  10. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A.

    2007-01-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in health planning and programming. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. Knowledge of the composition of this reference population is based on sex, age, educational level, marital status and occupation, aiming to understand the relation between health-aggravating factors and these variables. The methodology is non-experimental research grounded in theoretical-methodological practice. The work has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  11. Range and Battery Depletion Concerns with Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Tomio Miwa

    2017-01-01

    Full Text Available This study investigates the effects of the range of a battery electric vehicle (EV) by using questionnaire data. The concern about battery depletion changes according to charging station deployment. Firstly, the methodology for deriving the probabilistic distribution of the daily travel distance is developed, which enables us to analyze people’s tolerance of the risk of battery depletion. Secondly, the desired range of an EV is modeled. This model considers the effect of changing charging station deployment and can analyze the variation in the desired range. Then, the intention of a household to purchase an EV is analyzed by incorporating range-related variables. The results show that people can live with a risk of battery depletion of around 2% to 5%. The deployment of charging stations at large retail facilities and/or workplace parking spaces reduces the desired range of an EV. Finally, the answers to the questionnaire show that the probability of battery depletion on a driving day has little effect on the intention to purchase an EV. Instead, people tend to evaluate the range by itself or directly compare it with their desired range.
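
    As a hedged illustration of the first step described in the abstract (deriving a daily-travel-distance distribution and reading off a battery-depletion risk), the sketch below fits a lognormal distribution to hypothetical daily distances and computes both the depletion probability for a given range and the range needed for a target risk; the distributional form and all numbers are assumptions, not the authors' model.

```python
import numpy as np
from scipy import stats

# Hypothetical daily travel distances (km) reported by one respondent.
daily_km = np.array([12, 25, 8, 40, 18, 60, 33, 15, 22, 95, 28, 11, 47, 19, 36])

# Fit a lognormal distribution (location fixed at zero).
shape, loc, scale = stats.lognorm.fit(daily_km, floc=0)
dist = stats.lognorm(shape, loc=loc, scale=scale)

# Probability that a driving day exceeds the usable EV range (battery depletion).
ev_range_km = 150.0
risk = dist.sf(ev_range_km)
print(f"P(daily distance > {ev_range_km:.0f} km) = {risk:.3%}")

# Range required so that the depletion risk stays within the tolerated 2-5%.
for tolerated in (0.05, 0.02):
    print(f"range for {tolerated:.0%} risk: {dist.isf(tolerated):.0f} km")
```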

  12. Development of depletion perturbation theory for a reactor nodal code

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1981-09-01

    A generalized depletion perturbation (DPT) theory formulation for light water reactor (LWR) depletion problems is developed and implemented into the three-dimensional LWR nodal code SIMULATE. This development applies the principles of the original derivation by M.L. Williams to the nodal equations solved by SIMULATE. The present formulation is first described in detail, and the nodal coupling methodology in SIMULATE is used to determine partial derivatives of the coupling coefficients. The modifications to the original code and the new DPT options available to the user are discussed. Finally, the accuracy and the applicability of the new DPT capability to LWR design analysis are examined for several LWR depletion test cases. The cases range from simple static cases to a realistic PWR model for an entire fuel cycle. Responses of interest included k-eff, nodal peaking, and peak nodal exposure. The nonlinear behavior of responses with respect to perturbations of the various types of cross sections was also investigated. The time-dependence of the sensitivity coefficients for different responses was examined and compared. Comparison of DPT results for these examples to direct calculations reveals the limited applicability of depletion perturbation theory to LWR design calculations at the present. The reasons for these restrictions are discussed, and several methods which might improve the computational accuracy of DPT are proposed for future research.
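
    To show what a first-order sensitivity-based estimate looks like in practice (a generic illustration only; the DPT formulation implemented in SIMULATE involves adjoint flux and nuclide-field solutions that are not reproduced here), the snippet below combines pre-computed relative sensitivity coefficients with cross-section perturbations and contrasts the linear estimate with a hypothetical direct recalculation.

```python
# First-order perturbation estimate: dR/R ~= sum_i S_i * (dsigma_i / sigma_i),
# where S_i are relative sensitivity coefficients of a response R (e.g. k-eff
# or peak nodal exposure) to cross sections sigma_i.  All values are invented.
sensitivities = {            # S_i = (sigma_i / R) * dR/dsigma_i
    "U-235 fission": 0.35,
    "U-238 capture": -0.22,
    "B-10 absorption": -0.08,
}
perturbations = {            # relative perturbations dsigma_i / sigma_i
    "U-235 fission": 0.02,
    "U-238 capture": 0.05,
    "B-10 absorption": -0.10,
}

linear_estimate = sum(s * perturbations[name] for name, s in sensitivities.items())
print(f"first-order dR/R = {linear_estimate:+.4f}")

# A direct recalculation of the depleted core would give the reference answer;
# the gap between the two is the nonlinearity the report investigates.
direct_recalculation = 0.0041          # hypothetical reference value
print(f"nonlinear residual = {direct_recalculation - linear_estimate:+.4f}")
```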

  13. Methodological Overview of an African American Couple-Based HIV/STD Prevention Trial

    Science.gov (United States)

    2010-01-01

    Objective To provide an overview of the NIMH Multisite HIV/STD Prevention Trial for African American Couples conducted in four urban areas: Atlanta, Los Angeles, New York, and Philadelphia. The rationale, study design methods, proposed data analyses, and study management are described. Design This is a two-arm randomized trial, implementing a modified randomized block design, to evaluate the efficacy of a couples-based intervention designed for HIV-serodiscordant African American couples. Methods The study phases consisted of formative work, pilot studies, and a randomized clinical trial. The sample is 535 HIV-serodiscordant heterosexual African American couples. There are two theoretically derived behavioral interventions with eight group and individual sessions: the Eban HIV/STD Risk Reduction Intervention (treatment) versus the Eban Health Promotion Intervention (control). The treatment intervention was couples-based and focused on HIV/STD risk reduction, while the control was individual-based and focused on health promotion. The two study conditions were structurally similar in length and types of activities. At baseline, participants completed an Audio Computer-Assisted Self-Interview (ACASI) as well as an interviewer-administered questionnaire, and provided biological specimens to assess for STDs. Similar follow-up assessments were conducted immediately after the intervention, at 6 months, and at 12 months. Results The trial results will be analyzed across the four sites by randomization assignment. Generalized estimating equations (GEE) and mixed effects modeling (MEM) are planned to test: (1) the effects of the intervention on STD incidence and condom use as well as on mediator variables of these outcomes, and (2) whether the effects of the intervention differ depending on key moderator variables (e.g., gender of the HIV-seropositive partners, length of relationship, psychological distress, sexual abuse history, and substance abuse history

  14. Efficient use of design-based binning methodology in a DRAM fab

    Science.gov (United States)

    Karsenti, Laurent; Wehner, Arno; Fischer, Andreas; Seifert, Uwe; Goeckeritz, Jens; Geshel, Mark; Gscheidlen, Dieter; Bartov, Avishai

    2009-03-01

    It is a well-established fact that as design rules and printed features shrink, sophisticated techniques are required to ensure the design intent is indeed printed on the wafer. Techniques of this kind are Optical Proximity Correction (OPC), Resolution Enhancement Techniques (RET) and Design for Manufacturing (DFM). As these methods are applied to the overall chip and rely on complex modeling and simulations, they increase the risk of creating local areas or layouts with a limiting process window. Hence, it is necessary to verify the manufacturability (sufficient depth of focus) of the overall die and not only of a pre-defined set of metrology structures. The verification process is commonly based on full-chip defect density inspection of a Focus Exposure Matrix (FEM) wafer, combined with appropriate post-processing of the inspection data. This is necessary to avoid a time-consuming search for the Defects of Interest (DOIs), as defect counts are usually too high to be handled by manual SEM review. One way to post-process defect density data is so-called design-based binning (DBB). The Litho Qualification Monitor (LQM) system allows defects to be classified and also binned based on design information. In this paper we present an efficient way to combine classification and binning in order to check design rules and to determine the marginal features (layouts with low depth of focus). The design-based binning has been connected to the Yield Management System (YMS) to allow new process monitoring approaches towards design-based SPC. This could dramatically cut the time to detect systematic defects inline.
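
    As a simplified, hypothetical illustration of design-based binning on FEM inspection data (not the LQM system's actual algorithm), the sketch below groups defects by the design clip they land on and estimates each clip's usable focus window from the focus columns in which it remained defect-free; the clips with the smallest window are the marginal features to route to SEM review.

```python
from collections import defaultdict

# Focus settings (um) stepped across the FEM wafer, and hypothetical defects
# reported by inspection as (design_clip_id, focus_at_that_field).
focus_settings = [-0.15, -0.10, -0.05, 0.0, 0.05, 0.10, 0.15]
defects = [("clipA", -0.15), ("clipA", 0.15), ("clipA", 0.10),
           ("clipB", -0.15), ("clipB", -0.10), ("clipB", 0.10),
           ("clipB", 0.15), ("clipC", 0.15)]

failing = defaultdict(set)
for clip, focus in defects:
    failing[clip].add(focus)

step = focus_settings[1] - focus_settings[0]
for clip in sorted(failing):
    good = [f for f in focus_settings if f not in failing[clip]]
    dof = len(good) * step                 # crude depth-of-focus estimate
    print(f"{clip}: defect-free focus window ~ {dof:.2f} um")

# The clip with the narrowest window ("clipB" here) would be flagged as a
# marginal feature and fed back to design / routed to SEM review.
```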

  15. Influence of phosphoproteins' biomimetic analogs on remineralization of mineral-depleted resin-dentin interfaces created with ion-releasing resin-based systems.

    Science.gov (United States)

    Sauro, Salvatore; Osorio, Raquel; Watson, Timothy F; Toledano, Manuel

    2015-07-01

    The study aimed at evaluating the remineralization of acid-etched dentin pre-treated with primers containing biomimetic analogs and bonded using an ion-releasing light-curable resin-based material. An experimental etch-and-rinse adhesive system filled with Ca(2+), PO4(3-)-releasing Ca-Silicate micro-fillers was created along with two experimental primers containing biomimetic analogs such as sodium trimetaphosphate (TMP) and/or polyaspartic acid (PLA). Dentin specimens etched with 37% H3PO4 were pre-treated with two different aqueous primers containing the polyanionic biomimetic analogs or deionized water and subsequently bonded using the experimental resin-based materials. The specimens were sectioned and analyzed by AFM/nanoindentation to evaluate changes in the modulus of elasticity (Ei) across the resin-dentin interface at different AS storage periods (up to 90 days). Raman cluster analysis was also performed to evaluate the chemical changes along the interface. The phosphate uptake by the acid-etched dentin was evaluated using ATR-FTIR. Additional resin-dentin specimens were tested for microtensile bond strength. SEM examination was performed after de-bonding, while confocal laser microscopy was used to evaluate the interface ultramorphology and micropermeability. Both biomimetic primers induced phosphate uptake by acid-etched dentin. Specimens created with the ion-releasing resin in combination with the pre-treatment primers containing either PLA or TMP showed the greatest recovery of the Ei of the hybrid layer, with no decrease in μTBS (p>0.05) after 3-month AS storage. The ion-releasing resin applied after use of the biomimetic primers showed the greatest reduction in micropermeability due to mineral precipitation; these results were confirmed using SEM. The use of the ion-releasing resin-based system applied to acid-etched dentin pre-treated with biomimetic primers containing analogs of phosphoproteins such as poly-l-aspartic acid and/or sodium

  16. Challenges in implementing a Planetary Boundaries based Life-Cycle Impact Assessment methodology

    DEFF Research Database (Denmark)

    Ryberg, Morten; Owsianiak, Mikolaj; Richardson, Katherine

    2016-01-01

    Impacts on the environment from human activities are now threatening to exceed thresholds for central Earth System processes, potentially moving the Earth System out of the Holocene state. To avoid such consequences, the concept of Planetary Boundaries was defined in 2009, and updated in 2015......, for a number of processes which are essential for maintaining the Earth System in its present state. Life-Cycle Assessment was identified as a suitable tool for linking human activities to the Planetary Boundaries. However, to facilitate proper use of Life-Cycle Assessment for non-global environmental...... management based on the Planetary Boundaries, there is a need for linking non-global activities to impacts on a planetary level. In this study, challenges related to development and operationalization of a Planetary Boundary based Life-Cycle Impact Assessment method are identified and the feasibility...

  17. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable...... for different engineering applications, and few of those are available in a not yet polymerised form suitable for resin infusion. The proper selection of a new resin system among these thermoplastic polymers is a concern for manufactures in the current scenario and a special mathematical tool would...... be beneficial. In this paper, the authors introduce a new decision making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for vacuum infusion process available in today’s market. An illustrative example—resin selection...

  18. Neurorehabilitation using the virtual reality based Rehabilitation Gaming System: methodology, design, psychometrics, usability and validation.

    Science.gov (United States)

    Cameirão, Mónica S; Badia, Sergi Bermúdez I; Oller, Esther Duarte; Verschure, Paul F M J

    2010-09-22

    Stroke is a frequent cause of adult disability that can lead to enduring impairments. However, given the life-long plasticity of the brain one could assume that recovery could be facilitated by the harnessing of mechanisms underlying neuronal reorganization. Currently it is not clear how this reorganization can be mobilized. Novel technology based neurorehabilitation techniques hold promise to address this issue. Here we describe a Virtual Reality (VR) based system, the Rehabilitation Gaming System (RGS) that is based on a number of hypotheses on the neuronal mechanisms underlying recovery, the structure of training and the role of individualization. We investigate the psychometrics of the RGS in stroke patients and healthy controls. We describe the key components of the RGS and the psychometrics of one rehabilitation scenario called Spheroids. We performed trials with 21 acute/subacute stroke patients and 20 healthy controls to study the effect of the training parameters on task performance. This allowed us to develop a Personalized Training Module (PTM) for online adjustment of task difficulty. In addition, we studied task transfer between physical and virtual environments. Finally, we assessed the usability and acceptance of the RGS as a rehabilitation tool. We show that the PTM implemented in RGS allows us to effectively adjust the difficulty and the parameters of the task to the user by capturing specific features of the movements of the arms. The results reported here also show a consistent transfer of movement kinematics between physical and virtual tasks. Moreover, our usability assessment shows that the RGS is highly accepted by stroke patients as a rehabilitation tool. We introduce a novel VR based paradigm for neurorehabilitation, RGS, which combines specific rehabilitative principles with a psychometric evaluation to provide a personalized and automated training. Our results show that the RGS effectively adjusts to the individual features of the user

  19. Neurorehabilitation using the virtual reality based Rehabilitation Gaming System: methodology, design, psychometrics, usability and validation

    Directory of Open Access Journals (Sweden)

    Verschure Paul FMJ

    2010-09-01

    Full Text Available Abstract Background Stroke is a frequent cause of adult disability that can lead to enduring impairments. However, given the life-long plasticity of the brain one could assume that recovery could be facilitated by the harnessing of mechanisms underlying neuronal reorganization. Currently it is not clear how this reorganization can be mobilized. Novel technology based neurorehabilitation techniques hold promise to address this issue. Here we describe a Virtual Reality (VR based system, the Rehabilitation Gaming System (RGS that is based on a number of hypotheses on the neuronal mechanisms underlying recovery, the structure of training and the role of individualization. We investigate the psychometrics of the RGS in stroke patients and healthy controls. Methods We describe the key components of the RGS and the psychometrics of one rehabilitation scenario called Spheroids. We performed trials with 21 acute/subacute stroke patients and 20 healthy controls to study the effect of the training parameters on task performance. This allowed us to develop a Personalized Training Module (PTM for online adjustment of task difficulty. In addition, we studied task transfer between physical and virtual environments. Finally, we assessed the usability and acceptance of the RGS as a rehabilitation tool. Results We show that the PTM implemented in RGS allows us to effectively adjust the difficulty and the parameters of the task to the user by capturing specific features of the movements of the arms. The results reported here also show a consistent transfer of movement kinematics between physical and virtual tasks. Moreover, our usability assessment shows that the RGS is highly accepted by stroke patients as a rehabilitation tool. Conclusions We introduce a novel VR based paradigm for neurorehabilitation, RGS, which combines specific rehabilitative principles with a psychometric evaluation to provide a personalized and automated training. Our results show that the

  20. Forecasting Electricity Market Risk Using Empirical Mode Decomposition (EMD)—Based Multiscale Methodology

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2016-11-01

    Full Text Available The electricity market has experienced an increasing level of deregulation and reform over the years. There is an increasing level of electricity price fluctuation, uncertainty, and risk exposure in the marketplace. Traditional risk measurement models based on the homogeneous and efficient market assumption no longer suffice, given the increasing accuracy and reliability requirements. In this paper, we propose a new Empirical Mode Decomposition (EMD)-based Value at Risk (VaR) model to estimate the downside risk measure in the electricity market. The proposed model investigates and models the inherent multiscale market risk structure. The EMD model is introduced to decompose the electricity time series into several Intrinsic Mode Functions (IMFs) with distinct multiscale characteristics. The Exponential Weighted Moving Average (EWMA) model is used to model the individual risk factors across different scales. Experimental results using different models in the Australian electricity markets show that the EMD-EWMA model based on Student’s t distribution achieves the best performance, and outperforms the benchmark EWMA model significantly in terms of model reliability and predictive accuracy.
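
    To make the multiscale risk pipeline concrete, the sketch below decomposes a return series into IMFs, tracks an EWMA variance for each IMF, and aggregates them into a one-step-ahead VaR using a Student's t quantile. It assumes the third-party PyEMD package (pip package EMD-signal), treats the IMFs as independent, and fixes the smoothing constant and degrees of freedom arbitrarily, so it illustrates the idea rather than the calibrated model of the paper.

```python
import numpy as np
from scipy.stats import t
from PyEMD import EMD           # third-party package "EMD-signal" (assumed installed)

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=500) * 0.02   # stand-in for electricity returns

# 1. Decompose the return series into Intrinsic Mode Functions.
imfs = EMD().emd(returns)                          # shape: (n_imfs, len(returns))

# 2. EWMA variance for each IMF (RiskMetrics-style smoothing constant).
lam = 0.94
def ewma_variance(series, lam):
    var = np.var(series[:30])                      # warm-up estimate
    for x in series[30:]:
        var = lam * var + (1.0 - lam) * x**2
    return var

scale_vars = np.array([ewma_variance(imf, lam) for imf in imfs])

# 3. Aggregate scales (independence assumed) and convert to a 95% one-day VaR
#    using a unit-variance Student's t quantile, the distribution the paper
#    found to perform best.
sigma = np.sqrt(scale_vars.sum())
df = 5.0
q = t.ppf(0.05, df) * np.sqrt((df - 2.0) / df)
var_95 = -q * sigma
print(f"one-day 95% VaR estimate: {var_95:.4f}")
```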