WorldWideScience

Sample records for based depletion methodology

  1. Validation of a Monte Carlo Based Depletion Methodology Using HFIR Post-Irradiation Measurements

    Energy Technology Data Exchange (ETDEWEB)

Chandler, David [ORNL]; Maldonado, G Ivan [ORNL]; Primm, Trent [ORNL]

    2009-11-01

    Post-irradiation uranium isotopic atomic densities within the core of the High Flux Isotope Reactor (HFIR) were calculated and compared to uranium mass spectrographic data measured in the late 1960s and early 70s [1]. This study was performed in order to validate a Monte Carlo based depletion methodology for calculating the burn-up dependent nuclide inventory, specifically the post-irradiation uranium

  2. Development of a practical fuel management system for PSBR based on advanced three-dimensional Monte Carlo coupled depletion methodology

    Science.gov (United States)

    Tippayakul, Chanatip

The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: model and method development, and analysis and validation of the developed models and methods. The starting point of this research was the use of the earlier developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (end of the core loading). When the normalized power results of the Monte Carlo model were compared to those of the current fuel management system (using HELIOS/ADMARC-H), they were found to agree reasonably well (within 2%--3% difference on average). Moreover, the reactivity of some fuel elements was calculated with the Monte Carlo model and compared with measured data; the fuel element reactivity results of the Monte Carlo model were in good agreement with the measurements. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in individual control rod worths between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities in order to validate the Monte Carlo model against the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capability, its accuracy could be further improved by adopting more advanced algorithms. Therefore, TRIGSIM was planned to be upgraded. The first task of upgrading TRIGSIM involved improvement of the temperature modeling capability. The new TRIGSIM was

  3. VERA Core Simulator Methodology for PWR Cycle Depletion

    Energy Technology Data Exchange (ETDEWEB)

Kochunas, Brendan [University of Michigan]; Collins, Benjamin S [ORNL]; Jabaay, Daniel [University of Michigan]; Kim, Kang Seog [ORNL]; Graham, Aaron [University of Michigan]; Stimpson, Shane [University of Michigan]; Wieselquist, William A [ORNL]; Clarno, Kevin T [ORNL]; Palmtag, Scott [Core Physics, Inc.]; Downar, Thomas [University of Michigan]; Gehin, Jess C [ORNL]

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  4. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

Full Text Available Outsourcing is investigated, within the institutional theory, as a way for a public corporation to find stable and unique competitive advantages by attracting the carriers of unique intellectual capital and using the social capital of specialized companies. Key researchers and events in the history of outsourcing are identified, and existing approaches to defining the concept of outsourcing, as well as the advantages and risks of applying outsourcing technology, are considered. It is established that the differences between outsourcing, sub-contracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is treated as part of the methodology of cooperation of enterprise innovative structures in the emerging knowledge economy sector.

  5. INPRO Methodology for Sustainability Assessment of Nuclear Energy Systems: Environmental Impact from Depletion of Resources

    International Nuclear Information System (INIS)

INPRO is an international project to help ensure that nuclear energy is available to contribute in a sustainable manner to meeting the energy needs of the 21st century. A basic principle of INPRO in the area of environmental impact from depletion of resources is that a nuclear energy system will be capable of contributing to meeting the energy needs of the 21st century while making efficient use of the non-renewable resources needed for construction, operation and decommissioning. Recognizing that a national nuclear energy programme in a given country may be based both on indigenous resources and on resources purchased from abroad, this publication provides background material and summarizes the results of international global resource availability studies that could contribute to the corresponding national assessments.

  6. Depletion methodology in the 3-D whole core transport code DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, Jin Young; Zee, Sung Quun

    2005-02-01

The three-dimensional whole-core transport code DeCART has been developed to include the characteristics of a numerical reactor and thus partly replace experiments. The code adopts a deterministic method for simulating neutron behavior with the fewest assumptions and approximations. This neutronics code is also coupled with a thermal-hydraulic (CFD) code and a thermo-mechanical code to simulate the combined effects. A depletion module has been implemented in the DeCART code to predict the depleted composition of the fuel. The matrix exponential method of ORIGEN-2 has been used for the depletion calculation. The library, including decay constants, yield matrices and other data, has been greatly simplified for calculation efficiency. This report summarizes the theoretical background and includes verification of the depletion module in DeCART through benchmark calculations.
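
    To make the depletion step concrete, the following minimal sketch (not taken from the DeCART report) solves a toy three-nuclide transmutation and decay chain over one burnup step with the matrix exponential, the same solution principle the abstract attributes to ORIGEN-2; the decay constants, cross sections and flux are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

# Toy 3-nuclide chain under constant flux: dN/dt = M N, advanced over one burnup
# step with the matrix exponential (the solution principle used by ORIGEN-type solvers).
lam1, lam2 = 1.0e-5, 2.0e-6      # assumed decay constants [1/s]
sig1, sig2 = 5.0e-24, 5.0e-23    # assumed one-group capture cross sections [cm^2]
phi = 3.0e14                     # assumed scalar flux [n/cm^2/s]

M = np.array([
    [-(lam1 + sig1 * phi),                 0.0, 0.0],   # parent: lost by decay + capture
    [  lam1 + sig1 * phi, -(lam2 + sig2 * phi), 0.0],   # daughter: produced from parent
    [                 0.0,   lam2 + sig2 * phi, 0.0],   # granddaughter: treated as stable
])

N0 = np.array([1.0e22, 0.0, 0.0])   # initial number densities [atoms/cm^3]
dt = 30 * 24 * 3600.0               # one 30-day depletion step [s]

N_end = expm(M * dt) @ N0           # number densities at the end of the step
print(N_end)
```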

  7. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Almeida, João Paolo A.; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  8. Depleting methyl bromide residues in soil by reaction with bases

    Science.gov (United States)

    Despite generally being considered the most effective soil fumigant, methyl bromide (MeBr) use is being phased out because its emissions from soil can lead to stratospheric ozone depletion. However, a large amount is still currently used due to Critical Use Exemptions. As strategies for reducing the...

  9. Evaluation of acute tryptophan depletion and sham depletion with a gelatin-based collagen peptide protein mixture

    DEFF Research Database (Denmark)

    Stenbæk, D S; Einarsdottir, H S; Goregliad-Fjaellingsdal, T;

    2016-01-01

    Acute Tryptophan Depletion (ATD) is a dietary method used to modulate central 5-HT to study the effects of temporarily reduced 5-HT synthesis. The aim of this study is to evaluate a novel method of ATD using a gelatin-based collagen peptide (CP) mixture. We administered CP-Trp or CP+Trp mixtures ...... effects of CP-Trp compared to CP+Trp were observed. The transient increase in plasma Trp after CP+Trp may impair comparison to the CP-Trp and we therefore recommend in future studies to use a smaller dose of Trp supplement to the CP mixture....

  10. Satellite-based estimates of groundwater depletion in India

    OpenAIRE

    Rodell, M.; Velicogna, I; Famiglietti, JS

    2009-01-01

    Groundwater is a primary source of fresh water in many parts of the world. Some regions are becoming overly dependent on it, consuming groundwater faster than it is naturally replenished and causing water tables to decline unremittingly. Indirect evidence suggests that this is the case in northwest India, but there has been no regional assessment of the rate of groundwater depletion. Here we use terrestrial water storage-change observations from the NASA Gravity Recovery and Climate Experimen...

  11. DKBLM—Deep Knowledge Based Learning Methodology

    Institute of Scientific and Technical Information of China (English)

    马志方

    1993-01-01

To solve the Imperfect Theory Problem (ITP) faced by Explanation Based Generalization (EBG), this paper proposes a methodology, Deep Knowledge Based Learning Methodology (DKBLM) by name, and gives an implementation of DKBLM, called Hierarchically Distributed Learning System (HDLS). As an example of HDLS's application, this paper shows a learning system (MLS) in the meteorology domain and its running on a simplified example. DKBLM can acquire experiential knowledge with causality in it. It is applicable to domains in which experiments are relatively difficult to carry out, and in which there exist many available knowledge systems at different levels for the same domain (such as weather forecasting).

  12. Performance-based asphalt mixture design methodology

    Science.gov (United States)

    Ali, Al-Hosain Mansour

    performance based design procedure. Finally, the developed guidelines with easy-to-use flow charts for the integrated mix design methodology are presented.

  13. Project management methodology based on the PMI

    OpenAIRE

    Lukyanchenko, V. P.; Kizeev, Veniamin Mikhailovich; Nikolaenko, Nina Aleksandrovna; Лукьянченко, В. П.; Кизеев, Вениамин Михайлович; Николаенко, Нина Александровна

    2015-01-01

Project management is a tool to improve business efficiency and bring maximum impact, which is why every organization should choose the project management methodology that best suits the specifics of its business. This article describes the Project Management Institute and investigates the PMI methodology. The paper defines the main objectives of this kind of methodology and considers its structure, revealing the effectiveness of the methodology in PMI project management. Consider the fi...

  14. Methodology base and problems of information technologies

    Science.gov (United States)

    Sovetov, Boris Y.

    1993-04-01

The aim of any information technology is the qualitative formation and effective use of an information product. Information technology as a system provides both computer-aided problem solving for the user and automation of the information processes that in turn support the problem-solving process. That is why information technology methods are methods for data transmission, processing, and storage. The tools of information technology are methodological, mathematical, algorithmic, hardware, software, and information tools. We propose to distinguish between global, basic, and applied information technologies, depending on the significance of the information product and the characteristics of the models, methods, and tools used. Global technology is aimed at the use of information resources in the social sphere as a whole. Basic technology is oriented towards an application sphere (industry, scientific research, design, training). Transition towards new information technology should involve a business-area model merged with a formal model of problem solving: computing organization based on the data concept, and development of the user's intellectual interface.

  15. Case Based Reasoning: Case Representation Methodologies

    Directory of Open Access Journals (Sweden)

    Shaker H. El-Sappagh

    2015-11-01

Full Text Available Case Based Reasoning (CBR) is an important technique in artificial intelligence, which has been applied to various kinds of problems in a wide range of domains. Selecting a case representation formalism is critical for the proper operation of the overall CBR system. In this paper, we survey and evaluate the existing case representation methodologies. Moreover, case retrieval and future challenges for effective CBR are explained. Case representation methods are grouped into knowledge-intensive approaches and traditional approaches; the first group outweighs the second. Knowledge-intensive methods depend on ontology and enhance all CBR processes, including case representation, retrieval, storage, and adaptation. Using a proposed set of qualitative metrics, the existing ontology-based methods for case representation are studied and evaluated in detail. All these systems have limitations; no approach exceeds 53% of the specified metrics. The results of the survey explain the current limitations of CBR systems and show that ontology usage in case representation needs improvement to achieve semantic representation and semantic retrieval in CBR systems.

  16. Methodologies for Crawler Based Web Surveys.

    Science.gov (United States)

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  17. Measurement-based experimental research methodology

    OpenAIRE

    Papadimitriou D.; Fabrega L.; Vila P.; Careglio D.; Demeester P.

    2013-01-01

Aiming at creating a dynamic between elaboration, realization, and validation by means of iterative cycles of experimentation, Future Internet Research and Experimentation (FIRE) projects have been rapidly confronted with the lack of a systematic experimental research methodology. Moreover, the “validation by experimentation” objective involves a broad spectrum of experimentation tools ranging from simulation to field trial prototypes together with their associated measurement tools. As experimen...

  18. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  19. Low loss and high speed silicon optical modulator based on a lateral carrier depletion structure.

    Science.gov (United States)

    Marris-Morini, Delphine; Vivien, Laurent; Fédéli, Jean Marc; Cassan, Eric; Lyan, Philippe; Laval, Suzanne

    2008-01-01

    A high speed and low loss silicon optical modulator based on carrier depletion has been made using an original structure consisting of a p-doped slit embedded in the intrinsic region of a lateral pin diode. This design allows a good overlap between the optical mode and carrier density variations. Insertion loss of 5 dB has been measured with a contrast ratio of 14 dB for a 3 dB bandwidth of 10 GHz. PMID:18521165

  20. School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.

    Science.gov (United States)

    Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others

    1998-01-01

    Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…

  1. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  2. An ontological case base engineering methodology for diabetes management.

    Science.gov (United States)

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

Ontology engineering covers issues related to ontology development and use. In a Case Based Reasoning (CBR) system, ontology plays two main roles: the first as a case base and the second as a domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on the research of case representation in the form of ontology to support semantic case retrieval and enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  3. Regression-Based Artificial Neural Network Methodology in Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    何桢; 肖粤翔

    2004-01-01

Response surface methodology (RSM) is an important tool for process parameter optimization, robust design and other quality improvement efforts. When the relationship between the influential input variables and the output response is very complex, it is hard to find the real response surface using RSM. In recent years, artificial neural networks (ANN) have been used in RSM, but the classical ANN does not work well under the constraints of real applications. An algorithm of regression-based ANN (R-ANN) is proposed in this paper as a supplement to the classical ANN methodology. It brings the network closer to the response surface, so that training time is reduced and robustness is strengthened. The procedure of improving the ANN by regressions is described, and comparisons among R-ANN, RSM and the classical ANN are presented graphically in three examples. Our research shows that the R-ANN methodology is a good supplement to the RSM and classical ANN methodologies, which can yield a lower standard error of prediction when the scope of the experiment is rigidly restricted.
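
    As a point of reference for the classical RSM half of this comparison, the sketch below (not from the paper) fits a second-order response surface to a toy two-factor designed experiment with ordinary least squares and locates its stationary point; the response function and noise level are assumptions.

```python
import numpy as np

# Classical second-order response surface fit on a toy two-factor designed experiment.
rng = np.random.default_rng(0)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 10 - 2*(x1 - 0.3)**2 - 3*(x2 + 0.2)**2 + rng.normal(0, 0.1, x1.size)  # assumed response

# y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2, fitted by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted quadratic: gradient b + 2 B x = 0.
b = beta[1:3]
B = np.array([[beta[3], beta[5] / 2],
              [beta[5] / 2, beta[4]]])
x_stationary = np.linalg.solve(2 * B, -b)
print("fitted optimum near", x_stationary)   # should be close to (0.3, -0.2)
```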

  4. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  5. The Impact of an Ego Depletion Manipulation on Performance-Based and Self-Report Assessment Measures.

    Science.gov (United States)

    Charek, Daniel B; Meyer, Gregory J; Mihura, Joni L

    2016-10-01

    We investigated the impact of ego depletion on selected Rorschach cognitive processing variables and self-reported affect states. Research indicates acts of effortful self-regulation transiently deplete a finite pool of cognitive resources, impairing performance on subsequent tasks requiring self-regulation. We predicted that relative to controls, ego-depleted participants' Rorschach protocols would have more spontaneous reactivity to color, less cognitive sophistication, and more frequent logical lapses in visualization, whereas self-reports would reflect greater fatigue and less attentiveness. The hypotheses were partially supported; despite a surprising absence of self-reported differences, ego-depleted participants had Rorschach protocols with lower scores on two variables indicative of sophisticated combinatory thinking, as well as higher levels of color receptivity; they also had lower scores on a composite variable computed across all hypothesized markers of complexity. In addition, self-reported achievement striving moderated the effect of the experimental manipulation on color receptivity, and in the Depletion condition it was associated with greater attentiveness to the tasks, more color reactivity, and less global synthetic processing. Results are discussed with an emphasis on the response process, methodological limitations and strengths, implications for calculating refined Rorschach scores, and the value of using multiple methods in research and experimental paradigms to validate assessment measures. PMID:26002059

  6. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  7. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...
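
    As an illustration of the Kalman-filter-based remaining-useful-life idea described in these two records, the following sketch tracks capacitor equivalent series resistance (ESR) with a simple constant-rate state-space model and extrapolates to a failure threshold; the degradation model, noise levels and threshold are assumptions, not NASA's empirical model.

```python
import numpy as np

# Track capacitor ESR with a [value, rate] Kalman filter, then extrapolate to an
# assumed end-of-life threshold to get a remaining-useful-life (RUL) estimate.
rng = np.random.default_rng(1)
true_rate, esr0, threshold = 0.002, 0.10, 0.25          # assumed values [ohm/h, ohm, ohm]
hours = np.arange(0.0, 500.0, 10.0)
meas = esr0 + true_rate * hours + rng.normal(0, 0.005, hours.size)  # synthetic measurements

x = np.array([0.1, 0.0])                 # state: [ESR, growth rate]
P = np.eye(2)
Q = np.diag([1e-7, 1e-9])                # assumed process noise
R = 0.005 ** 2                           # assumed measurement noise
H = np.array([[1.0, 0.0]])

for k in range(1, hours.size):
    dt = hours[k] - hours[k - 1]
    F = np.array([[1.0, dt], [0.0, 1.0]])
    x = F @ x                             # predict
    P = F @ P @ F.T + Q
    y = meas[k] - H @ x                   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T / S                       # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P

rul = (threshold - x[0]) / x[1] if x[1] > 0 else float("inf")
print(f"ESR={x[0]:.3f} ohm, rate={x[1]:.5f} ohm/h, RUL ~ {rul:.0f} h")
```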

  8. Design Methodology for Self-organized Mobile Networks Based

    Directory of Open Access Journals (Sweden)

    John Petearson Anzola

    2016-06-01

Full Text Available The methodology proposed in this article enables the systematic design of routing algorithms based on biclustering schemes, which makes it possible to respond with timely techniques, with clustering heuristics proposed by a researcher, and with a routing approach focused on the choice of clusterhead nodes. The process uses heuristics aimed at improving the different communication costs within surface groups called biclusters. The methodology globally covers a variety of clustering techniques and heuristics that have been addressed in routing algorithms, although not all possible alternatives and their different assessments have been explored. Therefore, design research on routing algorithms based on biclustering schemes, oriented by this methodology, will allow new concepts of evolutionary routing along with the ability to adapt to the topological changes that occur in self-organized data networks.

  9. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

This paper presents a knowledge-based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection (ICRP) Report No. 37 is employed within a knowledge-based framework to plan maintenance activities while optimizing radiation protection.1,2 The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs
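
    A minimal sketch of the ICRP-37-style cost-benefit balance mentioned in the abstract: choose the ventilation flow rate that minimizes protection cost plus the monetary value of collective dose. The cost functions, dose model and alpha value below are illustrative assumptions, not values from the paper.

```python
from scipy.optimize import minimize_scalar

# ICRP-37-style balance: minimise protection cost X(q) plus the monetary value of
# collective dose, alpha * S(q), over the ventilation flow rate q.
alpha = 100_000.0                                  # assumed $ per person-Sv

def protection_cost(q):                            # HVAC cost rises with flow q [m^3/s]
    return 2_000.0 * q + 500.0 * q ** 2

def collective_dose(q):                            # airborne dose falls as flow dilutes activity
    return 5.0 / (1.0 + q)                         # person-Sv over the planning period

total_cost = lambda q: protection_cost(q) + alpha * collective_dose(q)
res = minimize_scalar(total_cost, bounds=(0.1, 50.0), method="bounded")
print(f"optimal flow ~ {res.x:.1f} m^3/s, total cost ~ ${res.fun:,.0f}")
```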

  10. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    Science.gov (United States)

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  11. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John;

    2013-01-01

Process intensification (PI) has the potential to improve existing as well as conceptual processes, in order to achieve a more sustainable production. PI can be achieved at different levels, that is, the unit operation, functional and/or phenomena level. The highest impact is expected by looking at processes at the lowest level of aggregation, which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well...

  12. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is able to overcome most of the difficulties associated with the solution of mixture design problems. The new methodology has been illustrated with the help of a case study involving the design of solvent-anti-solvent binary mixtures for crystallization of Ibuprofen.

  13. Methodological Innovation in Practice-Based Design Doctorates

    Directory of Open Access Journals (Sweden)

    Joyce S. R. Yee

    2010-01-01

    Full Text Available This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD students in design. Four characteristics were found in design PhD methodology: innovations in the format and structure of the thesis, a pick-and-mix approach to research design, situating practice in the inquiry, and the validation of visual analysis. The article concludes by offering suggestions on how research training can be improved. By being aware of recent methodological innovations in the field, design educators will be better informed when developing resources for future design doctoral candidates and assisting supervision teams in developing a more informed and flexible approach to practice-based research.

  14. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    Directory of Open Access Journals (Sweden)

    Boris Alengordovich Korobitsyn

    2015-09-01

Full Text Available A key indicator of the System of National Accounts of Russia at a regional scale is Gross Regional Product, characterizing the value of goods and services produced in all sectors of the economy in a country and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the main weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potential social costs such as poorer health due to exposure to occupational hazards. Several types of alternative approaches to measuring socio-economic progress are considered for six administrative units of the Ural Federal District for the period 2006–2014. Proposed alternatives to GRP as a measure of social progress are focused on natural resource depletion, environmental externalities and some human development aspects. The most promising is the use of corrected macroeconomic indicators similar to the “genuine savings” compiled by the World Bank. Genuine savings are defined in this paper as net savings (gross savings minus consumption of fixed capital) minus the consumption of natural non-renewable resources and the monetary evaluations of damages resulting from air pollution, water pollution and waste disposal. Two main groups of non-renewable resources are considered: energy resources (uranium ore, oil and natural gas) and mineral resources (iron ore, copper, and aluminum). In spite of various shortcomings, this indicator represents a considerable improvement over GRP information. For example, while GRP demonstrates steady growth between 2006 and 2014 for the main Russian oil- and gas-producing regions — Hanty-Mansi and Yamalo-Nenets Autonomous Okrugs, genuine savings for these regions decreased over the whole period. It means that their resource-based economy could not be considered as being on a sustainable path even in the framework of
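
    The genuine-savings definition quoted in the abstract is a straightforward accounting identity; the sketch below spells it out with placeholder figures (not data from the study).

```python
# Genuine savings per the abstract's definition, with placeholder figures (billion RUB).
gross_savings          = 1200.0
fixed_capital_cons     = 450.0   # consumption of fixed capital
energy_depletion       = 300.0   # uranium ore, oil, natural gas
mineral_depletion      = 120.0   # iron ore, copper, aluminium
air_pollution_damage   = 60.0
water_pollution_damage = 25.0
waste_disposal_damage  = 15.0

net_savings = gross_savings - fixed_capital_cons
genuine_savings = (net_savings
                   - energy_depletion - mineral_depletion
                   - air_pollution_damage - water_pollution_damage - waste_disposal_damage)
print(genuine_savings)   # a negative value flags an unsustainable, resource-driven path
```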

  15. Valence-dependent influence of serotonin depletion on model-based choice strategy.

    Science.gov (United States)

    Worbe, Y; Palminteri, S; Savulich, G; Daw, N D; Fernandez-Egea, E; Robbins, T W; Voon, V

    2016-05-01

    Human decision-making arises from both reflective and reflexive mechanisms, which underpin goal-directed and habitual behavioural control. Computationally, these two systems of behavioural control have been described by different learning algorithms, model-based and model-free learning, respectively. Here, we investigated the effect of diminished serotonin (5-hydroxytryptamine) neurotransmission using dietary tryptophan depletion (TD) in healthy volunteers on the performance of a two-stage decision-making task, which allows discrimination between model-free and model-based behavioural strategies. A novel version of the task was used, which not only examined choice balance for monetary reward but also for punishment (monetary loss). TD impaired goal-directed (model-based) behaviour in the reward condition, but promoted it under punishment. This effect on appetitive and aversive goal-directed behaviour is likely mediated by alteration of the average reward representation produced by TD, which is consistent with previous studies. Overall, the major implication of this study is that serotonin differentially affects goal-directed learning as a function of affective valence. These findings are relevant for a further understanding of psychiatric disorders associated with breakdown of goal-directed behavioural control such as obsessive-compulsive disorders or addictions. PMID:25869808
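
    For readers unfamiliar with the two-stage task, the sketch below implements a generic hybrid model-based/model-free learner of the kind commonly fitted to such data; the learning rate, softmax temperature, model-based weight and reward probabilities are illustrative assumptions, not the authors' fitted parameters.

```python
import numpy as np

# Generic hybrid learner for a two-stage task: model-based values use the known
# 70/30 transition structure, model-free values come from TD updates, and choices
# blend the two with weight w.
rng = np.random.default_rng(0)
alpha, beta, w = 0.3, 3.0, 0.6          # learning rate, softmax temperature, MB weight
Q_mf = np.zeros(2)                      # first-stage model-free action values
Q_s2 = np.zeros(2)                      # second-stage state values
P = np.array([[0.7, 0.3],               # P(second-stage state | first-stage action)
              [0.3, 0.7]])
reward_prob = np.array([0.8, 0.2])      # assumed reward probabilities at second stage

for trial in range(1000):
    Q_mb = P @ Q_s2                                   # model-based first-stage values
    Q = w * Q_mb + (1 - w) * Q_mf                     # hybrid valuation
    p_choice = np.exp(beta * Q) / np.exp(beta * Q).sum()
    a = rng.choice(2, p=p_choice)                     # softmax first-stage choice
    s2 = rng.choice(2, p=P[a])                        # common or rare transition
    r = float(rng.random() < reward_prob[s2])         # second-stage outcome
    Q_s2[s2] += alpha * (r - Q_s2[s2])                # second-stage TD update
    Q_mf[a] += alpha * (Q_s2[s2] - Q_mf[a])           # first-stage model-free update

print("model-free:", Q_mf, "model-based:", P @ Q_s2)
```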

  16. A Library-Based Synthesis Methodology for Reversible Logic

    CERN Document Server

    Saeedi, Mehdi; Zamani, Morteza Saheb; 10.1016/j.mejo.2010.02.002

    2010-01-01

    In this paper, a library-based synthesis methodology for reversible circuits is proposed where a reversible specification is considered as a permutation comprising a set of cycles. To this end, a pre-synthesis optimization step is introduced to construct a reversible specification from an irreversible function. In addition, a cycle-based representation model is presented to be used as an intermediate format in the proposed synthesis methodology. The selected intermediate format serves as a focal point for all potential representation models. In order to synthesize a given function, a library containing seven building blocks is used where each building block is a cycle of length less than 6. To synthesize large cycles, we also propose a decomposition algorithm which produces all possible minimal and inequivalent factorizations for a given cycle of length greater than 5. All decompositions contain the maximum number of disjoint cycles. The generated decompositions are used in conjunction with a novel cycle assi...
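
    The cycle-based intermediate format is easy to picture: a reversible specification on n lines is a permutation of the 2^n input patterns. The sketch below (an illustration of the representation, not the authors' code) decomposes such a permutation into its disjoint cycles.

```python
def cycles(perm):
    """Decompose a permutation (perm[i] is the image of input pattern i) into its
    disjoint cycles, the intermediate representation used for library matching."""
    seen, out = set(), []
    for start in range(len(perm)):
        if start in seen:
            continue
        cyc, i = [], start
        while i not in seen:
            seen.add(i)
            cyc.append(i)
            i = perm[i]
        if len(cyc) > 1:          # fixed points need no gates
            out.append(tuple(cyc))
    return out

# A 3-line reversible specification is a permutation of the 8 input patterns.
spec = [1, 0, 3, 2, 4, 6, 7, 5]
print(cycles(spec))               # [(0, 1), (2, 3), (5, 6, 7)]
```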

  17. Measurement-based research: methodology, experiments, and tools

    OpenAIRE

    Papadimitriou, Dimitri; Fàbrega, Lluís; Vila Fumas, Pere; Careglio, Davide; Demeester, Piet

    2012-01-01

    In this paper, we report the results of the workshop organized by the FP7 EULER project on measurement-based research and associated methodology, experiments and tools. This workshop aimed at gathering all Future Internet Research and Experimentation (FIRE) experimental research projects under this thematic. Participants were invited to present the usage of measurement techniques in their experiments, their developments on measurement tools, and their foreseeable needs with respect to new dom...

  18. Simulation Parameters Settings Methodology Proposal Based on Leverage Points

    Science.gov (United States)

    Janošek, Michal; Kocian, Václav

Simulation is one of the most time-consuming phases of complex system design. It is necessary to test our model with a number of parameters of the entire simulation, the control mechanism and the components. We test various scenarios and strategies. In this article we present a methodology proposal for simulation parameter settings based on the leverage points hierarchy developed by Donella H. Meadows to aid the simulation process.

  19. An Ontology Based Methodology for Satellite Data Semantic Interoperability

    Directory of Open Access Journals (Sweden)

    ABBURU, S.

    2015-08-01

Full Text Available Satellite- and ocean-based observing systems consist of various sensors and configurations. These observing systems transmit data in heterogeneous file formats and heterogeneous vocabularies from various data centers. These data centers maintain a centralized data management system that disseminates the observations to various research communities. Currently, different data naming conventions are used by existing observing systems, leading to semantic heterogeneity. In this work, sensor data interoperability and the semantics of the data are addressed through ontologies. The present work provides an effective technical solution to address semantic heterogeneity through semantic technologies, which provide interoperability, the capability to build a knowledge base, and a framework for semantic information retrieval by developing an effective concept vocabulary through domain ontologies. The paper aims at a new methodology to interlink multidisciplinary and heterogeneous sensor data products. A four-phase methodology has been implemented to address satellite data semantic interoperability. The paper concludes with an evaluation of the methodology by linking and interfacing multiple ontologies to arrive at an ontology vocabulary for sensor observations. Data from the Indian meteorological satellite INSAT-3D have been used as a typical example to illustrate the concepts. This work can also be extended along similar lines to other sensor observations.

  20. ONTOLOGY BASED DATA MINING METHODOLOGY FOR DISCRIMINATION PREVENTION

    Directory of Open Access Journals (Sweden)

    Nandana Nagabhushana

    2014-09-01

Full Text Available Data Mining is being increasingly used in the field of automating decision making processes, which involve the extraction and discovery of information hidden in large volumes of collected data. Nonetheless, negative perceptions such as privacy invasion and potential discrimination hinder the use of data mining methodologies in software systems employing automated decision making. Loan granting, employment, insurance premium calculation, admissions in educational institutions, etc., can make use of data mining to effectively prevent human biases pertaining to certain attributes such as gender, nationality and race in critical decision making. The proposed methodology prevents discriminatory rules that ensue due to the presence of certain information regarding sensitive discriminatory attributes in the data itself. Two novel aspects of the proposal are, first, the rule mining technique based on ontologies and, second, the generalization and transformation of the mined rules that are identified as discriminatory into non-discriminatory ones.

  1. Alternative Node Based Energy Depletion and Expected Residual Lifetime Balancing Method for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Anuradha Banerjee

    2013-09-01

Full Text Available A mobile ad hoc network is an infrastructure-less network, where nodes are free to move independently in any direction. The nodes have limited battery power; hence we require efficient balancing techniques (based on energy depletion or expected residual lifetime, whichever is applicable under the specific circumstances) to reduce overload on the nodes, wherever possible, to enhance their lifetime and the network performance. This kind of balance among network nodes increases the average lifetime of nodes and reduces the phenomenon of network partitioning due to excessive exhaustion of nodes. In this paper, we propose an alternative-node based balancing method (ANB) that channels the forwarding load of a node to some other, less exhausted alternative node, provided that the alternative node is capable of handling the extra load. This greatly reduces the number of link breakages and also the number of route requests flooded in the network to repair the broken links. This, in turn, improves the data packet delivery ratio of the underlying routing protocol as well as the average node lifetime.
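
    The abstract does not give the ANB selection rule explicitly, so the sketch below is only a plausible illustration of the idea: hand the extra forwarding load to a less-exhausted neighbour that can absorb it while keeping an energy margin. The margin threshold and the node fields are assumptions, not the paper's formulas.

```python
def pick_alternative(current, neighbours, load_cost, min_margin=0.2):
    """Return a less-exhausted neighbour able to absorb the extra forwarding load
    while keeping a residual-energy margin, or None to keep forwarding locally."""
    candidates = [n for n in neighbours
                  if n["residual_energy"] - load_cost >= min_margin * n["battery_capacity"]]
    if not candidates:
        return None
    best = max(candidates, key=lambda n: n["residual_energy"] / n["battery_capacity"])
    return best if best["residual_energy"] > current["residual_energy"] else None

current = {"id": 3, "residual_energy": 0.9, "battery_capacity": 5.0}
neighbours = [{"id": 7, "residual_energy": 3.1, "battery_capacity": 5.0},
              {"id": 9, "residual_energy": 1.2, "battery_capacity": 5.0}]
print(pick_alternative(current, neighbours, load_cost=0.5))   # node 7 takes the load
```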

  2. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at identifying sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it states a unified model to reveal knowledge-sharing patterns and to compare results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSDs) provide documents and tools on their websites. Rather, this proposal provides a guide to model the inferences gathered from data processing, revealing links between the sources and the recipients of the knowledge being transferred, where the recipient identifies a given source as the main source for new knowledge creation. Some national statistics departments set as the objective of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments, and from this characterization scholars conduct different studies. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiry into the structure that emerges from taking ideas from other organizations to incept innovations. These two sets of actors form a two-mode network: a link between two actors (network nodes) runs from the source of an idea, i.e., an organization or an event organized by an organization that “provides” ideas, to the recipient firm. The resulting demonstrated design satisfies the objective of being a methodological model to identify the sources of knowledge effectively used in innovation.
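
    A minimal sketch of the two-mode (source-to-recipient) network described above, using networkx; the nodes and edges are invented for illustration. Degree in the source mode surfaces the dominant knowledge providers, and the one-mode projection ties together firms that drew on the same source.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Two-mode network: knowledge sources linked to recipient firms that named them
# as the main source of an innovation idea (edges invented for illustration).
B = nx.Graph()
sources = ["UnivA", "SupplierB", "TradeFairC"]
firms = ["Firm1", "Firm2", "Firm3", "Firm4"]
B.add_nodes_from(sources, bipartite=0)
B.add_nodes_from(firms, bipartite=1)
B.add_edges_from([("UnivA", "Firm1"), ("UnivA", "Firm2"),
                  ("SupplierB", "Firm2"), ("SupplierB", "Firm3"),
                  ("TradeFairC", "Firm4")])

# Source-mode degree reveals the dominant knowledge providers.
print(sorted(((s, B.degree(s)) for s in sources), key=lambda t: -t[1]))

# One-mode projection onto firms: firms are tied when they share a source.
firm_net = bipartite.projected_graph(B, firms)
print(firm_net.edges())
```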

  3. A new evaluation methodology for literature-based discovery systems.

    Science.gov (United States)

    Yetisgen-Yildiz, Meliha; Pratt, Wanda

    2009-08-01

    While medical researchers formulate new hypotheses to test, they need to identify connections to their work from other parts of the medical literature. However, the current volume of information has become a great barrier for this task. Recently, many literature-based discovery (LBD) systems have been developed to help researchers identify new knowledge that bridges gaps across distinct sections of the medical literature. Each LBD system uses different methods for mining the connections from text and ranking the identified connections, but none of the currently available LBD evaluation approaches can be used to compare the effectiveness of these methods. In this paper, we present an evaluation methodology for LBD systems that allows comparisons across different systems. We demonstrate the abilities of our evaluation methodology by using it to compare the performance of different correlation-mining and ranking approaches used by existing LBD systems. This evaluation methodology should help other researchers compare approaches, make informed algorithm choices, and ultimately help to improve the performance of LBD systems overall.

  4. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
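
    A minimal Monte Carlo sketch of the core idea of a delay-time distribution rather than a single worst-case sum; the lognormal task-time model and all parameters are assumptions for illustration, not the Bayesian machinery of the report.

```python
import numpy as np

# Total path delay as a distribution: sum of uncertain task times (lognormal here
# as an assumption), summarised by percentiles instead of one worst-case number.
rng = np.random.default_rng(42)
tasks = [            # (median seconds, dispersion) for each barrier/task on the path
    (30.0, 0.4),     # breach outer fence
    (90.0, 0.6),     # defeat door
    (120.0, 0.5),    # penetrate vault wall
]
n = 100_000
total = sum(rng.lognormal(np.log(median), sigma, n) for median, sigma in tasks)

for q in (5, 50, 95):
    print(f"{q}th percentile delay: {np.percentile(total, q):.0f} s")
# A defender can now compare, say, the 5th-percentile delay against response time.
```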

  5. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.

  6. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential on investment markets.
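
    A minimal sketch of the SAX step that would feed such a GA kernel: z-normalise the series, apply piecewise aggregate approximation (PAA), and map segment means to symbols with equiprobable Gaussian breakpoints. The segment count and alphabet size are assumptions.

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments=8, alphabet="abcd"):
    """z-normalise, piecewise-aggregate (PAA), then map segment means to symbols
    using breakpoints that cut N(0,1) into equal-probability regions."""
    x = (series - series.mean()) / series.std()
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    cuts = norm.ppf(np.linspace(0, 1, len(alphabet) + 1)[1:-1])
    return "".join(alphabet[np.searchsorted(cuts, v)] for v in paa)

prices = np.cumsum(np.random.default_rng(0).normal(0, 1, 256)) + 100  # toy price series
print(sax(prices))   # symbolic string that a GA kernel could mine for patterns
```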

  7. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    OpenAIRE

    Boris Alengordovich Korobitsyn

    2015-01-01

A key indicator of the System of National Accounts of Russia at a regional scale is Gross Regional Product, characterizing the value of goods and services produced in all sectors of the economy in a country and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the main weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potent...

  8. Extended Methodology of RS Design and Instances Based on GIP

    Institute of Scientific and Technical Information of China (English)

    Qian-Hong Wu; Bo Qin; Yu-Min Wang

    2005-01-01

Abe et al. proposed the methodology of ring signature (RS) design in 2002 and showed how to construct RS with a mixture of public keys based on factorization and/or discrete logarithms. Their methodology cannot be applied to knowledge signatures (KS) using the Fiat-Shamir heuristic and cut-and-choose techniques, for instance, the Goldreich KS. This paper presents a more general construction of RS from various public keys if there exists a secure signature using such a public key and an efficient algorithm to forge the relation to be checked if the challenges in such a signature are known in advance. The paper shows how to construct RS based on the graph isomorphism problem (GIP). Although it is unknown whether or not GIP is NP-Complete, there are no known arguments that it can be solved even in the quantum computation model. Hence, the scheme has a better security basis and it is plausibly secure against quantum adversaries.

  9. Methodology of citrate-based biomaterial development and application

    Science.gov (United States)

    Tran, M. Richard

Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  10. A response surface methodology based damage identification technique

    International Nuclear Information System (INIS)

    Response surface methodology (RSM) is a combination of statistical and mathematical techniques to represent the relationship between the inputs and outputs of a physical system by explicit functions. This methodology has been widely employed in many applications such as design optimization, response prediction and model validation. But so far the literature related to its application in structural damage identification (SDI) is scarce. Therefore this study attempts to present a systematic SDI procedure comprising four sequential steps of feature selection, parameter screening, primary response surface (RS) modeling and updating, and reference-state RS modeling with SDI realization using the factorial design (FD) and the central composite design (CCD). The last two steps imply the implementation of inverse problems by model updating in which the RS models substitute the FE models. The proposed method was verified against a numerical beam, a tested reinforced concrete (RC) frame and an experimental full-scale bridge with the modal frequency being the output responses. It was found that the proposed RSM-based method performs well in predicting the damage of both numerical and experimental structures having single and multiple damage scenarios. The screening capacity of the FD can provide quantitative estimation of the significance levels of updating parameters. Meanwhile, the second-order polynomial model established by the CCD provides adequate accuracy in expressing the dynamic behavior of a physical system

  11. Electroencephalogram-based methodology for determining unconsciousness during depopulation.

    Science.gov (United States)

    Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L

    2012-12-01

    When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining
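
    The alpha/delta ratio used in the abstract can be computed from a Welch power spectral density; the sketch below does so on a synthetic signal. The sampling rate, band edges and the suggested decision level are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import welch

# Alpha/delta (A/D) ratio from a Welch power spectral density; the signal is synthetic.
fs = 250.0                                   # assumed sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = (20e-6 * np.sin(2 * np.pi * 10 * t)    # alpha component (10 Hz)
       + 10e-6 * np.sin(2 * np.pi * 2 * t)   # delta component (2 Hz)
       + 5e-6 * rng.normal(size=t.size))

f, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))

def band_power(lo, hi):
    mask = (f >= lo) & (f <= hi)
    return psd[mask].sum() * (f[1] - f[0])   # rectangle-rule band power

ad_ratio = band_power(8, 12) / band_power(0.5, 4)
print(f"A/D ratio = {ad_ratio:.2f}")         # a falling ratio reflects rising delta and suppressed alpha
```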

  12. Stratospheric ozone depletion over Antarctica - Role of aerosols based on SAGE II satellite observations

    Science.gov (United States)

    Lin, N.-H.; Saxena, V. K.

    1992-01-01

    The physical characteristics of the Antarctic stratospheric aerosol are investigated via a comprehensive analysis of the SAGE II data during the most severe ozone depletion episode of October 1987. The aerosol size distribution is found to be bimodal in several instances using the randomized minimization search technique, which suggests that the distribution of a single mode may be used to fit the data in the retrieved size range only at the expense of resolution for the larger particles. On average, in the region below 18 km, a wavelike perturbation with the upstream tilting for the parameters of mass loading, total number, and surface area concentration is found to be located just above the region of the most severe ozone depletion.

  13. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    Science.gov (United States)

    Stripling, Hayes Franklin

Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable, they are the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.
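
    A minimal sketch of the adjoint sensitivity idea on a toy depletion system dN/dt = A(p)N: integrate the state forward, the adjoint backward from the quantity-of-interest weighting, accumulate the sensitivity integral, and check against a finite difference. The three-nuclide matrix and parameter are assumptions; the actual framework operates on full transport-coupled depletion.

```python
import numpy as np
from scipy.linalg import expm

def A(p):                                     # toy depletion matrix; p scales a capture rate
    return np.array([[-p,   0.0, 0.0],
                     [ p,  -0.1, 0.0],
                     [0.0,  0.1, 0.0]])

dA_dp = np.array([[-1.0, 0.0, 0.0],
                  [ 1.0, 0.0, 0.0],
                  [ 0.0, 0.0, 0.0]])

p, T, nsteps = 0.5, 5.0, 400
dt = T / nsteps
N0 = np.array([1.0, 0.0, 0.0])
c = np.array([0.0, 0.0, 1.0])                 # QoI = final amount of nuclide 2

# Forward sweep for N(t_k); adjoint sweep: d(lambda)/dt = -A^T lambda, lambda(T) = c.
step_f, step_a = expm(A(p) * dt), expm(A(p).T * dt)
N = [N0]
for _ in range(nsteps):
    N.append(step_f @ N[-1])
lam = [c]
for _ in range(nsteps):
    lam.append(step_a @ lam[-1])
lam = lam[::-1]                               # lam[k] now corresponds to time t_k

# dQoI/dp = integral over [0,T] of lambda^T (dA/dp) N, via the trapezoid rule.
integrand = np.array([lam[k] @ dA_dp @ N[k] for k in range(nsteps + 1)])
weights = np.ones(nsteps + 1); weights[0] = weights[-1] = 0.5
sens_adjoint = dt * weights @ integrand

# Finite-difference check of the same sensitivity.
qoi = lambda pp: c @ (expm(A(pp) * T) @ N0)
eps = 1e-6
sens_fd = (qoi(p + eps) - qoi(p - eps)) / (2 * eps)
print(sens_adjoint, sens_fd)                  # the two estimates should agree closely
```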

  14. Engineering Design Optimization Based on Intelligent Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    SONG Guo-hui; WU Yu; LI Cong-xin

    2008-01-01

An intelligent response surface methodology (IRSM) was proposed to achieve the most competitive metal forming products, in which artificial intelligence technologies are introduced into the optimization process. It is used as a simple and inexpensive replacement for the computationally expensive simulation model. In IRSM, the optimal design space can be reduced greatly without any prior information about the function distribution. Also, by identifying the approximation error region, new design points can be supplemented correspondingly to improve the response surface model effectively. The procedure is iterated until the accuracy reaches the desired threshold value. Thus, the global optimization can be performed on this substitute model. Finally, we present an optimization design example for roll forming of a "U" channel product.
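
    The iterative surrogate idea behind a response surface methodology can be sketched as follows. This is a generic quadratic-surrogate loop with an invented one-dimensional objective standing in for the expensive forming simulation; it does not reproduce the artificial intelligence components of IRSM.

```python
import numpy as np

def expensive_simulation(x):
    """Stand-in for a costly forming simulation (hypothetical objective)."""
    return (x - 1.7) ** 2 + 0.3 * np.sin(5.0 * x)

def fit_quadratic(xs, ys):
    """Least-squares quadratic response surface y ~ a*x^2 + b*x + c."""
    return np.polyfit(xs, ys, 2)

lo, hi = 0.0, 3.0
xs = list(np.linspace(lo, hi, 5))          # initial design points
ys = [expensive_simulation(x) for x in xs]

for it in range(8):
    coeffs = fit_quadratic(np.array(xs), np.array(ys))
    a, b, _ = coeffs
    # Minimiser of the surrogate (fall back to a grid search if it is not convex).
    if a > 0:
        x_new = float(np.clip(-b / (2 * a), lo, hi))
    else:
        grid = np.linspace(lo, hi, 200)
        x_new = float(grid[np.argmin(np.polyval(coeffs, grid))])
    y_new = expensive_simulation(x_new)
    err = abs(np.polyval(coeffs, x_new) - y_new)   # surrogate error at the new point
    xs.append(x_new)
    ys.append(y_new)
    if err < 1e-3:                                 # desired accuracy threshold
        break

best = int(np.argmin(ys))
print("best design found:", round(xs[best], 4), "objective:", round(ys[best], 4))
```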

  15. Nuclear insurance risk assessment using risk-based methodology

    International Nuclear Information System (INIS)

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance

  16. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers. PMID:27483924
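
    The spreadsheet-to-register-description step can be pictured with a small script. The sketch below is a loose illustration: the CSV columns, element names and attribute set are hypothetical and deliberately simplified stand-ins for a real IP-XACT memory map, which has a much richer schema.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet exported as CSV: one row per register field.
SPREADSHEET = """register,offset,field,bits,access,reset
CTRL,0x00,ENABLE,0,RW,0
CTRL,0x00,MODE,2:1,RW,0
STATUS,0x04,READY,0,RO,1
"""

def build_register_tree(csv_text):
    """Group spreadsheet rows by register and emit a simplified XML description."""
    root = ET.Element("registers")   # simplified stand-in for an IP-XACT memory map
    regs = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = regs.get(row["register"])
        if reg is None:
            reg = ET.SubElement(root, "register",
                                name=row["register"], offset=row["offset"])
            regs[row["register"]] = reg
        ET.SubElement(reg, "field",
                      name=row["field"], bits=row["bits"],
                      access=row["access"], reset=row["reset"])
    return root

print(ET.tostring(build_register_tree(SPREADSHEET), encoding="unicode"))
```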

  17. A strategy for selective detection based on interferent depleting and redox cycling using the plane-recessed microdisk array electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Feng [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Yan Jiawei, E-mail: jwyan@xmu.edu.cn [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Lu Miao [Pen-Tung Sah Micro-Nano Technology Research Center, Xiamen University, Xiamen, Fujian 361005 (China); Zhou Yongliang; Yang Yang; Mao Bingwei [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China)

    2011-10-01

Highlights: > A novel strategy based on a combination of interferent depletion and redox cycling is proposed for the plane-recessed microdisk array electrodes. > The strategy breaks the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction. > The electrodes enhance the current signal by redox cycling. > The electrodes can work regardless of the reversibility of the interfering species. - Abstract: The fabrication, characterization and application of the plane-recessed microdisk array electrodes for selective detection are demonstrated. The electrodes, fabricated by lithographic microfabrication technology, are composed of a planar film electrode and a 32 x 32 recessed microdisk array electrode. Different from the commonly used redox cycling operating mode for array configurations such as interdigitated array electrodes, a novel strategy based on a combination of interferent depletion and redox cycling is proposed for electrodes with an appropriate configuration. The planar film electrode (the plane electrode) is used to deplete the interferent in the diffusion layer. The recessed microdisk array electrode (the microdisk array), located within the diffusion layer of the plane electrode, works for detecting the target analyte in the interferent-depleted diffusion layer. In addition, the microdisk array overcomes the disadvantage of the low current signal of a single microelectrode. Moreover, the current signal of a target analyte that undergoes reversible electron transfer can be enhanced due to the redox cycling between the plane electrode and the microdisk array. Based on the above working principle, the plane-recessed microdisk array electrodes overcome the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction, which is a limitation of the single redox cycling operating mode. The advantages of the

  18. Methodology of Mathematical error-Based Tuning Sliding Mode Controller

    Directory of Open Access Journals (Sweden)

    Farzin Piltan

    2012-04-01

Full Text Available Designing a nonlinear controller for second-order nonlinear uncertain dynamical systems is one of the most important and challenging tasks. This paper focuses on the design of a chattering-free mathematical error-based tuning sliding mode controller (MTSMC) for a highly nonlinear robot manipulator in the presence of uncertainties. In order to provide a high-performance nonlinear methodology, a sliding mode controller is selected. A pure sliding mode controller can be used to control robot manipulators whose nonlinear dynamic parameters are only partly known. Although the pure sliding mode controller is used in many applications, it has an important drawback, namely the chattering phenomenon, which can cause problems such as saturation and heating of the mechanical parts of robot manipulators or their drivers. In order to reduce the chattering, this research uses the switching function together with a mathematical error-based tuning method instead of the switching function method of the pure sliding mode controller. The results demonstrate that the sliding mode controller with a switching function is a model-based controller which works well for certain and partly uncertain systems. The pure sliding mode controller has difficulty handling unstructured model uncertainties. To solve this problem, a mathematical model-free tuning method is applied to the sliding mode controller to adjust the sliding surface gain (λ). Since the sliding surface gain (λ) is adjusted by the mathematical model-free tuning method, it is nonlinear and continuous. In this research, the new λ is obtained from the previous λ multiplied by a sliding surface slope updating factor (α). The chattering-free mathematical error-based tuning sliding mode controller is a stable controller which eliminates the chattering phenomenon without using the boundary layer saturation function. Lyapunov stability is proved for the mathematical error-based tuning sliding mode controller with the switching (sign) function. This controller has acceptable performance in
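
    The overall structure of a sliding mode controller with an error-driven surface gain can be illustrated on a toy second-order plant. The sketch below is not the paper's MTSMC law: the plant, the gains and the simple adaptive update of the surface slope λ are all assumptions chosen only to show the sliding-surface-plus-switching-term structure.

```python
import numpy as np

def true_plant_accel(x, xdot, u):
    """Second-order plant with a term the controller does not know exactly."""
    f_unknown = -0.5 * xdot + 2.0 * np.sin(x)
    return f_unknown + u

def simulate(lam0=2.0, K=6.0, alpha=0.05, dt=1e-3, t_end=5.0):
    """Sliding mode regulation of x -> 0 with a simple adaptive surface slope."""
    x, xdot, lam = 1.0, 0.0, lam0
    history = []
    for _ in range(int(t_end / dt)):
        e, edot = x, xdot                 # regulation error and its rate
        s = edot + lam * e                # sliding surface
        u = -lam * edot - K * np.sign(s)  # nominal equivalent control plus switching term
        xddot = true_plant_accel(x, xdot, u)
        x += dt * xdot
        xdot += dt * xddot
        lam += dt * alpha * abs(s)        # crude error-driven update of the surface slope
        history.append((x, s))
    return np.array(history)

hist = simulate()
print("final |x|:", round(abs(hist[-1, 0]), 4), "final |s|:", round(abs(hist[-1, 1]), 4))
```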

  19. A methodology for physically based rockfall hazard assessment

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

Full Text Available Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at regional and local scales, both along linear features and within exposed areas. An objective approach based on three-dimensional matrices providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrices has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to keep the assessment criteria as objective as possible.
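
    The combination of per-cell modelling outputs into a positional hazard index can be sketched with array operations. The grids, class boundaries and weighting below are invented for illustration and are a simplified stand-in for the three-dimensional matrix approach described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (5, 5)                                # toy DEM-sized grids (one value per model cell)
frequency = rng.integers(0, 50, shape)        # number of block transits per cell
velocity = rng.uniform(0.0, 25.0, shape)      # maximum velocity per cell (m/s)
height = rng.uniform(0.0, 4.0, shape)         # maximum flying height per cell (m)

def classify(values, edges):
    """Map each cell to class 1..len(edges)+1 using the supplied class boundaries."""
    return np.digitize(values, edges) + 1

# Illustrative class boundaries (low/medium/high); the paper's own classes differ.
freq_class = classify(frequency, [5, 20])
vel_class = classify(velocity, [5.0, 15.0])
hgt_class = classify(height, [1.0, 2.5])

# Positional hazard index: sum the three class values, then rescale to 1..3.
raw_index = freq_class + vel_class + hgt_class          # ranges from 3 to 9
hazard_index = np.digitize(raw_index, [4, 7]) + 1       # 1 = low, 2 = medium, 3 = high
print(hazard_index)
```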

  20. Materialized View Selection Approach Using Tree Based Methodology

    Directory of Open Access Journals (Sweden)

    MR. P. P. KARDE

    2010-10-01

Full Text Available In large databases, particularly in distributed databases, query response time plays an important role, as timely access to information is the basic requirement of a successful business application. A data warehouse uses multiple materialized views to efficiently process a given set of queries. Quick response time and accuracy are important factors in the success of any database. The materialization of all views is not possible because of the space constraint and the maintenance cost constraint. Selection of materialized views is one of the most important decisions in designing a data warehouse for optimal efficiency. Selecting a suitable set of views that minimizes the total cost associated with the materialized views is the key component in data warehousing. Materialized views are found to be very useful for fast query processing. This paper gives the results of the proposed tree-based materialized view selection algorithm for query processing. In a distributed environment, the selection of the nodes over which the database is distributed and on which queries should be executed also plays an important role; this paper therefore also proposes a node selection algorithm for fast materialized view selection in a distributed environment. Finally, it is found that the proposed methodology performs better for query processing than other materialized view selection strategies.
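
    The core of materialized view selection under a space constraint can be illustrated with a simple greedy heuristic. The candidate views, costs and benefits below are invented, and the greedy benefit-per-space rule is a generic baseline rather than the tree-based algorithm proposed in the paper.

```python
# Each candidate view: (name, space_cost, query_time_saved_per_period).
CANDIDATES = [
    ("sales_by_region", 40, 900),
    ("sales_by_month", 25, 500),
    ("top_customers", 10, 300),
    ("inventory_daily", 30, 350),
    ("returns_by_sku", 15, 120),
]
SPACE_BUDGET = 60

def greedy_view_selection(candidates, budget):
    """Repeatedly pick the view with the best benefit-to-space ratio that still fits."""
    chosen, used = [], 0
    remaining = list(candidates)
    while remaining:
        remaining.sort(key=lambda v: v[2] / v[1], reverse=True)
        for view in remaining:
            if used + view[1] <= budget:
                chosen.append(view[0])
                used += view[1]
                remaining.remove(view)
                break
        else:
            break   # nothing else fits within the space budget
    return chosen, used

views, space = greedy_view_selection(CANDIDATES, SPACE_BUDGET)
print("materialize:", views, "space used:", space)
```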

  1. Online monitoring of immunoaffinity-based depletion of high-abundance blood proteins by UV spectrophotometry using enhanced green fluorescence protein and FITC-labeled human serum albumin

    Directory of Open Access Journals (Sweden)

    Yu Hyeong

    2010-12-01

    Full Text Available Abstract Background The removal of high-abundance proteins from plasma is an efficient approach to investigating flow-through proteins for biomarker discovery studies. Most depletion methods are based on multiple immunoaffinity methods available commercially including LC columns and spin columns. Despite its usefulness, high-abundance depletion has an intrinsic problem, the sponge effect, which should be assessed during depletion experiments. Concurrently, the yield of depletion of high-abundance proteins must be monitored during the use of the depletion column. To date, there is no reasonable technique for measuring the recovery of flow-through proteins after depletion and assessing the capacity for capture of high-abundance proteins. Results In this study, we developed a method of measuring recovery yields of a multiple affinity removal system column easily and rapidly using enhanced green fluorescence protein as an indicator of flow-through proteins. Also, we monitored the capture efficiency through depletion of a high-abundance protein, albumin labeled with fluorescein isothiocyanate. Conclusion This simple method can be applied easily to common high-abundance protein depletion methods, effectively reducing experimental variations in biomarker discovery studies.

  2. Lean methodology: an evidence-based practice approach for healthcare improvement.

    Science.gov (United States)

    Johnson, Pauline M; Patterson, Claire J; OʼConnell, Mary P

    2013-12-10

    Lean methodology, an evidence-based practice approach adopted from Toyota, is grounded on the pillars of respect for people and continuous improvement. This article describes the use of Lean methodology to improve healthcare outcomes for patients with community-acquired pneumonia. Nurse practitioners and other clinicians should be knowledgeable about this methodology and become leaders in Lean transformation.

  3. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    OpenAIRE

    Alexandre Tadeu Simon; Luiz Carlos Di Serio; Silvio Roberto Ignacio Pires; Guilherme Silveira Martins

    2015-01-01

    Despite the increasing interest in supply chain management (SCM) by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert an...

  4. Base cation depletion, eutrophication and acidification of species-rich grasslands in response to long-term simulated nitrogen deposition

    International Nuclear Information System (INIS)

Pollutant nitrogen deposition effects on soil and foliar element concentrations were investigated in acidic and limestone grasslands, located in one of the most nitrogen and acid rain polluted regions of the UK, using plots treated for 8-10 years with 35-140 kg N ha-1 y-1 as NH4NO3. Historic data suggests both grasslands have acidified over the past 50 years. Nitrogen deposition treatments caused the grassland soils to lose 23-35% of their total available bases (Ca, Mg, K, and Na) and they became acidified by 0.2-0.4 pH units. Aluminium, iron and manganese were mobilised and taken up by limestone grassland forbs and were translocated down the acid grassland soil. Mineral nitrogen availability increased in both grasslands and many species showed foliar N enrichment. This study provides the first definitive evidence that nitrogen deposition depletes base cations from grassland soils. The resulting acidification, metal mobilisation and eutrophication are implicated in driving floristic changes. - Nitrogen deposition causes base cation depletion, acidification and eutrophication of semi-natural grassland soils

  5. Development of Proliferation Resistance Assessment Methodology based on International Standard

    International Nuclear Information System (INIS)

Proliferation resistance is one of the requirements to be met by GEN IV and INPRO next-generation nuclear energy systems. Internationally, work on PR evaluation methodology had already begun by 1980, but systematic development started in the 2000s. In Korea, to support the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a national long-term nuclear R and D project, and the work is being performed on the PR evaluation model. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators, evaluation model development, and analysis of the technology system and international technology development trends were performed. In the second and third years, a feasibility study of the indicators, allowable limits for the indicators, a review of the technical requirements of the indicators, technical standards, and the design of the evaluation model were completed. The results of the PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through this development of the PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission that are to be developed

  6. Black Box Chimera Check (B2C2): a Windows-Based Software for Batch Depletion of Chimeras from Bacterial 16S rRNA Gene Datasets.

    Science.gov (United States)

    Gontcharova, Viktoria; Youn, Eunseog; Wolcott, Randall D; Hollister, Emily B; Gentry, Terry J; Dowd, Scot E

    2010-01-01

The existing chimera detection programs are not specifically designed for "next generation" sequence data. Technologies like Roche 454 FLX and Titanium have been adapted over the past years, especially with the introduction of bacterial tag-encoded FLX/Titanium amplicon pyrosequencing methodologies, to produce over one million 250-600 bp 16S rRNA gene reads that need to be depleted of chimeras prior to downstream analysis. Meeting the needs of basic scientists who are venturing into high-throughput microbial diversity studies such as those based upon pyrosequencing, and specifically providing a solution for Windows users, the B2C2 software is designed to accept files containing large multi-FASTA formatted sequences and screen them for possible chimeras in a high-throughput fashion. The graphical user interface (GUI) is also able to batch process multiple files. When compared to popular chimera screening software, B2C2 performed as well or better while dramatically decreasing the time required to generate and screen results. Even average computer users are able to interact with the Windows .Net GUI-based application and define the stringency to which the analysis should be done. B2C2 may be downloaded from http://www.researchandtesting.com/B2C2. PMID:21339894

  7. Nanoscale Field Effect Optical Modulators Based on Depletion of Epsilon-Near-Zero Films

    CERN Document Server

    Lu, Zhaolin; Shi, Kaifeng

    2015-01-01

    The field effect in metal-oxide-semiconductor (MOS) capacitors plays a key role in field-effect transistors (FETs), which are the fundamental building blocks of modern digital integrated circuits. Recent works show that the field effect can also be used to make optical/plasmonic modulators. In this paper, we report field effect electro-absorption modulators (FEOMs) each made of an ultrathin epsilon-near-zero (ENZ) film, as the active material, sandwiched in a silicon or plasmonic waveguide. Without a bias, the ENZ film maximizes the attenuation of the waveguides and the modulators work at the OFF state; contrariwise, depletion of the carriers in the ENZ film greatly reduces the attenuation and the modulators work at the ON state. The double capacitor gating scheme is used to enhance the modulation by the field effect. According to our simulation, extinction ratio up to 3.44 dB can be achieved in a 500-nm long Si waveguide with insertion loss only 0.71 dB (85.0%); extinction ratio up to 7.86 dB can be achieved...

  8. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

Full Text Available The information system (IS) change management and governance, according to the best practices, are defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT, etc.). These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control that integrate the following significant resource management areas: information technology (IT) governance, change management and enterprise architecture (EA) change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on the re-use and fusion of principles used by related methodologies as well as on empirical observations about typical IS change management mistakes in enterprises.

  9. Depletion Calculations Based on Perturbations. Application to the Study of a Rep-Like Assembly at Beginning of Cycle with TRIPOLI-4®.

    Science.gov (United States)

    Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh

    2014-06-01

For several years, Monte Carlo burnup/depletion codes have appeared which couple Monte Carlo codes, to simulate the neutron transport, with deterministic methods, which handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in such a way allows one to track fine 3-dimensional effects and to get rid of the multi-group hypotheses made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid the repetitive and time-expensive Monte Carlo simulations and to replace them by perturbation calculations: indeed, the different burnup steps may be seen as perturbations of the isotopic concentration of an initial Monte Carlo simulation. We first present this method and provide details on the perturbative technique used, namely correlated sampling. We then discuss the implementation of this method in the TRIPOLI-4® code, as well as the precise calculation scheme able to bring an important speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a REP-like assembly, studied at the beginning of its cycle. After validating the method with a reference calculation, we show that it can speed up standard Monte Carlo depletion codes by nearly an order of magnitude.
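
    The correlated sampling idea can be illustrated on a toy attenuation problem: the transmission of particles through a slab is estimated once with histories sampled under the nominal cross section, and the estimate for a perturbed ("depleted") cross section is obtained by re-weighting the same histories instead of rerunning the simulation. The cross sections and slab thickness below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(3)
L = 2.0                 # slab thickness (in the same units as the mean free path)
sigma0 = 1.0            # nominal total macroscopic cross section
sigma1 = 0.9            # "depleted" cross section (the perturbed case)
N = 200_000

# Sample free paths once, under the NOMINAL cross section.
paths = rng.exponential(1.0 / sigma0, N)
transmitted = paths > L

# Analog Monte Carlo estimate for the nominal case.
t0 = transmitted.mean()

# Correlated-sampling estimate for the perturbed case: reuse the same histories,
# re-weighted by the likelihood ratio of the perturbed to the nominal path density.
weights = (sigma1 / sigma0) * np.exp(-(sigma1 - sigma0) * paths)
t1_correlated = np.mean(transmitted * weights)

print("nominal MC:", round(t0, 4), " exact:", round(np.exp(-sigma0 * L), 4))
print("perturbed (correlated):", round(t1_correlated, 4),
      " exact:", round(np.exp(-sigma1 * L), 4))
```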

  10. Methodological Innovation in Practice-Based Design Doctorates

    Science.gov (United States)

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  11. Analytical base-collector depletion capacitance in vertical SiGe heterojunction bipolar transistors fabricated on CMOS-compatible silicon on insulator

    Institute of Scientific and Technical Information of China (English)

    Xu Xiao-Bo; Zhang He-Ming; Hu Hui-Yong; Ma Jian-Li; Xu Li-Jun

    2011-01-01

The base-collector depletion capacitance for vertical SiGe npn heterojunction bipolar transistors (HBTs) on silicon on insulator (SOI) is split into vertical and lateral parts. This paper proposes a novel analytical depletion capacitance model of this structure for the first time. A large discrepancy is predicted when the present model is compared with the conventional depletion model, and it is shown that the capacitance decreases with the increase of the reverse collector-base bias and shows a kink as the reverse collector-base bias reaches the effective vertical punch-through voltage, while this voltage differs with the collector doping concentrations, which is consistent with measurement results. The model can be employed for a fast evaluation of the depletion capacitance of an SOI SiGe HBT and has useful applications in the design and simulation of high performance SiGe circuits and devices.

  12. Analytical base-collector depletion capacitance in vertical SiGe heterojunction bipolar transistors fabricated on CMOS-compatible silicon on insulator

    Science.gov (United States)

    Xu, Xiao-Bo; Zhang, He-Ming; Hu, Hui-Yong; Ma, Jian-Li; Xu, Li-Jun

    2011-01-01

The base-collector depletion capacitance for vertical SiGe npn heterojunction bipolar transistors (HBTs) on silicon on insulator (SOI) is split into vertical and lateral parts. This paper proposes a novel analytical depletion capacitance model of this structure for the first time. A large discrepancy is predicted when the present model is compared with the conventional depletion model, and it is shown that the capacitance decreases with the increase of the reverse collector-base bias and shows a kink as the reverse collector-base bias reaches the effective vertical punch-through voltage, while this voltage differs with the collector doping concentrations, which is consistent with measurement results. The model can be employed for a fast evaluation of the depletion capacitance of an SOI SiGe HBT and has useful applications in the design and simulation of high performance SiGe circuits and devices.
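
    A one-dimensional sketch of the vertical component of such a depletion capacitance is given below: an abrupt-junction depletion width is clipped at the collector layer thickness on SOI, which reproduces the kink at the vertical punch-through voltage. The doping, layer thickness and built-in voltage are illustrative values, and the paper's combined vertical-plus-lateral model is not reproduced.

```python
import numpy as np

Q = 1.602e-19                 # elementary charge (C)
EPS_SI = 11.7 * 8.854e-12     # silicon permittivity (F/m)

def vertical_depletion_capacitance(v_reverse, n_collector, t_collector, v_bi=0.7):
    """Per-area base-collector capacitance of a one-sided abrupt junction,
    with the depletion width clipped at the SOI collector thickness."""
    w = np.sqrt(2.0 * EPS_SI * (v_bi + v_reverse) / (Q * n_collector))
    w = np.minimum(w, t_collector)      # vertical punch-through onto the buried oxide
    return EPS_SI / w                   # capacitance per unit area (F/m^2)

# Illustrative numbers (not taken from the paper): 1e17 cm^-3 collector, 300 nm layer.
n_c = 1e17 * 1e6                        # convert cm^-3 to m^-3
t_c = 300e-9
for v in [0.0, 1.0, 2.0, 4.0, 8.0, 10.0]:
    c = vertical_depletion_capacitance(v, n_c, t_c)
    print(f"V_CB = {v:4.1f} V  ->  C_j = {c * 1e5:6.1f} nF/cm^2")
```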

  13. An infrared image based methodology for breast lesions screening

    Science.gov (United States)

    Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.

    2016-05-01

The objective of this paper is to evaluate the potential of utilizing a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the infrared camera direct temperature measurement imprecision, by calculating the temperature difference between two points to cancel out the error. The method takes into account the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values (Δθ̄) between two symmetric regions of the patient's breast; the dimensionless form accounts for the breast region, the surrounding ambient and the individual core temperatures, so that the interpretation of results for different individuals becomes simple and non-subjective. The range of normal whole-breast average dimensionless temperature differences for 101 healthy individuals was determined, and, assuming that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was taken to be the mean dimensionless temperature difference plus/minus twice the standard deviation of the measurements, Δθ̄ ± 2σ(Δθ̄), in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective
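
    Since the abstract does not give the full definition of the dimensionless temperature, the sketch below assumes the form θ = (T_skin - T_ambient)/(T_core - T_ambient) and flags a region when the left-right difference falls outside a healthy band of mean ± 2 standard deviations built from control data; all temperatures and control statistics are invented for illustration.

```python
import numpy as np

def dimensionless_theta(t_skin, t_ambient, t_core):
    """Assumed form of the dimensionless temperature used for each breast region."""
    return (t_skin - t_ambient) / (t_core - t_ambient)

def delta_theta(t_left, t_right, t_ambient, t_core):
    """Dimensionless temperature difference between symmetric regions.
    Differencing cancels the camera's roughly constant offset error."""
    return (dimensionless_theta(t_left, t_ambient, t_core)
            - dimensionless_theta(t_right, t_ambient, t_core))

# Healthy-control band (illustrative numbers, not the study's): mean +/- 2*std.
rng = np.random.default_rng(4)
controls = rng.normal(0.0, 0.01, 101)         # delta-theta for 101 healthy subjects
band = (controls.mean() - 2 * controls.std(), controls.mean() + 2 * controls.std())

# Screening decision for one patient and one region pair.
patient = delta_theta(t_left=34.8, t_right=33.9, t_ambient=23.0, t_core=36.8)
flagged = not (band[0] <= patient <= band[1])
print("delta-theta:", round(patient, 4), "healthy band:", np.round(band, 4), "flag:", flagged)
```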

  14. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

The electroplating industry of over 10,000 plating plants nationwide is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. There is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: the heuristic knowledge-based qualitative WM decision analysis and support methodology and fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  15. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren;

The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400 Å) of polystyrene as cushions between a crystalline ... with an AFM (2). The intuitive explanation for the depletion is based on a "hydrophobic mismatch" between the obviously hydrophilic bulk phase of water and the neighbouring hydrophobic polymer. It would thus be an intrinsic property of all interfaces between non-matching materials. The detailed physical interaction path ...

  16. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren;

The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400 Å) of polystyrene as cushions between a crystalline ... with an AFM (2). The intuitive explanation for the depletion is based on a "hydrophobic mismatch" between the obviously hydrophilic bulk phase of water and the neighbouring hydrophobic polymer. It would thus be an intrinsic property of all interfaces between non-matching materials. The detailed physical interaction path ...

  17. AONBench: A Methodology for Benchmarking XML Based Service Oriented Applications

    Directory of Open Access Journals (Sweden)

    Abdul Waheed

    2007-09-01

Full Text Available Service Oriented Architectures (SOA) and applications increasingly rely on network infrastructure instead of back-end servers. Cisco Systems' Application Oriented Networking (AON) initiative exemplifies this trend. Benchmarking such infrastructure and its services is expected to play an important role in the networking industry. We present the AONBench specifications and methodology to benchmark networked XML application servers and appliances. AONBench is not a benchmarking tool. It is a specification and methodology for performance measurements, which leverages existing XML microbenchmarks and uses HTTP for end-to-end communication. We implement the AONBench specifications for end-to-end performance measurements through the public domain HTTP load generation tool ApacheBench and the Apache web server. We present three case studies of using AONBench for architecting real application oriented networking products.

  18. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    DR. NACHAMAI. M

    2011-11-01

Full Text Available Software engineering provides the procedures and practices to be followed in software development and acts as a backbone for computer science engineering techniques. This paper deals with current trends in software engineering methodologies: the agile and agent-oriented software development processes. The agile methodology aims to meet the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. Agent-oriented software is a rapidly developing area of research; software agents are an innovative technology designed to support the development of complex, distributed, and heterogeneous information systems. The paper weighs the agile and agent-oriented software development processes against the factors of architectural design, applicability, project duration, customer interaction level, team collaboration, documentation, and software models.

  19. Theoretical and Methodological Bases of Accounting Product Formation

    OpenAIRE

    Mykhaylo Prodanchuk

    2015-01-01

The article is devoted to deepening the theoretical and methodological provisions of accounting by determining the economic essence of the concept of 'accounting product' and formulating its main characteristics and properties. The place of accounting in the information system of a company is defined. It is shown that the accounting system is a major source of accounting information, turning raw data recorded in documents into a qualitative, informative product and adapting all the di...

  20. Cooperative Decision Making : a methodology based on collective preferences aggregation

    OpenAIRE

    Sibertin-Blanc, Christophe; ZARATÉ, Pascale

    2014-01-01

The benefit of a collective decision process mainly rests upon the possibility for the participants to confront their respective points of view. To this end, they must have cognitive and technical tools that ease the sharing of the reasons that motivate their own preferences, while accounting for information and feelings they should keep to themselves. The paper presents the basis of such a cooperative decision making methodology that allows sharing information by accurately distinguishing...

  1. Some Methodological Considerations in Theory-Based Health Behavior Research

    OpenAIRE

    Collins, Linda M.; MacKinnon, David P.; Reeve, Bryce B.

    2013-01-01

    As this special issue shows, much research in social and personality psychology is directly relevant to health psychology. In this brief commentary, we discuss three topics in research methodology that may be of interest to investigators involved in health-related psychological research. The first topic is statistical analysis of mediated and moderated effects. The second is measurement of latent constructs. The third is the Multiphase Optimization Strategy, a framework for translation of inn...

  2. Auto-reactive T cells revised. Overestimation based on methodology?

    DEFF Research Database (Denmark)

    Thorlacius-Ussing, Gorm; Sørensen, Jesper F; Wandall, Hans H;

    2015-01-01

    loaded with E. coli produced recombinant protein or unmodified synthetic HLA binding peptides. Our concern is that this approach may ignore the presence of natural genetic variation and post-translational modifications such as e.g. the complex nature of N- and O-linked glycosylation of mammalian proteins...... methodology applied to document T cell reactivity against unmodified protein or peptide may lead to overinterpretation of the reported frequencies of autoreactive CD4+ and CD8+ T cells....

  3. Eutrophication of mangroves linked to depletion of foliar and soil base cations.

    Science.gov (United States)

    Fauzi, Anas; Skidmore, Andrew K; Heitkönig, Ignas M A; van Gils, Hein; Schlerf, Martin

    2014-12-01

    There is growing concern that increasing eutrophication causes degradation of coastal ecosystems. Studies in terrestrial ecosystems have shown that increasing the concentration of nitrogen in soils contributes to the acidification process, which leads to leaching of base cations. To test the effects of eutrophication on the availability of base cations in mangroves, we compared paired leaf and soil nutrient levels sampled in Nypa fruticans and Rhizophora spp. on a severely disturbed, i.e. nutrient loaded, site (Mahakam delta) with samples from an undisturbed, near-pristine site (Berau delta) in East Kalimantan, Indonesia. The findings indicate that under pristine conditions, the availability of base cations in mangrove soils is determined largely by salinity. Anthropogenic disturbances on the Mahakam site have resulted in eutrophication, which is related to lower levels of foliar and soil base cations. Path analysis suggests that increasing soil nitrogen reduces soil pH, which in turn reduces the levels of foliar and soil base cations in mangroves.

  4. A Nuclear Reactor Transient Methodology Based on Discrete Ordinates Method

    Directory of Open Access Journals (Sweden)

    Shun Zhang

    2014-01-01

Full Text Available With the rapid development of the nuclear power industry, simulating and analyzing reactor transients are of great significance for nuclear safety. The traditional diffusion theory is not suitable for small-volume or strongly absorbing problems. In this paper, we have studied the application of the discrete ordinates method in the numerical solution of the space-time kinetics equations. Fully implicit time integration was applied and the precursor equations were solved by an analytical method. In order to improve the efficiency of the transport theory, we also adopted some advanced acceleration methods. Numerical results of the TWIGL benchmark problem presented here demonstrate the accuracy and efficiency of this methodology.
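
    The combination described above, fully implicit time integration for the flux with analytically solved precursor equations, can be shown on point kinetics with a single delayed-neutron group. The sketch below is not the space-time discrete ordinates solver; the kinetics parameters and the inserted reactivity are illustrative values.

```python
import numpy as np

BETA = 0.0065      # delayed-neutron fraction
LAMBDA = 0.08      # precursor decay constant (1/s)
GEN_TIME = 1e-4    # neutron generation time (s)

def step(n, c, rho, dt):
    """One fully implicit step for the flux, with the precursor equation solved
    analytically over the step (precursor source frozen at the end-of-step flux)."""
    decay = np.exp(-LAMBDA * dt)
    denom = 1.0 - dt * ((rho - BETA) / GEN_TIME + (BETA / GEN_TIME) * (1.0 - decay))
    n_new = (n + dt * LAMBDA * c * decay) / denom
    c_new = c * decay + (BETA / GEN_TIME) * n_new * (1.0 - decay) / LAMBDA
    return n_new, c_new

# Start critical at unit power, then insert +0.5 $ of reactivity at t = 0.
n, c = 1.0, BETA / (GEN_TIME * LAMBDA)       # equilibrium precursor concentration
rho = 0.5 * BETA
dt, t_end = 1e-3, 2.0
for _ in range(int(t_end / dt)):
    n, c = step(n, c, rho, dt)
print("relative power after 2 s:", round(n, 3))
```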

  5. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    OpenAIRE

    ADA ZHENG; YAN ZHOU

    2011-01-01

    We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: i) Each complex question was decomposed into a set of coherent finer subquestions by following the carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. ii) Facilitat...

  6. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Science.gov (United States)

    Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai

    2011-01-01

A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their criticality in electronics subsystems, they are a good candidate for component level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stress. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating the Kalman filter based prognostic methods typically used for remaining useful life prediction in other applications.
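
    A minimal version of the Kalman-filter-based prognostic idea is sketched below: a linear-Gaussian filter tracks capacitance and its degradation rate from noisy measurements of an assumed exponential-like decay, and the remaining useful life is obtained by extrapolating to a failure threshold. The degradation model, noise levels and threshold are assumptions, not the paper's empirical model.

```python
import numpy as np

rng = np.random.default_rng(5)
dt = 10.0                      # hours between capacitance measurements (assumed cadence)
C0, C_FAIL = 100.0, 80.0       # capacitance in % of nominal; end of life at a 20% drop
k_true = 1e-4                  # true (unknown to the filter) degradation rate constant (1/h)

# Linear-Gaussian model: state x = [capacitance, degradation rate per hour].
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = np.diag([1e-4, 1e-6])      # process noise covariance
R = np.array([[0.04]])         # measurement noise variance (0.2 % std)

x = np.array([C0, 0.0])        # initial state estimate
P = np.diag([1.0, 1e-3])       # initial state covariance

for step_idx in range(1, 201):
    t = step_idx * dt
    z = C0 * np.exp(-k_true * t) + rng.normal(0.0, 0.2)   # noisy capacitance measurement
    x = F @ x                                             # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                                         # update with the new measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

cap_hat, rate_hat = x
rul_hours = (C_FAIL - cap_hat) / rate_hat if rate_hat < 0 else float("inf")
print(f"capacitance estimate: {cap_hat:.2f} %, degradation rate: {rate_hat:.5f} %/h")
print(f"predicted remaining useful life: {rul_hours:.0f} hours")
```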

  7. Eutrophication of mangroves linked to depletion of foliar and soil base cations

    NARCIS (Netherlands)

    Fauzi, A.; Skidmore, A.K.; Heitkonig, I.M.A.; Gils, van H.; Schlerf, M.

    2014-01-01

    There is growing concern that increasing eutrophication causes degradation of coastal ecosystems. Studies in terrestrial ecosystems have shown that increasing the concentration of nitrogen in soils contributes to the acidification process, which leads to leaching of base cations. To test the effects

  8. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  9. Legal and methodological bases of comprehensive forensic enquiry of pornography

    Directory of Open Access Journals (Sweden)

    Berdnikov D.V.

    2016-03-01

Full Text Available The article gives an analysis of the legal definition of pornography. The author identifies the groups of descriptive and target criteria required for the analysis, and analyses the content of the descriptive criteria of pornography and the way they should be documented. Fixing attention on the anatomical and physiological characteristics of sexual relations is determined to be the necessary target criterion. It is noted that the term "pornography" is a legal one and cannot itself be the subject of expertise. That is why the author outlines the methodological basis of a complex psycho-linguistic and psycho-art expertise. The article presents the general issues on which an expert conclusion depends and describes cases where the research requires the involvement of doctors, as well as criteria for the expert's opinion. Besides that, the author defines the subject, object and main tasks of psychological studies of pornographic information.

  10. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    Science.gov (United States)

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  11. Halogen activation and ozone depletion events as measured from space and ground-based DOAS measurements during Spring 2009

    Energy Technology Data Exchange (ETDEWEB)

    Sihler, Holger [Institute of Environmental Physics, University of Heidelberg (Germany); Max-Planck-Institute for Chemistry, Mainz (Germany); Friess, Udo; Platt, Ulrich [Institute of Environmental Physics, University of Heidelberg (Germany); Wagner, Thomas [Max-Planck-Institute for Chemistry, Mainz (Germany)

    2010-07-01

    Bromine monoxide (BrO) radicals are known to play an important role in the chemistry of the springtime polar troposphere. Their release by halogen activation processes leads to the almost complete destruction of near-surface ozone during ozone depletion events ODEs. In order to improve our understanding of the halogen activation processes in three dimensions, we combine active and passive ground-based and satellite-borne measurements of BrO radicals. While satellites can not resolve the vertical distribution and have rather coarse horizontal resolution, they may provide information on the large-scale horizontal distribution. Information on the spatial variability within a satellite pixel may be derived from our combined ground-based instrumentation. Simultaneous passive multi-axis differential optical absorption spectroscopy (MAX-DOAS) and active long-path DOAS (LP-DOAS) measurements were conducted during the jointly organised OASIS campaign in Barrow, Alaska during Spring 2009 within the scope of the International Polar Year (IPY). Ground-based measurements are compared to BrO column densities measured by GOME-2 in order to find a conclusive picture of the spatial pattern of bromine activation.

  12. The effect of acute tyrosine phenylalanine depletion on emotion-based decision-making in healthy adults.

    Science.gov (United States)

    Vrshek-Schallhorn, Suzanne; Wahlstrom, Dustin; White, Tonya; Luciana, Monica

    2013-04-01

    Despite interest in dopamine's role in emotion-based decision-making, few reports of the effects of dopamine manipulations are available in this area in humans. This study investigates dopamine's role in emotion-based decision-making through a common measure of this construct, the Iowa Gambling Task (IGT), using Acute Tyrosine Phenylalanine Depletion (ATPD). In a between-subjects design, 40 healthy adults were randomized to receive either an ATPD beverage or a balanced amino acid beverage (a control) prior to completing the IGT, as well as pre- and post-manipulation blood draws for the neurohormone prolactin. Together with conventional IGT performance metrics, choice selections and response latencies were examined separately for good and bad choices before and after several key punishment events. Changes in response latencies were also used to predict total task performance. Prolactin levels increased significantly in the ATPD group but not in the control group. However, no significant group differences in performance metrics were detected, nor were there sex differences in outcome measures. However, the balanced group's bad deck latencies speeded up across the task, while the ATPD group's latencies remained adaptively hesitant. Additionally, modulation of latencies to the bad decks predicted total score for the ATPD group only. One interpretation is that ATPD subtly attenuated reward salience and altered the approach by which individuals achieved successful performance, without resulting in frank group differences in task performance.

  13. Drag &Drop, Mixed-Methodology-based Lab-on-Chip Design Optimization Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective is to develop a ?mixed-methodology?, drag and drop, component library (fluidic-lego)-based, system design and optimization tool for complex...

  14. A methodology for capturing and analyzing data from technology base seminar wargames.

    OpenAIRE

    Miles, Jeffrey T.

    1991-01-01

    Approved for public release; distribution is unlimited This thesis provides a structured methodology for obtaining, evaluating, and portraying to a decision maker, the opinions of players of Technology Base Seminar Wargames (TBSW). The thesis then demonstrates the methodology by applying the events of the Fire Support Technology Base Seminar Wargame held in May 1991. Specifically, the evaluation team developed six surveys, each survey capturing opinions using the categorical...

  15. Design Based Research Methodology for Teaching with Technology in English

    Science.gov (United States)

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  16. A methodology to support multidisciplinary model-based water management

    NARCIS (Netherlands)

    Scholten, H.; Kassahun, A.; Refsgaard, J.C.; Kargas, Th.; Gavardinas, C.; Beulens, A.J.M.

    2007-01-01

    Quality assurance in model based water management is needed because of some frequently perceived shortcomings, e.g. a lack of mutual understanding between modelling team members, malpractice and a tendency of modellers to oversell model capabilities. Initiatives to support quality assurance focus on

  17. Knowledge-based methodology in pattern recognition and understanding

    OpenAIRE

    Haton, Jean-Paul

    1987-01-01

The interpretation and understanding of complex patterns (e.g. speech, images or other kinds of mono- or multi-dimensional signals) is related both to pattern recognition and artificial intelligence, since it necessitates numerical processing as well as symbolic knowledge-based reasoning techniques. This paper presents a state-of-the-art review of the field, including basic concepts and practical applications.

  18. Simulation Based Performance Assessment : Methodology and case study

    NARCIS (Netherlands)

    Kraker, J.K. de; Noordkamp, H.W.; Fitski, H.J.; Barros, A.I

    2009-01-01

During the various phases of the Defense acquisition process, and in the early design phases, many decisions must be made concerning the performance and cost of the new equipment. Often many of these decisions are made while having only a limited view of their consequences or based on subjective information.

  19. Unbiased Selective Isolation of Protein N-Terminal Peptides from Complex Proteome Samples Using Phospho Tagging PTAG) and TiO2-based Depletion

    NARCIS (Netherlands)

    Mommen, G.P.M.; Waterbeemd, van de B.; Meiring, H.D.; Kersten, G.; Heck, A.J.R.; Jong, de A.P.J.M.

    2012-01-01

A positional proteomics strategy for global N-proteome analysis is presented based on phospho tagging (PTAG) of internal peptides followed by depletion by titanium dioxide (TiO2) affinity chromatography. To this end, N-terminal and lysine amino groups are initially completely dimethylated with formaldehyde.

  20. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    Energy Technology Data Exchange (ETDEWEB)

    Maga, Daniel

    2015-07-01

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  1. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    International Nuclear Information System (INIS)

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  2. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain as data and user views change. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.

  3. Electrocardiogram based methodology for computing of Coronary Sinus Pressure

    Directory of Open Access Journals (Sweden)

    Loay Alzubaidi

    2011-05-01

    Full Text Available In this paper, a method based on pattern recognition and ECG technology is introduced as a means of calculating the optimum occlusion and release points within pressure-controlled intermittent coronary sinus occlusion (PICSO) cycles. There are favorable results that show PICSO can substantially salvage ischemic myocardium during medical surgery. These results have been confirmed in studies on groups of animals. The new method is a continuation of previous work on two other techniques using estimation and derivative calculations.

  4. Electrocardiogram based methodology for computing of Coronary Sinus Pressure

    OpenAIRE

    Loay Alzubaidi; Ammar El Hassan; Jaafar Al Ghazo

    2011-01-01

    In this paper, a method based on pattern recognition and ECG technology is introduced as a means of calculating the optimum occlusion and release points within Pressure controlled intermittent coronary sinus occlusion (PICSO) cycles. There are favorable results that show PICSO can substantially salvage ischemic myocardium during medical surgery. These results are confirmed after studying groups of animals. The new method is a continuation of previous work on two other techniques using estimat...

  5. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings. PMID:27433158

  6. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G.; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E.; Wilkinson, Mark D.

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be “FAIR”—Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences—the Pathogen-Host Interaction Database (PHI-base)—to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings. PMID:27433158

  7. Publishing FAIR Data: an exemplar methodology utilizing PHI-base

    Directory of Open Access Journals (Sweden)

    Alejandro eRodríguez Iglesias

    2016-05-01

    Full Text Available Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species versus the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be FAIR - Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences - the Pathogen-Host Interaction Database (PHI-base) - to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  8. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  9. JOB SHOP METHODOLOGY BASED ON AN ANT COLONY

    Directory of Open Access Journals (Sweden)

    OMAR CASTRILLON

    2009-01-01

    Full Text Available The purpose of this study is to reduce the total process time (makespan) and to increase the machines' working time in a job shop environment, using a heuristic based on ant colony optimization. This work is developed in two phases: the first stage describes the identification and definition of heuristics for the sequential processes in the job shop. The second stage shows the effectiveness of the system in the traditional programming of production. A good solution, with 99% efficiency, is found using this technique.

  10. Synthesis of Schiff Bases via Environmentally Benign and Energy-Efficient Greener Methodologies

    OpenAIRE

    Arshi Naqvi; Mohd. Shahnawaaz; Arikatla V. Rao; Daya S. Seth; Sharma, Nawal K.

    2009-01-01

    Non-classical methods (water-based reaction, microwave and grindstone chemistry) were used for the preparation of Schiff bases from 3-chloro-4-fluoroaniline and several benzaldehydes. The key raw materials were allowed to react in water, under microwave irradiation, and by grindstone grinding. These methodologies constitute an energy-efficient and environmentally benign greener chemistry version of the classical condensation reactions for Schiff base formation.

  11. Application of risk-based methodologies to prioritize safety resources

    International Nuclear Information System (INIS)

    The Electric Power Research Institute (EPRI) started a program entitled risk-based prioritization in 1992. The purpose of this program is to provide generic technical support to the nuclear power industry relative to its recent initiatives in the area of operations and maintenance (O&M) cost control using state-of-the-art risk methods. The approach uses probabilistic risk assessment (PRA), or similar techniques, to allocate resources commensurate with the risk posed by nuclear plant operations. Specifically, those items or events that have high risk significance would receive the most attention, while those with little risk content would command fewer resources. As quantified in a companion paper, the potential O&M cost reduction inherent in this approach is very large. Furthermore, risk-based methods should also lead to safety improvements. This paper outlines the way that the EPRI technical work complements the technical, policy, and regulatory initiatives taken by others in the industry and provides an example of the approach as used to prioritize motor-operated valve (MOV) testing in response to US Nuclear Regulatory Commission (NRC) Generic Letter 89-10

  12. Evaluation methodology for query-based scene understanding systems

    Science.gov (United States)

    Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.

    2015-05-01

    In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.

  13. Radiative characteristics of the depleted uranium bomb and its protection

    International Nuclear Information System (INIS)

    Based on the development process of depleted uranium bombs described in the first part, the radiative characteristics and mechanisms of depleted uranium bombs are analyzed in detail. A deeper discussion of protection against depleted uranium bombs then follows

  14. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
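
    As a purely illustrative sketch of the Monte Carlo idea described above (and not the TORMIS code itself), the fragment below estimates an impact probability by repeated sampling; the wind-speed distribution, the toy transport model, and the target distance are hypothetical placeholders.

        import math
        import random

        def sample_wind_speed():
            # Hypothetical tornado wind-speed distribution (m/s); TORMIS draws on
            # documented tornado occurrence records instead.
            return random.weibullvariate(60.0, 2.0)

        def missile_hits_target(wind_speed, target_distance_m=150.0):
            # Toy transport model: missile range grows with the square of the wind
            # speed, with random scatter around the launch direction.
            flight_range = 0.02 * wind_speed ** 2
            bearing_error = random.gauss(0.0, math.radians(10.0))
            return flight_range * math.cos(bearing_error) >= target_distance_m

        def estimate_impact_probability(n_trials=100000):
            hits = sum(missile_hits_target(sample_wind_speed()) for _ in range(n_trials))
            return hits / n_trials

        print("Estimated impact probability:", estimate_impact_probability())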

  15. A copula-based downscaling methodology of RCM precipitation fields

    Science.gov (United States)

    Lorenz, Manuel

    2016-04-01

    Many hydrological studies require long-term precipitation time series at a fine spatial resolution. While regional climate models are nowadays capable of simulating reasonably high-resolution precipitation fields, the long computing time makes the generation of such long-term time series often infeasible for practical purposes. We introduce a comparatively fast stochastic approach to simulate precipitation fields which resemble the spatial dependencies and density distributions of the dynamic model. Nested RCM simulations at two different spatial resolutions serve as a training set to derive the statistics which will then be used in a random path simulation where fine-scale precipitation values are simulated from a multivariate Gaussian copula. The chosen RCM is the Weather Research and Forecasting Model (WRF). Simulated daily precipitation fields of the RCM are based on ERA-Interim reanalysis data from 1971 to 2000 and are available at a spatial resolution of 42 km (Europe) and 7 km (Germany). In order to evaluate the method, the stochastic algorithm is applied to the nested German domain and the resulting spatial dependencies and density distributions are compared to the original 30-year-long 7 km WRF simulations. Preliminary evaluations based on QQ-plots for one year indicate that the distributions of the downscaled values are very similar to the original values for most cells. In this presentation, a detailed overview of the stochastic downscaling algorithm and the evaluation of the long-term simulations are given. Additionally, an outlook for a 5 km and 1 km downscaling experiment for urban hydrology studies is presented.
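
    A minimal sketch of the core sampling step of such a copula approach, under the assumption that the correlation structure of the latent Gaussian field and the fine-scale marginal distribution (a gamma distribution is used here as a placeholder) have already been estimated from the nested training runs:

        import numpy as np
        from scipy import stats

        def sample_fine_precip(corr, gamma_shape, gamma_scale, n_days=30, seed=0):
            # Draw fine-scale precipitation fields from a Gaussian copula.
            rng = np.random.default_rng(seed)
            k = corr.shape[0]
            # 1) correlated standard-normal draws for k fine-scale cells
            z = rng.multivariate_normal(np.zeros(k), corr, size=n_days)
            # 2) map to uniform marginals with the standard normal CDF
            u = stats.norm.cdf(z)
            # 3) invert the assumed fine-scale marginal (empirical CDFs work too)
            return stats.gamma.ppf(u, a=gamma_shape, scale=gamma_scale)

        # toy usage: three neighbouring fine cells with moderate spatial dependence
        corr = np.array([[1.0, 0.7, 0.5],
                         [0.7, 1.0, 0.7],
                         [0.5, 0.7, 1.0]])
        fields = sample_fine_precip(corr, gamma_shape=0.8, gamma_scale=5.0)

    In the actual random path simulation each fine cell is drawn conditionally on previously simulated neighbours and on the coarse field, which the block above deliberately simplifies.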

  16. Conducting global team-based ethnography: Methodological challenges and practical methods

    OpenAIRE

    Jarzabkowski, P; Bednarek, R; Cabantous, L.

    2014-01-01

    Ethnography has often been seen as the province of the lone researcher; however, increasingly management scholars are examining global phenomena, necessitating a shift to global team-based ethnography. This shift presents some fundamental methodological challenges, as well as practical issues of method, that have not been examined in the literature on organizational research methods. That is the focus of this paper. We first outline the methodological implications of a shift from single resea...

  17. Estimating the revenues of a hydrogen-based high-capacity storage device: methodology and results

    OpenAIRE

    François-Lavet, Vincent; Fonteneau, Raphaël; Ernst, Damien

    2014-01-01

    This paper proposes a methodology to estimate the maximum revenue that can be generated by a company that operates a high-capacity storage device to buy or sell electricity on the day-ahead electricity market. The methodology exploits the Dynamic Programming (DP) principle and is specified for hydrogen-based storage devices that use electrolysis to produce hydrogen and fuel cells to generate electricity from hydrogen. Experimental results are generated using historical data of energy prices o...
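
    Although the abstract is truncated, the dynamic-programming idea it refers to can be illustrated with a small backward recursion over a discretised storage level and a known day-ahead price series; the conversion efficiencies, price figures and state grid below are made-up placeholders, not the paper's data.

        def max_revenue(prices, levels=11, step_mwh=1.0, eta_in=0.7, eta_out=0.5):
            # Backward dynamic programming for a simple storage arbitrage problem.
            # prices  : day-ahead prices (EUR/MWh), one entry per market period.
            # eta_in  : electricity -> hydrogen efficiency (electrolyser).
            # eta_out : hydrogen -> electricity efficiency (fuel cell).
            value = [0.0] * levels          # value-to-go for each storage state
            for price in reversed(prices):
                new_value = list(value)
                for s in range(levels):
                    best = value[s]                        # stay idle
                    if s + 1 < levels:                     # buy power, store hydrogen
                        best = max(best, value[s + 1] - price * step_mwh / eta_in)
                    if s > 0:                              # burn hydrogen, sell power
                        best = max(best, value[s - 1] + price * step_mwh * eta_out)
                    new_value[s] = best
                value = new_value
            return value[0]   # start empty; leftover hydrogen at the end has zero value

        print(max_revenue([30.0, 20.0, 80.0, 25.0, 90.0, 40.0]))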

  18. D2.1 - An EA Active, Problem Based Learning Methodology - EAtrain2

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne; Buus, Lillian;

    This deliverable reports on the work undertaken in work package 2 with the key objective to develop a learning methodology for web 2.0 mediated Enterprise Architecture (EA) learning building on a problem based learning (PBL) approach. The deliverable reports not only on the methodology but also on the activities leading to its development (literature review, workshops, etc.) and on further outcomes of this work relevant to the platform specification and pilot courses preparation.

  19. Petri net based engineering and software methodology for service-oriented industrial automation

    OpenAIRE

    Mendes, João M.; Restivo, Francisco; Leitão, Paulo; Colombo, Armando W.

    2010-01-01

    Collaborative industrial systems are becoming an emergent paradigm towards flexibility. One promising solution is service-oriented industrial automation systems, but integrated software methodologies and major frameworks for their engineering are still missing. This paper presents an overview of the current results on a unified and integrated methodology based on intrinsic and novel features of Petri nets. These nets are applied to the modeling, analysis, service management, em...

  20. Efficient Finite Element Methodology Based on Cartesian Grids: Application to Structural Shape Optimization

    OpenAIRE

    Nadal, E.; Ródenas, J. J.; Albelda, J.; Tur, M.; Tarancón, J. E.; Fuenmayor, F.J.

    2013-01-01

    This work presents an analysis methodology based on the use of the Finite Element Method (FEM) nowadays considered one of the main numerical tools for solving Boundary Value Problems (BVPs). The proposed methodology, so-called cg-FEM (Cartesian grid FEM), has been implemented for fast and accurate numerical analysis of 2D linear elasticity problems. The traditional FEM uses geometry-conforming meshes; however, in cg-FEM the analysis mesh is not conformal to the geometry. This allows for defin...

  1. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. ... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  2. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanopa, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. ... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  3. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    Science.gov (United States)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
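
    For reference, the life L_R at a given probability of survival R follows from the standard two-parameter Weibull relation (a textbook reminder, not a formula quoted from the record above):

        R(L) = \exp\!\left[-\left(\frac{L}{\eta}\right)^{\beta}\right]
        \qquad\Longrightarrow\qquad
        L_R = \eta\,\left[\ln\!\left(\frac{1}{R}\right)\right]^{1/\beta}

    where \eta is the characteristic life and \beta the Weibull slope; system lives such as the blade-row L0.1 quoted above are then obtained by treating the N blades as a series system, R_system = (R_blade)^N.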

  4. Measure of Landscape Heterogeneity by Agent-Based Methodology

    Science.gov (United States)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered as the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
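
    A rough illustration of the idea (randomly dispatched scouts accumulating a Monte Carlo estimate of landscape diversity) is sketched below; the toy land-cover grid, the window size and the use of a local Shannon index are assumptions made for the sketch, not the authors' exact 'greening' measure.

        import math
        import random

        def local_shannon(grid, r, c, radius=1):
            # Shannon diversity of land-cover classes in a window around (r, c).
            rows, cols = len(grid), len(grid[0])
            counts = {}
            for i in range(max(0, r - radius), min(rows, r + radius + 1)):
                for j in range(max(0, c - radius), min(cols, c + radius + 1)):
                    counts[grid[i][j]] = counts.get(grid[i][j], 0) + 1
            total = sum(counts.values())
            return -sum(n / total * math.log(n / total) for n in counts.values())

        def landscape_potential(grid, n_agents=1000, seed=1):
            # Monte Carlo average of the local diversity seen by random scouts.
            rng = random.Random(seed)
            rows, cols = len(grid), len(grid[0])
            samples = [local_shannon(grid, rng.randrange(rows), rng.randrange(cols))
                       for _ in range(n_agents)]
            return sum(samples) / len(samples)

        # toy land-cover map: 0 = cereal, 1 = pasture, 2 = woodland
        toy_map = [[random.choice([0, 0, 1, 2]) for _ in range(20)] for _ in range(20)]
        print(landscape_potential(toy_map))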

  5. Simulation-based reactor control design methodology for CANDU 9

    Energy Technology Data Exchange (ETDEWEB)

    Kattan, M.K.; MacBeth, M.J. [Atomic Energy of Canada Limited, Saskatoon, Saskatchewan (Canada); Chan, W.F.; Lam, K.Y. [Cassiopeia Technologies Inc., Toronto, Ontario (Canada)

    1996-07-01

    The next generation of CANDU nuclear power plant being designed by AECL is the 900 MWe CANDU 9 station. This design is based upon the Darlington CANDU nuclear power plant located in Ontario, which is among the world-leading nuclear power stations in capacity factor and has the lowest operation, maintenance and administration costs in North America. Canadian-designed CANDU pressurized heavy water nuclear reactors have traditionally been world leaders in electrical power generation capacity performance. This paper introduces the CANDU 9 design initiative to use plant simulation during the design stage of the plant distributed control system (DCS), plant display system (PDS) and the control centre panels. This paper also introduces some details of the CANDU 9 DCS reactor regulating system (RRS) control application, a typical DCS partition configuration, and the interfacing of some of the software design processes that are being followed from conceptual design to final integrated design validation. A description is given of the reactor model developed specifically for use in the simulator. The CANDU 9 reactor model is a synthesis of 14 micro point-kinetic reactor models to facilitate 14 liquid zone controllers for bulk power error control, as well as zone flux tilt control. (author)

  6. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  7. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

    Full Text Available The authors have solved topical scientific problems in the article: 1) the research base for the construction of theoretical and methodological foundations of sports training in taekwondo has been analysed; 2) the organizational and methodological requirements for taekwondo training sessions have been researched; 3) the necessity of interaction between the processes of natural development and adaptation to physical activity of young taekwondo sportsmen has been grounded; 4) the necessity of scientific evidence for building young fighters' training loads in microcycles, based on their individualization, has been proved.

  8. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    Directory of Open Access Journals (Sweden)

    Alexandre Tadeu Simon

    2015-01-01

    Full Text Available Despite the increasing interest in supply chain management (SCM by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert and Pagh’s original contribution and involves analysis of eleven referential axes established from key business processes, horizontal structures, and initiatives & practices. We analyze the applicability of the proposed model based on findings from interviews with experts - academics and practitioners - as well as from case studies of three focal firms and their supply chains. In general terms, the methodology can be considered a diagnostic instrument that allows companies to evaluate their maturity regarding SCM practices. From this diagnosis, firms can identify and implement activities to improve degree of adherence to the reference model and achieve SCM benefits. The methodology aims to contribute to SCM theory development. It is an initial, but structured, reference for translating a theoretical approach into practical aspects.

  9. Towards a self-adaptive service-oriented methodology based on extended SOMA

    Institute of Scientific and Technical Information of China (English)

    Alireza PARVIZI-MOSAED‡; Shahrouz MOAVEN; Jafar HABIBI; Ghazaleh BEIGI; Mahdieh NASER-SHARIAT

    2015-01-01

    We propose a self-adaptive process (SAP) that maintains the software architecture quality using the MAPE-K standard model. The proposed process can be plugged into various software development processes and service-oriented methodologies due to its explicitly defined inputs and outputs. To this aim, the proposed SAP is integrated with the service-oriented modeling and application (SOMA) methodology in a two-layered structure to create a novel methodology, named self-adaptive service-oriented architecture methodology (SASOAM), which provides a semi-automatic self-aware method by the composition of architectural tactics. Moreover, the maintenance activity of SOMA is improved using architectural and adaptive patterns, which results in controlling the software architecture quality. The improvement in the maintainability of SOMA is demonstrated by an analytic hierarchy process (AHP) based evaluation method. Furthermore, the proposed method is applied to a case study to represent the feasibility and practicality of SASOAM.

  10. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic-process knowledge to identify candidate solutions that satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights to the solution of IPDC problems in chemical engineering practice.

  11. Vision-based methodology for collaborative management of qualitative criteria in design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    2006-01-01

    A Vision-based methodology is proposed as part of the management of qualitative criteria for design in early phases of the product development process for team-based organisations. Focusing on abstract values and qualities for the product establishes a shared vision for the product amongst team m...

  12. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  13. CRDIAC: Coupled Reactor Depletion Instrument with Automated Control

    Energy Technology Data Exchange (ETDEWEB)

    Steven K. Logan

    2012-08-01

    When modeling the behavior of a nuclear reactor over time, it is important to understand how the isotopes in the reactor will change, or transmute, over that time. This is especially important in the reactor fuel itself. Many nuclear physics modeling codes model how particles interact in the system, but do not model this over time. Thus, another code is used in conjunction with the nuclear physics code to accomplish this. In our code, Monte Carlo N-Particle (MCNP) codes and the Multi Reactor Transmutation Analysis Utility (MRTAU) were chosen as the codes to use. In this way, MCNP would produce the reaction rates in the different isotopes present and MRTAU would use cross sections generated from these reaction rates to determine how the mass of each isotope is lost or gained. Between these two codes, the information must be altered and edited for use. For this, a Python 2.7 script was developed to aid the user in getting the information in the correct forms. This newly developed methodology was called the Coupled Reactor Depletion Instrument with Automated Controls (CRDIAC). As is the case in any newly developed methodology for modeling of physical phenomena, CRDIAC needed to be verified against similar methodology and validated against data taken from an experiment, in our case AFIP-3. AFIP-3 was a reduced enrichment plate type fuel tested in the ATR. We verified our methodology against the MCNP Coupled with ORIGEN2 (MCWO) method and validated our work against the Post Irradiation Examination (PIE) data. When compared to MCWO, the difference in concentration of U-235 throughout Cycle 144A was about 1%. When compared to the PIE data, the average bias for end of life U-235 concentration was about 2%. These results from CRDIAC therefore agree with the MCWO and PIE data, validating and verifying CRDIAC. CRDIAC provides an alternative to using ORIGEN-based methodology, which is useful because CRDIAC's depletion code, MRTAU, uses every available isotope in its
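
    The coupling workflow described above (a transport code produces reaction rates, a depletion code updates the isotopic number densities, and a script shuttles data between them) is typically implemented as a simple driver loop. The sketch below is a hedged illustration of such a loop, not the actual CRDIAC script; the toy one-group removal rate and the exponential burn-out stand in for the real MCNP and MRTAU steps.

        import math

        def run_transport(densities):
            # Placeholder for the transport step: CRDIAC would build an MCNP input
            # from the current densities and parse reaction rates from its output.
            # Here we simply return a made-up one-group removal rate per atom (1/s).
            return {nuc: 1.0e-9 for nuc in densities}

        def run_depletion(densities, rates, dt_days):
            # Placeholder for the depletion step (MRTAU-like): exponential burn-out
            # of each nuclide with its effective one-group removal rate.
            dt_s = dt_days * 86400.0
            return {nuc: n * math.exp(-rates.get(nuc, 0.0) * dt_s)
                    for nuc, n in densities.items()}

        def deplete_cycle(initial_densities, step_lengths_days):
            densities = dict(initial_densities)
            history = [densities]
            for dt in step_lengths_days:
                rates = run_transport(densities)                 # flux/reaction rates
                densities = run_depletion(densities, rates, dt)  # isotopic update
                history.append(densities)
            return history

        burnup = deplete_cycle({"U235": 1.0e21}, [10, 10, 10])   # atoms/cm3, days
        print(burnup[-1]["U235"])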

  14. Pragmatic principles--methodological pragmatism in the principle-based approach to bioethics.

    Science.gov (United States)

    Schmidt-Felzmann, Heike

    2003-01-01

    In this paper it will be argued that Beauchamp and Childress' principle-based approach to bioethics has strongly pragmatic features. Drawing on the writings of William James, I first develop an understanding of methodological pragmatism as a method of justification. On the basis of Beauchamp's and Childress' most recent proposals concerning moral justification in the fifth edition of their Principles of Biomedical Ethics (2001), I then discuss different aspects that the principle-based approach and methodological pragmatism have in common. PMID:14972762

  15. Methodology for Web Services Adoption Based on Technology Adoption Theory and Business Process Analyses

    Institute of Scientific and Technical Information of China (English)

    AN Liping; YAN Jianyuan; TONG Lingyun

    2008-01-01

    Web services use an emerging service-oriented architecture for distributed computing. Many organizations are either in the process of adopting web services technology or evaluating this option for incorporation into their enterprise information architectures. Implementation of this new technology requires careful assessment of the needs and capabilities of an organization to formulate adoption strategies. This paper presents a methodology for web services adoption based on technology adoption theory and business process analyses. The methodology suggests that strategies, business areas, and functions within an organization should be considered based on the existing organizational information technology status during the process of adopting web services to support the business needs and requirements.

  16. DEPLETED URANIUM TECHNICAL WORK

    Science.gov (United States)

    The Depleted Uranium Technical Work is designed to convey available information and knowledge about depleted uranium to EPA Remedial Project Managers, On-Scene Coordinators, contractors, and other Agency managers involved with the remediation of sites contaminated with this mater...

  17. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhen; LIU JingFang; ZENG DaXing

    2009-01-01

    It is well known that the traditional Grubler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grubler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.
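
    For readers unfamiliar with the criterion referred to above, the modified Grubler-Kutzbach formula used in screw-theory-based mobility analysis is commonly written as follows (the general literature form, not quoted from this particular record):

        M = d\,(n - g - 1) + \sum_{i=1}^{g} f_i + v - \zeta

    where d is the order of the mechanism (6 minus the number of common constraints), n the number of links, g the number of joints, f_i the degrees of freedom of joint i, v the number of redundant constraints, and \zeta the number of passive (local) degrees of freedom.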

  18. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.

  19. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  20. Future Directions in Adventure-based Therapy Research: Methodological Considerations and Design Suggestions.

    Science.gov (United States)

    Newes, Sandra L.

    2001-01-01

    More methodologically sound research in adventure therapy is needed if the field is to claim empirically-based efficacy as a treatment modality. Some considerations for conducting outcome studies in adventure therapy relate to standardization, multiple domain assessment, regression techniques, objective assessment of participant change, client and…

  1. Project-based learning in organizations : towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; van der Krogt, F.J.

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article emphasi

  2. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  3. An integrated science-based methodology to assess potential risks and implications of engineered nanomaterials

    Science.gov (United States)

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology ...

  4. The Change towards a Teaching Methodology Based on Competences: A Case Study in a Spanish University

    Science.gov (United States)

    Gonzalez, Jose Maria G.; Arquero Montaño, Jose Luis; Hassall, Trevor

    2014-01-01

    The European Higher Education Area (EHEA) has promoted the implementation of a teaching methodology based on competences. Drawing on New Institutional Sociology, the present work aims to identify and improve knowledge concerning the factors which are hindering that change in the Spanish university system. This is investigated using a case study…

  5. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  6. EA Training 2.0 Newsletter #3 - EA Active, Problem Based Learning Methodology

    DEFF Research Database (Denmark)

    Buus, Lillian; Ryberg, Thomas; Sroga, Magdalena

    2010-01-01

    The main products of the project are innovative, active problem-based learning methodology for EA education and training, EA courses for university students and private and public sector employees, and an Enterprise Architecture competence ontology including a complete specification of skills...

  7. REGULATORY FRAMEWORK AND EVALUATION OF HUMAN-MACHINE INTERFACES IMS NPP SAFETY CASE BASED METHODOLOGY

    OpenAIRE

    Харченко, В'ячеслав Сергійович; "Національний аерокосмічний університет ім.М.Є.Жуковського "ХАІ""; Орехова, Анастасія Олександрівна; "Національний аерокосмічний університет ім.М.Є.Жуковського "ХАІ""

    2012-01-01

    The problems associated with the safety of human-machine interfaces and information and control systems in NPPs are analyzed. An approach to assessing the safety of HMI in NPP I&C systems, based on the Safety Case methodology, is proposed. A profile of standards for HMI quality requirements is presented. An example of HMI quality assessment is described.

  8. Brain-based MRI lie detection experiment methodology (MRI脑测谎实验方法学)

    Institute of Scientific and Technical Information of China (English)

    李文石; 张好; 胡清泉; 苏香; 郭亮

    2006-01-01

    The brain-based MRI lie detection experiment methodology is reviewed for the first time, including the magnetic resonance imaging paradigm, the double-block design, the equidistant hit-ball and the test mechanics. This paper illustrates the research results of 3D MRI lie detection and the contrastive experiment of otopoint-mapping brain-signature lie detection, and reiterates the Lie-Truth Law (PT/PL ≤ 0.618) obtained from statistics of MRI reports worldwide. The conclusion points out the essence of this technology, its advantages and disadvantages, and the evolution of this methodology.

  9. A Novel HW/SW Based NoC Router Self-Testing Methodology

    OpenAIRE

    Nazari, Masoom; Lighvan, Mina Zolfy; Koozekonani, Ziaeddin Daie; Sadeghi, Ali

    2016-01-01

    Network-on-Chip (NoC) architecture has been proposed to solve the global communication problem of complex Systems-on-Chips (SoCs). However, NoC testing remains a major challenge. In this article, we propose a novel test architecture for NoC router testing. The proposed test architecture combines the advantages of Software-Based Self-Testing (SBST) and Built-In Self-Testing (BIST) methodologies. In this methodology, we propose custom test instructions with regard to the ISA of the NoC processor...

  10. A novel method to isolate protein N-terminal peptides from proteome samples using sulfydryl tagging and gold-nanoparticle-based depletion.

    Science.gov (United States)

    Li, Lanting; Wu, Runqing; Yan, Guoquan; Gao, Mingxia; Deng, Chunhui; Zhang, Xiangmin

    2016-01-01

    A novel method to isolate global N-termini using sulfydryl tagging and gold-nanoparticle-based depletion (the STagAu method) is presented. The N-terminal and lysine amino groups were first completely dimethylated at the protein level, after which the proteins were digested. The newly generated internal peptides were tagged with sulfydryl groups by Traut's reagent through the digested N-terminal amines in yields of 96%. The resulting sulfydryl peptides were depleted through binding onto nano gold composite materials. The Au-S bond is stable and widely used in materials science. Nano gold composite materials showed nearly complete depletion of sulfydryl peptides. A set of the acetylated and dimethylated N-terminal peptides were analyzed by liquid chromatography-tandem mass spectrometry. This method was demonstrated to be an efficient N-terminus enrichment method because of the use of an effective derivatization reaction, in combination with robust and relatively easy to implement Au-S coupling. We identified 632 N-terminal peptides from 386 proteins in a mouse liver sample. The STagAu approach presented is therefore a facile and efficient method for mass-spectrometry-based analysis of proteome N-termini or protease-generated cleavage products.

  11. Hybrid pn-junction solar cells based on layers of inorganic nanocrystals and organic semiconductors: optimization of layer thickness by considering the width of the depletion region.

    Science.gov (United States)

    Saha, Sudip K; Guchhait, Asim; Pal, Amlan J

    2014-03-01

    We report the formation and characterization of hybrid pn-junction solar cells based on a layer of copper-diffused silver indium disulfide (AgInS2@Cu) nanoparticles and another layer of copper phthalocyanine (CuPc) molecules. With copper diffusion in the nanocrystals, their optical absorption, and hence the activity of the hybrid pn-junction solar cells, was extended towards the near-IR region. To decrease the particle-to-particle separation for improved carrier transport through the inorganic layer, we replaced the long-chain ligands of the copper-diffused nanocrystals in each monolayer with short ones. Under illumination, the hybrid pn-junctions yielded a higher short-circuit current as compared to the combined contribution of the Schottky junctions based on the components. A wider depletion region at the interface between the two active layers in the pn-junction device, as compared to that of the Schottky junctions, has been considered to analyze the results. Capacitance-voltage characteristics under dark conditions supported this hypothesis. We also determined the width of the depletion region in the two layers separately so that a pn-junction could be formed with a tailored thickness of the two materials. Such a "fully depleted" device resulted in an improved photovoltaic performance, primarily due to a lessening of the internal resistance of the hybrid pn-junction solar cells.
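
    For orientation, the depletion width invoked above can be estimated from the textbook abrupt-junction expression (standard semiconductor theory for a single material, not a value or formula reported by the authors):

        W = \sqrt{\frac{2\,\varepsilon\,(V_{bi} - V)}{q}\left(\frac{1}{N_A} + \frac{1}{N_D}\right)},
        \qquad
        x_p = W\,\frac{N_D}{N_A + N_D},
        \quad
        x_n = W\,\frac{N_A}{N_A + N_D}

    where \varepsilon is the permittivity, V_{bi} the built-in potential, V the applied bias, q the elementary charge, and N_A, N_D the acceptor and donor concentrations; a "fully depleted" device is one whose p- and n-layer thicknesses do not exceed the corresponding partial widths x_p and x_n.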

  12. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo-random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system by recording two signals while the system operates in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
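
    The normalized sum of squared error quoted above is typically computed as the residual energy relative to the energy of the measured output (our notation, not necessarily the exact definition used by the authors):

        \mathrm{NSSE} = \frac{\sum_{k}\bigl(y_k - \hat{y}_k\bigr)^{2}}{\sum_{k} y_k^{2}} \times 100\%

    where y_k are the measured outputs and \hat{y}_k the outputs simulated by the identified excitation-system model.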

  13. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  14. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2009-12-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS), based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation for the ERPS and the control and cost reduction for the installation, in terms of time, manpower, technological equipment and other resources.

  15. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS), based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation for the ERPS and the control and cost reduction for the installation, in terms of time, manpower, technological equipment and other resources.

  16. Hierachy-based methodology for producing educational contents with maximal reutilization

    OpenAIRE

    Pedraza, Rafael; Valverde Albacete, Francisco; Cid Sueiro, Jesús; Molina Bulla, Harold; Navia Vázquez, Ángel

    2002-01-01

    Computer-based training or distance education are facing dramatic changes with the advent of standardization efforts, some of them concentrating on maximal reuse. This is of paramount importance for a sustainable (cost-affordable) production of educational materials. Reuse in itself should not be a goal, though, since many methodological aspects might be lost. In this paper we propose two content production approaches for the InterMediActor platform under a competence-based ...

  17. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    Science.gov (United States)

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  18. Depleted Uranium Management

    International Nuclear Information System (INIS)

    The paper considers the radiological and toxic impact of depleted uranium on human health. The radiological impact of depleted uranium is about 60% lower than that of natural uranium, owing to the removal of the short-lived isotopes uranium-234 and uranium-235 during enrichment. The formation of radioactive aerosols and their impact on humans are mentioned. The use of depleted uranium weapons also has a chemical effect upon intake, owing to a possible carcinogenic influence on the kidney. Uranium-236 in depleted uranium material is determined. Beta radiation arising from the decay of uranium-238 is considered; this effect is practically the same for depleted and natural uranium. The chemical toxicity of depleted uranium, as a heavy metal, makes a considerable contribution to its impact on population health. The paper analyzes the risks of using depleted uranium weapons. There is international opposition to the use of such weapons: a resolution on the effects of the use of armaments and ammunition containing depleted uranium has been supported five times by the United Nations (the USA, the United Kingdom, France and Israel did not support it), and a decision to ban depleted uranium weapons was supported by the European Parliament.

  19. A Transformation-oriented Methodology to Knowledge-based Conceptual Data Warehouse Design

    Directory of Open Access Journals (Sweden)

    Opim S. Sitompul

    2006-01-01

    Full Text Available Applications of artificial intelligence (AI) technology in the form of knowledge-based systems within the context of database design have been extensively researched, particularly to provide support within the conceptual design phase. However, a similar approach to the task of data warehouse design has yet to be seriously initiated. In this paper, we propose a design methodology for conceptual data warehouse design called the transformation-oriented methodology, which transforms an Entity-Relationship (ER) model into a multidimensional model based on a series of transformation and analysis rules. The transformation-oriented methodology translates the ER model into a specification language model and transforms it into an initial problem domain model. A set of synthesis and diagnosis rules then gradually transforms the problem domain model into the multidimensional model. A prototype KB tool called the DWDesigner has been developed to implement the aforementioned methodology. The multidimensional model produced by the DWDesigner as output is presented in a graphical form for better visualization. Testing has been conducted on a number of design problems, such as university, business and hospital domains, and consistent results have been achieved.

  20. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection and minimal technical skills, and the motivational threshold is so reduced that the field of potential adversaries can no longer be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within both the black-hat and white-hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  1. A fault diagnosis methodology for rolling element bearings based on advanced signal pretreatment and autoregressive modelling

    Science.gov (United States)

    Al-Bugharbee, Hussein; Trendafilova, Irina

    2016-05-01

    This study proposes a methodology for rolling element bearings fault diagnosis which gives a complete and highly accurate identification of the faults present. It has two main stages: signals pretreatment, which is based on several signal analysis procedures, and diagnosis, which uses a pattern-recognition process. The first stage is principally based on linear time invariant autoregressive modelling. One of the main contributions of this investigation is the development of a pretreatment signal analysis procedure which subjects the signal to noise cleaning by singular spectrum analysis and then stationarisation by differencing. So the signal is transformed to bring it close to a stationary one, rather than complicating the model to bring it closer to the signal. This type of pretreatment allows the use of a linear time invariant autoregressive model and improves its performance when the original signals are non-stationary. This contribution is at the heart of the proposed method, and the high accuracy of the diagnosis is a result of this procedure. The methodology emphasises the importance of preliminary noise cleaning and stationarisation. And it demonstrates that the information needed for fault identification is contained in the stationary part of the measured signal. The methodology is further validated using three different experimental setups, demonstrating very high accuracy for all of the applications. It is able to correctly classify nearly 100 percent of the faults with regard to their type and size. This high accuracy is the other important contribution of this methodology. Thus, this research suggests a highly accurate methodology for rolling element bearing fault diagnosis which is based on relatively simple procedures. This is also an advantage, as the simplicity of the individual processes ensures easy application and the possibility for automation of the entire process.
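
    The pretreatment-plus-modelling idea above can be illustrated compactly. The following sketch is not the authors' code: it omits the singular spectrum analysis step, uses a plain least-squares autoregressive fit after first differencing, and classifies with a simple nearest-centroid rule on synthetic stand-in signals; all parameter values are illustrative assumptions.

        import numpy as np

        def ar_features(signal, order=8):
            """Difference the signal (stationarisation), then fit AR(order) coefficients."""
            x = np.diff(np.asarray(signal, dtype=float))     # first difference
            x = (x - x.mean()) / x.std()                     # normalise amplitude
            rows = [x[t - order:t][::-1] for t in range(order, len(x))]
            coeffs, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
            return coeffs                                    # AR coefficients as a feature vector

        def nearest_centroid(train_feats, train_labels, test_feat):
            """Assign the label whose mean feature vector is closest to test_feat."""
            labels = sorted(set(train_labels))
            centroids = {c: np.mean([f for f, l in zip(train_feats, train_labels) if l == c], axis=0)
                         for c in labels}
            return min(labels, key=lambda c: np.linalg.norm(test_feat - centroids[c]))

        # Synthetic stand-ins for measured vibration signals.
        rng = np.random.default_rng(0)
        healthy = np.cumsum(rng.normal(size=2000))                # smooth random walk
        faulty = healthy + 0.5 * np.sin(np.arange(2000) * 0.9)    # added periodic component
        feats = [ar_features(healthy[:1000]), ar_features(faulty[:1000])]
        label = nearest_centroid(feats, ["healthy", "faulty"], ar_features(faulty[1000:]))
        print(label)    # expected to be "faulty" for this synthetic example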

  2. Environmental restoration risk-based prioritization work package planning and risk ranking methodology. Revision 2

    International Nuclear Information System (INIS)

    This document presents the risk-based prioritization methodology developed to evaluate and rank Environmental Restoration (ER) work packages at the five US Department of Energy, Oak Ridge Field Office (DOE-ORO) sites [i.e., Oak Ridge K-25 Site (K-25), Portsmouth Gaseous Diffusion Plant (PORTS), Paducah Gaseous Diffusion Plant (PGDP), Oak Ridge National Laboratory (ORNL), and the Oak Ridge Y-12 Plant (Y-12)], the ER Off-site Program, and Central ER. This prioritization methodology was developed to support the increased rigor and formality of work planning in the overall conduct of operations within the DOE-ORO ER Program. Prioritization is conducted as an integral component of the fiscal ER funding cycle to establish program budget priorities. The purpose of the ER risk-based prioritization methodology is to provide ER management with the tools and processes needed to evaluate, compare, prioritize, and justify fiscal budget decisions for a diverse set of remedial action, decontamination and decommissioning, and waste management activities. The methodology provides the ER Program with a framework for (1) organizing information about identified DOE-ORO environmental problems, (2) generating qualitative assessments of the long- and short-term risks posed by DOE-ORO environmental problems, and (3) evaluating the benefits associated with candidate work packages designed to reduce those risks. Prioritization is conducted to rank ER work packages on the basis of the overall value (e.g., risk reduction, stakeholder confidence) each package provides to the ER Program. Application of the methodology yields individual work package ''scores'' and rankings that are used to develop fiscal budget requests. This document presents the technical basis for the decision support tools and process

  3. Nonlinear lower hybrid wave depletion

    International Nuclear Information System (INIS)

    Two numerical ray tracing codes with focusing are used to compute lower hybrid daughter wave amplification by quasi-mode parametric decay. The first code, LHPUMP, provides a numerical pump model on a grid. This model is used by a second code, LHFQM, which computes daughter wave amplification inside the pump extent and follows the rays until their energy is absorbed by the plasma. An analytic model is then used to estimate pump depletion based on the numerical results. Results for PLT indicate strong pump depletion at the plasma edge at high density operation for the 800 MHz wave frequency, but weak depletion for the 2.45 GHz experiment. This is proposed to be the mechanism responsible for the high density limit for current drive as well as for the difficulty in heating ions.

  4. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    Science.gov (United States)

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416

  5. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays.

    Science.gov (United States)

    Naeni, Leila M; Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
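
    A minimal sketch of the general pipeline described in these two records (word-frequency profiles, Jensen-Shannon distances, a proximity graph, modularity-based communities), assuming toy count data; it substitutes the greedy modularity maximiser shipped with networkx for the authors' memetic iMA-Net algorithm.

        import numpy as np
        import networkx as nx
        from scipy.spatial.distance import jensenshannon
        from networkx.algorithms.community import greedy_modularity_communities

        def proximity_graph(word_counts, k=2):
            """Build a k-nearest-neighbour proximity graph from word-frequency profiles."""
            probs = [np.asarray(c, float) / np.sum(c) for c in word_counts]
            n = len(probs)
            dist = np.array([[jensenshannon(probs[i], probs[j]) for j in range(n)] for i in range(n)])
            g = nx.Graph()
            g.add_nodes_from(range(n))
            for i in range(n):
                neighbours = [j for j in np.argsort(dist[i]) if j != i][:k]
                for j in neighbours:
                    g.add_edge(i, int(j), weight=1.0 - dist[i, j])   # closer plays, heavier edges
            return g

        # Toy "plays": each row is a word-count vector over a shared vocabulary.
        counts = [[30, 5, 1, 0], [28, 6, 2, 0], [1, 2, 25, 30], [0, 3, 27, 28], [29, 4, 2, 1]]
        communities = greedy_modularity_communities(proximity_graph(counts), weight="weight")
        print([sorted(c) for c in communities])   # e.g. [[0, 1, 4], [2, 3]]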

  6. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    OpenAIRE

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur; Friis-Madsen, E.; Sørensen, H. C.; Kofoed, Jens Peter

    2011-01-01

    At the present pre-commercial phase of the wave energy sector, device developers are called to provide reliable estimates on power performance and production at possible deployment locations. The EU EquiMar project has proposed a novel approach, where the performance assessment is based mainly on experimental data deriving from sea trials rather than solely on numerical predictions. The study applies this methodology to evaluate the performance of Wave Dragon at two locations in the North Sea...

  7. Extension of direct displacement-based design methodology for bridges to account for higher mode effects

    OpenAIRE

    Kappos, A. J.; Gkatzogias, K.I.; Gidaris, I.G.

    2013-01-01

    An improvement is suggested to the direct displacement-based design (DDBD) procedure for bridges to account for higher mode effects, the key idea being not only the proper prediction of a target-displacement profile through the effective mode shape (EMS) method (wherein all significant modes are considered), but also the proper definition of the corresponding peak structural response. The proposed methodology is then applied to an actual concrete bridge wherein the different pier heights and ...

  8. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy ... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints ...

  9. Comments on "A model-based design methodology for the development of mechatronic systems"

    OpenAIRE

    Thramboulidis, Kleanthis

    2014-01-01

    In the paper by G. Barbieri et al. (Mechatronics (2014), http://dx.doi.org/10.1016/j.mechatronics.2013.12.004), a design methodology, based on the W life cycle process model, is presented and SysML is proposed as a tool to support the whole development process. In this letter, we discuss the presented approach, point out technical errors, and raise additional issues that might help in making the proposed approach applicable.

  10. Mergers in Greece: evaluation of the merger-related performance of Greek companies, accounting-based methodology

    OpenAIRE

    Ζούτσου, Ρούλα

    2001-01-01

    This paper evaluates the financial results of 23 Greek merger transactions that were completed between 1993 and 1998 using the accounting-based methodology. A set of 20 performance ratios is examined for a period of 5 years to get an indication of the mean weighted industry-adjusted performance difference between the pre- and post-merger periods. Additionally, a cross-sectional analysis is performed to conclude on whether special characteristics of the merger participants are associated with i...

  11. Eco-efficiency methodology of Hybrid Electric Vehicle based on multidisciplinary multi-objective optimization

    OpenAIRE

    Nzisabira, Jonathan; Louvigny, Yannick; Christiaens, Sébastien; Duysinx, Pierre

    2009-01-01

    The eco-efficiency concept of clean propulsion vehicles aims at simultaneously reducing the fuel consumption and environmental pollutant impact (Eco-score) without decreasing the vehicle performance and other user satisfaction criteria. Based on a simulation model in ADVISOR, one can evaluate the performance, the emissions and then the Eco-score and the User Satisfaction for different driving scenarios. To establish a rational methodology for conducting the eco-efficiency design of electr...

  12. How Is the Evaluation Process in a Course Following the PBL (Problem-Based Learning) Methodology?

    Directory of Open Access Journals (Sweden)

    Patricia Morales Bueno

    2013-09-01

    Full Text Available This article focuses on the different ways in which the concepts of teaching and learning are conceived. It also defines the conception on which the educational view of the PBL is based, and how it determines its learning goals. Likewise, it shows how evaluation strategies are linked to each of the stages in the PBL process, noting their features and their relation to the learning goals of the methodology.

  13. A design methodology for delta-sigma converters based on solid-state passive filters

    OpenAIRE

    Benabes, Philippe

    2013-01-01

    In the context of the ENIAC ARTEMOS project for the design of agile radio front ends, this paper shows a methodology for the design of agile bandpass continuous-time delta-sigma converters based on tunable acoustic resonators. These resonators use BST materials, which can be tuned by an external voltage, allowing the resonance frequency of the filters to be changed by a few percent. Using such filters, the oversampling ratio of delta s...

  14. WEBDATANET: a Network on Web-based Data Collection, Methodological Challenges, Solutions, and Implementation

    OpenAIRE

    Steinmetz, Stephanie; Lars, Kaczmirek; de Pedraza, Pablo; Reips, Ulf-Dietrich; Tijdens, Kea; Lozar Manfreda, Katja; Bernardo, Winer

    2012-01-01

    Do you collect data via the Internet in your research? If you do, this European network is important for you. The network collects and combines experiences and research on the methodology of online data collection. It provides access to expertise that may be important in your research.WEBDATANET is a unique multidisciplinary European network bringing together more than 76 leading web-based data collection experts, (web) survey methodologists, psychologists, sociologists, linguists, economists...

  15. Zombie Division: a methodological case study for the evaluation of game-based learning

    OpenAIRE

    Habgood, M. P. Jacob

    2015-01-01

    This paper discusses the methodological designs and technologies used to evaluate an educational videogame in order to support researchers in the design of their own evaluative research in the field of game-based learning. The Zombie Division videogame has been used to empirically evaluate the effectiveness of a more intrinsically integrated approach to creating educational games. It was specifically designed to deliver interventions as part of research studies examining differences in learni...

  16. Modelling and mapping critical loads and exceedances for the Georgia Basin, British Columbia, using a zero base-cation depletion criterion

    Directory of Open Access Journals (Sweden)

    Beverley A. RAYMOND†

    2010-08-01

    Full Text Available Critical load (CL) and exceedance maps of sulphur (S) and nitrogen (N) for upland soils were generated for the Georgia Basin, British Columbia, Canada, by synthesizing available data layers for atmospheric deposition, climate (precipitation, temperature), soil, site classification and elevation. Critical loads were determined using the steady-state mass-balance model and a criterion based on zero-tolerance for further base-cation depletion. The resulting CL values were generally lowest on ridge tops and increased towards valleys. Critical load exceedance ranged from 13% of the Georgia Basin under wet deposition to 32% under modelled total (wet and dry) deposition. Moreover, exceedance increased by an additional 10% when considering upland areas only for the Georgia Basin. Significant portions of the Georgia Basin are predicted to experience exceedance-enhanced base-cation depletion rates above 200 eq ha^-1 y^-1 and turn-over times to a final new base saturation state within 200 years under continued atmospheric S and N deposition.
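
    A minimal sketch of a steady-state mass-balance style critical-load and exceedance calculation under a zero base-cation-depletion criterion; the simplified terms and all numbers are illustrative assumptions, not the study's data or its full formulation.

        # Units are eq ha^-1 y^-1 throughout; the values below are invented for illustration.

        def critical_load(bc_deposition, bc_weathering, bc_uptake):
            """Acceptable acid load when no net base-cation depletion is tolerated."""
            return bc_deposition + bc_weathering - bc_uptake

        def exceedance(s_deposition, n_deposition, cl):
            """Amount by which total (S + N) acidifying deposition exceeds the critical load."""
            return max(0.0, (s_deposition + n_deposition) - cl)

        cl = critical_load(bc_deposition=150.0, bc_weathering=250.0, bc_uptake=120.0)
        ex = exceedance(s_deposition=220.0, n_deposition=180.0, cl=cl)
        print(f"CL = {cl:.0f} eq/ha/yr, exceedance = {ex:.0f} eq/ha/yr")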

  17. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

    Full Text Available We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions by following the carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real-time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students’ learning motivation has been significantly enhanced.

  18. 5.0. Depletion, activation, and spent fuel source terms

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation and relies on fundamental cross section and decay data in ENDF/B VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN—as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.
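
    As a toy illustration of the kind of problem such a depletion/decay solver addresses (and not ORIGEN's actual algorithm or data), the sketch below integrates dN/dt = A N for a three-nuclide chain with a matrix exponential; the chain and its half-lives are invented.

        import numpy as np
        from scipy.linalg import expm

        half_life = {"A": 5.0, "B": 2.0}                  # years; nuclide C is stable
        lam = {k: np.log(2.0) / t for k, t in half_life.items()}

        # Transition matrix for the chain A -> B -> C (rows: products, columns: sources).
        A = np.array([[-lam["A"],       0.0, 0.0],
                      [ lam["A"], -lam["B"], 0.0],
                      [      0.0,  lam["B"], 0.0]])

        n0 = np.array([1.0e24, 0.0, 0.0])                 # initial atom inventory
        for years in (1.0, 5.0, 20.0):
            n = expm(A * years) @ n0
            print(f"t = {years:5.1f} y  A = {n[0]:.3e}  B = {n[1]:.3e}  C = {n[2]:.3e}")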

  19. Depleted uranium in Japan

    International Nuclear Information System (INIS)

    In Japan, depleted uranium ammunition is regarded as a nuclear weapon and meets with fierce opposition. The fact that US Marines mistakenly fired bullets containing depleted uranium on an island off Okinawa during training exercises in December 1995 and January 1996 also contributes to this opposition. The overall situation in this area in Japan is outlined. (P.A.)

  20. Visualization of stratospheric ozone depletion and the polar vortex

    Science.gov (United States)

    Treinish, Lloyd A.

    1995-01-01

    Direct analysis of spacecraft observations of stratospheric ozone yields information about the morphology of annual austral depletion. Visual correlation of ozone with other atmospheric data illustrates the diurnal dynamics of the polar vortex and contributions from the upper troposphere, including the formation and breakup of the depletion region each spring. These data require care in their presentation to minimize the introduction of visualization artifacts that are erroneously interpreted as data features. Non geographically registered data of differing mesh structures can be visually correlated via cartographic warping of base geometries without interpolation. Because this approach is independent of the realization technique, it provides a framework for experimenting with many visualization strategies. This methodology preserves the fidelity of the original data sets in a coordinate system suitable for three-dimensional, dynamic examination of atmospheric phenomena.

  1. Lithium-ion batteries: Evaluation study of different charging methodologies based on aging process

    International Nuclear Information System (INIS)

    Highlights: • Different charging methodologies have been tested and analyzed. • Battery impedance representation using the Randles equivalent circuit. • Investigation of the impact of the charging methodology on the battery’s lifetime. • An extended analysis to select the proper charging method that can be used to design an enhanced charging system. - Abstract: In this paper, high-power 7 Ah LiFePO4-based cells (LFP) have been used to investigate the impact of the charging methodology on the battery’s lifetime. Three charging techniques have been used: Constant Current (CC), Constant Current–Constant Voltage (CC–CV) and Constant Current–Constant Voltage with Negative Pulse (CC–CVNP). A comparative study between these techniques is presented in this research. For this purpose, a characterization of the batteries has been performed using capacity tests and electrochemical impedance spectroscopy (EIS). As expected, the obtained results showed that the battery’s aging rate depends on the charging methodology. Indeed, it has been shown that a combination of low amplitude and a small number of negative pulses has a positive effect on the battery’s capacity fading. From the impedance measurements, the results have demonstrated that the CC–CVNP technique with low amplitude and a small number of negative pulses is more effective than the other techniques in reducing the concentration polarization resistance and the diffusion time constant. This research has provided an extended analysis to select the proper charging methodology that can be used to design an enhanced charging system.
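
    A minimal sketch contrasting the three charging protocols named above on a deliberately crude cell model (a linear open-circuit voltage plus one series resistance); the capacity, voltage limit, resistance and pulse schedule are illustrative assumptions, not the parameters of the cells tested in the study.

        def charge_profile(protocol, capacity_ah=7.0, i_cc=7.0, v_max=3.65, r=0.010, dt=1.0):
            """Simulate one charge with a crude cell model; return elapsed time (s) and final SoC."""
            ocv = lambda soc: 3.20 + 0.45 * min(soc, 1.0)      # crude linear open-circuit voltage
            soc, t = 0.0, 0.0
            while soc < 0.99 and t < 4 * 3600:
                i = i_cc                                        # constant-current phase
                if ocv(soc) + i * r >= v_max:                   # terminal voltage hits the limit
                    if protocol == "CC":
                        break                                   # plain CC simply stops here
                    i = (v_max - ocv(soc)) / r                  # CV phase: taper the current
                    if protocol == "CC-CVNP" and int(t) % 60 < 2:
                        i = -0.5 * i_cc                         # brief negative (discharge) pulse
                soc += i * dt / (capacity_ah * 3600.0)          # coulomb counting
                t += dt
            return t, soc

        for p in ("CC", "CC-CV", "CC-CVNP"):
            t, soc = charge_profile(p)
            print(f"{p:8s} finished at {t / 60:6.1f} min with SoC = {soc:.2f}")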

  2. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has proved to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  3. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has proved to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  4. Management of depleted uranium

    International Nuclear Information System (INIS)

    Large stocks of depleted uranium have arisen as a result of enrichment operations, especially in the United States and the Russian Federation. Countries with depleted uranium stocks are interested in assessing strategies for the use and management of depleted uranium. The choice of strategy depends on several factors, including government and business policy, alternative uses available, the economic value of the material, regulatory aspects and disposal options, and international market developments in the nuclear fuel cycle. This report presents the results of a depleted uranium study conducted by an expert group organised jointly by the OECD Nuclear Energy Agency and the International Atomic Energy Agency. It contains information on current inventories of depleted uranium, potential future arisings, long term management alternatives, peaceful use options and country programmes. In addition, it explores ideas for international collaboration and identifies key issues for governments and policy makers to consider. (authors)

  5. Water Depletion Threatens Agriculture

    Science.gov (United States)

    Brauman, K. A.; Richter, B. D.; Postel, S.; Floerke, M.; Malsy, M.

    2014-12-01

    Irrigated agriculture is the human activity that has by far the largest impact on water, constituting 85% of global water consumption and 67% of global water withdrawals. Much of this water use occurs in places where water depletion, the ratio of water consumption to water availability, exceeds 75% for at least one month of the year. Although only 17% of global watershed area experiences depletion at this level or more, nearly 30% of total cropland and 60% of irrigated cropland are found in these depleted watersheds. Staple crops are particularly at risk, with 75% of global irrigated wheat production and 65% of irrigated maize production found in watersheds that are at least seasonally depleted. Of importance to textile production, 75% of cotton production occurs in the same watersheds. For crop production in depleted watersheds, we find that one half to two-thirds of production occurs in watersheds that have not just seasonal but annual water shortages, suggesting that re-distributing water supply over the course of the year cannot be an effective solution to shortage. We explore the degree to which irrigated production in depleted watersheds reflects limitations in supply, a byproduct of the need for irrigation in perennially or seasonally dry landscapes, and identify heavy irrigation consumption that leads to watershed depletion in more humid climates. For watersheds that are not depleted, we evaluate the potential impact of an increase in irrigated production. Finally, we evaluate the benefits of irrigated agriculture in depleted and non-depleted watersheds, quantifying the fraction of irrigated production going to food production, animal feed, and biofuels.
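
    A minimal sketch of the depletion metric used above (monthly consumption divided by availability, flagged against the 75% threshold); the two watersheds and their monthly values are invented for illustration.

        # Monthly consumption and availability in arbitrary volume units.
        consumption = {"basin_A": [40, 42, 60, 80, 95, 120, 130, 125, 90, 60, 45, 40],
                       "basin_B": [10, 10, 12, 14, 15, 16, 15, 14, 12, 11, 10, 10]}
        availability = {"basin_A": [100, 100, 110, 120, 130, 140, 150, 150, 130, 120, 110, 100],
                        "basin_B": [90, 85, 80, 75, 70, 65, 70, 75, 80, 85, 90, 95]}

        for basin in consumption:
            ratios = [c / a for c, a in zip(consumption[basin], availability[basin])]
            months_depleted = sum(r > 0.75 for r in ratios)
            status = "seasonally depleted" if months_depleted >= 1 else "not depleted"
            print(f"{basin}: max depletion {max(ratios):.0%}, "
                  f"{months_depleted} month(s) above 75% -> {status}")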

  6. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    International Nuclear Information System (INIS)

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In the face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcasting' (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties; the FSPN formalism provides important modeling capabilities, and discrete-event simulation allows the estimation of the desired safety metrics.

  7. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  8. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics, which extract generic characteristics of error-prone behaviors and error-prone situations, are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures with associated training to minimize or preclude certain human errors. It also helps in the design of control rooms, and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  9. The IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors

    International Nuclear Information System (INIS)

    The Reactor Institute Delft was inaugurated in May 2009 as a new IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors. The collaboration involves education, research and development in (i) Production of reactor-produced, no-carrier added radioisotopes of high specific activity via neutron activation; (ii) Neutron activation analysis with emphasis on automation as well as analysis of large samples, and radiotracer techniques; and, as a cross-cutting activity, (iii) Quality assurance and management in research and application of research reactor based techniques and in research reactor operations. (author)

  10. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur;

    2011-01-01

    At the present pre-commercial phase of the wave energy sector, device developers are called to provide reliable estimates on power performance and production at possible deployment locations. The EU EquiMar project has proposed a novel approach, where the performance assessment is based mainly on experimental data deriving from sea trials rather than solely on numerical predictions. The study applies this methodology to evaluate the performance of Wave Dragon at two locations in the North Sea, based on the data acquired during the sea trials of a 1:4.5 scale prototype. Indications about power...

  11. Forest soil nutrient status after 10 years of experimental acidification and base cation depletion: results from 2 long-term soil productivity sites in the central Appalachians

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.B. [United States Dept. of Agriculture Forest Service, Parsons, WV (United States); Burger, J.A. [Virginia Tech University, Blacks Burg, VA (United States)

    2010-07-01

    This study assessed the hypothesis that soil base-cation depletion is an effect of acidic deposition in forests located in the central Appalachians. The effects of experimentally induced base cation depletion were evaluated in relation to long-term soil productivity and the sustainability of forest stands. Whole-tree harvesting was conducted along with the removal of dead wood litter in order to remove all aboveground nutrients. Ammonium sulfate fertilizer was added at annual rates of 40.6 kg S/ha and 35.4 kg N/ha in order to increase the leaching of calcium (Ca) and magnesium (Mg) from the soil. A randomized complete block design was used in 4 or 5 treatment applications in a mixed hardwood experimental forest located in West Virginia and in a cherry-maple forest located in a national forest in West Virginia. Soils were sampled over a 10-year period. The study showed that significant changes in soil Mg, N and some other nutrients occurred over time. However, biomass did not differ significantly among the different treatment options used.

  12. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    Science.gov (United States)

    Fensin, Michael Lorne

    and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.

  13. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
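
    A minimal sketch of the Monte Carlo portion of such an economic appraisal, assuming the annual benefits and the annualised equipment cost can each be sampled from simple distributions; the figures and distribution choices are illustrative, not those of the Managua case study.

        import numpy as np

        rng = np.random.default_rng(1)
        samples = 10_000
        leak_saving = rng.normal(18_000, 4_000, samples)     # $/yr from pressure-driven leakage reduction
        burst_saving = rng.normal(6_000, 2_500, samples)     # $/yr from lower burst frequency
        equipment = rng.normal(9_000, 1_000, samples)        # $/yr annualised valves and flowmeters
        net = leak_saving + burst_saving - equipment

        print(f"mean net benefit = {net.mean():,.0f} $/yr, "
              f"probability of a net loss = {(net < 0).mean():.1%}")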

  14. A hybrid system identification methodology for wireless structural health monitoring systems based on dynamic substructuring

    Science.gov (United States)

    Dragos, Kosmas; Smarsly, Kay

    2016-04-01

    System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method on dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.
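
    A minimal sketch of a Craig-Bampton (fixed-interface) reduction of the kind that underlies the embedded substructure models, applied to a small spring-mass chain; the matrices, boundary set and number of retained modes are illustrative, not those of the laboratory frame.

        import numpy as np
        from scipy.linalg import eigh

        def craig_bampton(M, K, boundary, n_modes):
            """Reduce (M, K) to boundary DOFs plus n_modes fixed-interface modal DOFs."""
            n = M.shape[0]
            interior = [d for d in range(n) if d not in boundary]
            ii, ib = np.ix_(interior, interior), np.ix_(interior, boundary)
            # Constraint modes: static interior response to unit motion of each boundary DOF.
            psi = -np.linalg.solve(K[ii], K[ib])
            # Fixed-interface normal modes of the interior partition (boundary clamped).
            _, phi = eigh(K[ii], M[ii])
            phi = phi[:, :n_modes]
            # Assemble the transformation from full DOFs to (boundary, modal) coordinates.
            T = np.zeros((n, len(boundary) + n_modes))
            T[boundary, :len(boundary)] = np.eye(len(boundary))
            T[interior, :len(boundary)] = psi
            T[interior, len(boundary):] = phi
            return T.T @ M @ T, T.T @ K @ T

        # 4-DOF spring-mass chain with unit masses (illustrative substructure).
        K = np.array([[ 2.0, -1.0,  0.0,  0.0],
                      [-1.0,  2.0, -1.0,  0.0],
                      [ 0.0, -1.0,  2.0, -1.0],
                      [ 0.0,  0.0, -1.0,  1.0]])
        M = np.eye(4)
        Mr, Kr = craig_bampton(M, K, boundary=[0], n_modes=2)
        print("reduced stiffness matrix:\n", np.round(Kr, 3))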

  15. Computerized methodology for micro-CT and histological data inflation using an IVUS based translation map.

    Science.gov (United States)

    Athanasiou, Lambros S; Rigas, George A; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Naka, Katerina K; Panetta, Daniele; Pelosi, Gualtiero; Vozzi, Federico; Michalis, Lampros K; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-10-01

    A framework for the inflation of micro-CT and histology data using intravascular ultrasound (IVUS) images is presented. The proposed methodology consists of three steps. In the first step, the micro-CT/histological images are manually co-registered with IVUS by experts using fiducial points as landmarks. In the second step, the lumen in both the micro-CT/histological images and the IVUS images is automatically segmented. Finally, in the third step, the micro-CT/histological images are inflated by applying a transformation method to each image. The transformation method is based on the IVUS and micro-CT/histological contour difference. In order to validate the proposed image inflation methodology, plaque areas in the inflated micro-CT and histological images are compared with the ones in the IVUS images. The proposed methodology for inflating micro-CT/histological images increases the sensitivity of plaque area matching between the inflated and the IVUS images (7% and 22% in histological and micro-CT images, respectively). PMID:25771781

  16. Efficient Finite Element Methodology Based on Cartesian Grids: Application to Structural Shape Optimization

    Directory of Open Access Journals (Sweden)

    E. Nadal

    2013-01-01

    Full Text Available This work presents an analysis methodology based on the use of the Finite Element Method (FEM), nowadays considered one of the main numerical tools for solving Boundary Value Problems (BVPs). The proposed methodology, so-called cg-FEM (Cartesian grid FEM), has been implemented for fast and accurate numerical analysis of 2D linear elasticity problems. The traditional FEM uses geometry-conforming meshes; however, in cg-FEM the analysis mesh is not conformal to the geometry. This allows for defining very efficient mesh generation techniques and using a robust integration procedure, to accurately integrate the domain’s geometry. The hierarchical data structure used in cg-FEM together with the Cartesian meshes allows for trivial data sharing between similar entities. The cg-FEM methodology uses advanced recovery techniques to obtain an improved solution of the displacement and stress fields (for which a discretization error estimator in energy norm is available) that will be the output of the analysis. All this results in a substantial increase in accuracy and computational efficiency with respect to the standard FEM. cg-FEM has been applied in structural shape optimization showing robustness and computational efficiency in comparison with FEM solutions obtained with a commercial code, despite the fact that cg-FEM has been fully implemented in MATLAB.

  17. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Full Text Available Digital signal processing algorithms are of great importance in many embedded systems. Due to their complexity and the restrictions imposed on the implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We will use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform will demonstrate the effectiveness of our approach.

  18. Modal macro-strain vector based damage detection methodology with long-gauge FBG sensors

    Science.gov (United States)

    Xu, Bin; Liu, Chongwu W.; Masri, Sami F.

    2009-07-01

    Advances in optic fiber sensing technology provide an easy and reliable way for the vibration-based strain measurement of engineering structures. As a typical optic fiber sensing technique with high accuracy and resolution, long-gauge Fiber Bragg Grating (FBG) sensors have been widely employed in health monitoring of civil engineering structures. Therefore, the development of macro-strain-based identification methods is crucial for damage detection and structural condition evaluation. In a previous study by the authors, a damage detection algorithm for a beam structure using vibration-based macro-strain measurement time histories directly with neural networks had been proposed and validated with experimental measurements. In this paper, a damage locating and quantifying method is proposed using modal macro-strain vectors (MMSVs), which can be extracted from vibration-induced macro-strain response measurement time series from long-gauge FBG sensors. The performance of the proposed methodology for damage detection of a beam with different damage scenarios was first studied with numerical simulation. Then, dynamic tests on a simply supported steel beam with different damage scenarios were carried out and macro-strain measurements were employed to detect the damage severity. Results show that the proposed MMSV-based structural identification and damage detection methodology can locate and identify the structural damage severity with acceptable accuracy.

  19. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings.

  20. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi- modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
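
    A minimal sketch of the cost-per-exam structure described above (fixed annual costs spread over exam volume plus per-exam materials); every number below is a placeholder, not data from the study.

        def cost_per_exam(annual_labor, annual_equipment, annual_storage,
                          per_exam_materials, exams_per_year):
            """Direct cost per exam: fixed annual costs divided by volume, plus materials."""
            fixed = annual_labor + annual_equipment + annual_storage
            return fixed / exams_per_year + per_exam_materials

        film = cost_per_exam(annual_labor=120_000, annual_equipment=30_000,
                             annual_storage=15_000, per_exam_materials=4.50,
                             exams_per_year=20_000)
        filmless = cost_per_exam(annual_labor=90_000, annual_equipment=70_000,
                                 annual_storage=10_000, per_exam_materials=0.50,
                                 exams_per_year=20_000)
        print(f"film-based: ${film:.2f} per exam, filmless: ${filmless:.2f} per exam")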

  1. Optical fibre-based methodology for screening the effect of probiotic bacteria on conjugated linoleic acid (CLA) in curdled milk

    OpenAIRE

    Silva, Lurdes I. B.; Rodrigues, Dina M.; Freitas, Ana C.; Gomes, Ana M; Teresa A P Rocha-Santos; Pereira, M. E.; Duarte, A.C.

    2011-01-01

    A methodology based on optical fibre (OF) detection was developed for screening the potential of CLA production by Lactobacillus casei-01, Lactobacillus acidophilus La-5 and Bifidobacterium lactis B94 in probiotic curdled milk. The OF based methodology was validated by comparison with an analytical method based on gas chromatography–mass spectrometry (GC–MS) and it showed comparable linearity (between 5 and 130 µg), accuracy and detection limits, which ranged from 1.92 to 2.56 µg ...

  2. Uses of depleted uranium

    International Nuclear Information System (INIS)

    Depleted uranium is uranium in which the percentage of fissile uranium-235 is less than 0.2% or 0.3%. It usually results from the reprocessing of burned nuclear fuel and is also mixed with some other radioactive elements such as uranium-236, uranium-238 and plutonium-239. The useful features of depleted uranium are its high density, low price and easy availability. These specifications make depleted uranium one of the best materials when objects that are small in size but quite heavy for their size are needed. Uses of depleted uranium have increased in recent years both in domestic industry and in the nuclear industry. It is thus used in many military and peaceful applications, such as balancing giant aircraft, ships and missiles, and in the manufacture of some types of extremely hard concrete. (author)

  3. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing the risks associated with operating pressure equipment. This paper shows how API RBI assesses the potential consequences of a failure of the pressure boundary as well as the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner so that the owner-user can plan an inspection program that focuses more resources on the higher-risk equipment, while possibly saving inspection resources on equipment where inspection is not effectively reducing risk. At the same time, if consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence-mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API Risk-Based Inspection is, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety. (Author)
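
    A minimal sketch of the risk-prioritization idea underlying RBI (risk as the product of likelihood and consequence of failure); the equipment names and numbers are invented for illustration and are not API RBI values:

        # Hypothetical equipment items with likelihood of failure (PoF, events/yr)
        # and consequence of failure (CoF, cost units); risk = PoF * CoF drives the
        # inspection ranking.
        equipment = {
            "drum D-101":      {"pof": 1e-3, "cof": 5.0e6},
            "column C-201":    {"pof": 5e-4, "cof": 2.0e7},
            "exchanger E-301": {"pof": 2e-3, "cof": 1.0e5},
        }

        ranked = sorted(equipment.items(),
                        key=lambda kv: kv[1]["pof"] * kv[1]["cof"],
                        reverse=True)

        for name, d in ranked:
            print(f"{name}: risk = {d['pof'] * d['cof']:.3g} per year")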

  4. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    Science.gov (United States)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely the ENERGY STAR Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology proposed is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI
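
    A minimal sketch of the variable-ranking step using a random forest, with synthetic stand-ins for CBECS-like predictors; the variable names and data are illustrative only, and scikit-learn is assumed to be available:

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        n = 500
        # Synthetic stand-ins for CBECS-like predictors (illustrative only)
        X = pd.DataFrame({
            "floor_area":   rng.uniform(1e3, 5e4, n),
            "num_workers":  rng.integers(5, 500, n),
            "num_pcs":      rng.integers(5, 600, n),
            "cooling_type": rng.integers(0, 4, n),    # encoded categorical
        })
        eui = 30 + 0.04 * X["num_workers"] + 0.03 * X["num_pcs"] + rng.normal(0, 5, n)

        model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, eui)
        # Rank predictors by impurity-based feature importance
        for name, imp in sorted(zip(X.columns, model.feature_importances_),
                                key=lambda t: t[1], reverse=True):
            print(f"{name}: {imp:.3f}")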

  5. A segmental hidden semi-Markov model (HSMM)-based diagnostics and prognostics framework and methodology

    Science.gov (United States)

    Dong, Ming; He, David

    2007-07-01

    Diagnostics and prognostics are two important aspects in a condition-based maintenance (CBM) program. However, these two tasks are often separately performed. For example, data might be collected and analysed separately for diagnosis and prognosis. This practice increases the cost and reduces the efficiency of CBM and may affect the accuracy of the diagnostic and prognostic results. In this paper, a statistical modelling methodology for performing both diagnosis and prognosis in a unified framework is presented. The methodology is developed based on segmental hidden semi-Markov models (HSMMs). An HSMM is a hidden Markov model (HMM) with temporal structures. Unlike HMM, an HSMM does not follow the unrealistic Markov chain assumption and therefore provides more powerful modelling and analysis capability for real problems. In addition, an HSMM allows modelling the time duration of the hidden states and therefore is capable of prognosis. To facilitate the computation in the proposed HSMM-based diagnostics and prognostics, new forward-backward variables are defined and a modified forward-backward algorithm is developed. The existing state duration estimation methods are inefficient because they require a huge storage and computational load. Therefore, a new approach is proposed for training HSMMs in which state duration probabilities are estimated on the lattice (or trellis) of observations and states. The model parameters are estimated through the modified forward-backward training algorithm. The estimated state duration probability distributions combined with state-changing point detection can be used to predict the useful remaining life of a system. The evaluation of the proposed methodology was carried out through a real world application: health monitoring of hydraulic pumps. In the tests, the recognition rates for all states are greater than 96%. For each individual pump, the recognition rate is increased by 29.3% in comparison with HMMs. Because of the temporal

  6. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Directory of Open Access Journals (Sweden)

    Zengkai Liu

    Full Text Available This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
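
    A minimal sketch of the fault-feature step, assuming the vibration signal has already been decomposed into IMFs by an EEMD implementation (for example, the PyEMD package); only the normalised IMF energies used as features for the Bayesian network are computed here:

        import numpy as np

        def imf_energy_features(imfs):
            """Normalised energy of each intrinsic mode function.

            imfs : (n_imfs, n_samples) array produced by an EEMD decomposition
                   (the decomposition itself is assumed to come from a library).
            """
            energies = np.sum(imfs ** 2, axis=1)
            return energies / energies.sum()

        # Example with a synthetic two-component signal standing in for real IMFs
        t = np.linspace(0.0, 1.0, 2048)
        imfs = np.vstack([np.sin(2 * np.pi * 50 * t), 0.3 * np.sin(2 * np.pi * 5 * t)])
        print(imf_energy_features(imfs))   # feature vector fed to the fault feature layer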

  7. Recursion-based depletion of human immunodeficiency virus-specific naive CD4(+) T cells may facilitate persistent viral replication and chronic viraemia leading to acquired immunodeficiency syndrome.

    Science.gov (United States)

    Tsukamoto, Tetsuo; Yamamoto, Hiroyuki; Okada, Seiji; Matano, Tetsuro

    2016-09-01

    Although antiretroviral therapy has made human immunodeficiency virus (HIV) infection a controllable disease, it is still unclear how viral replication persists in untreated patients and causes CD4(+) T-cell depletion leading to acquired immunodeficiency syndrome (AIDS) over a period of several years. Theorists have tried to explain it with the diversity threshold theory, in which accumulated mutations in the HIV genome make the virus so diverse that the immune system is no longer able to recognize all the variants and fails to control the viraemia. Although the theory could apply to a number of cases, macaque AIDS models using simian immunodeficiency virus (SIV) have shown that failed viral control at the set point is not always associated with T-cell escape mutations. Moreover, even monkeys without a protective major histocompatibility complex (MHC) allele can contain replication of a superinfecting SIV following immunization with a live-attenuated SIV vaccine, while those animals are not capable of fighting primary SIV infection. Here we propose a recursion-based virus-specific naive CD4(+) T-cell depletion hypothesis, derived from considering what may happen in individuals experiencing primary immunodeficiency virus infection. This could explain the mechanism for impairment of the virus-specific immune response in the course of HIV infection. PMID:27515208

  8. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is well known that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.

  9. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on meta-programming-based generative learning objects (those with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLOs and a smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist a

  10. Towards A Model-Based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Science.gov (United States)

    Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their criticality in electronics subsystems, they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
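
    A minimal sketch, not the authors' implementation, of a scalar Kalman filter tracking a degradation indicator under a linear empirical drift model and reading off remaining useful life when the projected state reaches a failure threshold; all parameter values are illustrative:

        import numpy as np

        dt, drift, q, r = 1.0, 0.15, 1e-3, 0.25   # step (h), loss rate %/h, noise variances
        x, p = 0.0, 1.0                           # state (degradation, % loss) and covariance
        threshold = 20.0                          # assumed end-of-life criterion (% loss)

        rng = np.random.default_rng(1)
        measurements = [drift * k * dt + rng.normal(0, np.sqrt(r)) for k in range(60)]

        for z in measurements:
            # predict with the empirical (linear drift) degradation model
            x, p = x + drift * dt, p + q
            # update with the measured degradation indicator
            k_gain = p / (p + r)
            x, p = x + k_gain * (z - x), (1 - k_gain) * p

        rul = max(0.0, (threshold - x) / drift)   # hours to threshold at the modelled rate
        print(f"estimated loss {x:.2f}%, remaining useful life ~ {rul:.1f} h")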

  11. Optimization-based methodology for the development of wastewater facilities for energy and nutrient recovery.

    Science.gov (United States)

    Puchongkawarin, C; Gomez-Mont, C; Stuckey, D C; Chachuat, B

    2015-12-01

    A paradigm shift is currently underway from an attitude that considers wastewater streams as a waste to be treated, to a proactive interest in recovering materials and energy from these streams. This paper is concerned with the development and application of a systematic, model-based methodology for the development of wastewater resource recovery systems that are both economically attractive and sustainable. With the array of available treatment and recovery options growing steadily, a superstructure modeling approach based on rigorous mathematical optimization appears to be a natural approach for tackling these problems. The development of reliable, yet simple, performance and cost models is a key issue with this approach in order to allow for a reliable solution based on global optimization. We argue that commercial wastewater simulators can be used to derive such models, and we illustrate this approach with a simple resource recovery system. The results show that the proposed methodology is computationally tractable, thereby supporting its application as a decision support system for selection of promising resource recovery systems whose development is worth pursuing.

  12. Location of Bioelectricity Plants in the Madrid Community Based on Triticale Crop: A Multicriteria Methodology

    Directory of Open Access Journals (Sweden)

    L. Romero

    2015-01-01

    Full Text Available This paper presents a work whose objective is, first, to quantify the potential of the triticale biomass existing in each of the agricultural regions in the Madrid Community through a crop simulation model based on regression techniques and multiple correlation. Second, a methodology for defining which area has the best conditions for the installation of electricity plants from biomass has been described and applied. The study used a methodology based on compromise programming in a discrete multicriteria decision method (MDM) context. To make a ranking, the following criteria were taken into account: biomass potential, electric power infrastructure, road networks, protected spaces, and urban nuclei surfaces. The results indicate that, in the case of the Madrid Community, the Campiña region is the most suitable for setting up plants powered by biomass. A minimum of 17,339.9 tons of triticale will be needed to satisfy the requirements of a 2.2 MW power plant. The minimum range of action for obtaining the biomass necessary in the Campiña region would be 6.6 km around the municipality of Algete, based on Geographic Information Systems. The total biomass which could be made available considering this range in this region would be 18,430.68 t.
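
    A minimal sketch of the compromise-programming ranking step, computing Lp distances to the ideal point over weighted, normalised criteria; the region names, scores and weights are invented for illustration and are not the study's data:

        import numpy as np

        # Rows: candidate regions (names illustrative); columns: criteria already
        # oriented so that larger is better (biomass potential, grid access, roads, ...)
        regions = ["Campina", "Region B", "Region C"]
        scores = np.array([[0.9, 0.8, 0.7, 0.6, 0.5],
                           [0.6, 0.9, 0.5, 0.7, 0.6],
                           [0.4, 0.5, 0.9, 0.8, 0.4]])
        weights = np.array([0.35, 0.2, 0.2, 0.15, 0.1])

        ideal, anti = scores.max(axis=0), scores.min(axis=0)
        norm = (ideal - scores) / (ideal - anti)          # 0 = ideal, 1 = worst

        for p in (1, 2):                                  # L1 and L2 compromise metrics
            lp = (weights * norm ** p).sum(axis=1) ** (1 / p)
            best = regions[int(np.argmin(lp))]
            print(f"L{p} ranking winner: {best}")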

  13. Unravelling emotional viewpoints on a bio-based economy using Q methodology.

    Science.gov (United States)

    Sleenhoff, Susanne; Cuppen, Eefje; Osseweijer, Patricia

    2015-10-01

    A transition to a bio-based economy will affect society and requires collective action from a broad range of stakeholders. This includes the public, who are largely unaware of this transition. For meaningful public engagement people's emotional viewpoints play an important role. However, what the public's emotions about the transition are and how they can be taken into account is underexposed in public engagement literature and practice. This article aims to unravel the public's emotional views of the bio-based economy as a starting point for public engagement. Using Q methodology with visual representations of a bio-based economy we found four emotional viewpoints: (1) compassionate environmentalist, (2) principled optimist, (3) hopeful motorist and (4) cynical environmentalist. These provide insight into the distinct and shared ways through which members of the public connect with the transition. Implications for public engagement are discussed.

  14. Pediatric hydrocephalus: systematic literature review and evidence-based guidelines. Part 1: Introduction and methodology.

    Science.gov (United States)

    Flannery, Ann Marie; Mitchell, Laura

    2014-11-01

    This clinical systematic review of and evidence-based guidelines for the treatment of pediatric hydrocephalus were developed by a physician volunteer task force. They are provided as an educational tool based on an assessment of current scientific and clinical information as well as accepted approaches to treatment. They are not intended to be a fixed protocol, because some patients may require more or less treatment. In Part 1, the authors introduce the reader to the complex topic of hydrocephalus and the lack of consensus on its appropriate treatment. The authors describe the development of the Pediatric Hydrocephalus Systematic Review and Evidence-Based Guidelines Task Force charged with reviewing the literature and recommending treatments for hydrocephalus, and they set out the basic methodology used throughout the specific topics covered in later chapters.

  15. Optimization of cocoa nib roasting based on sensory properties and colour using response surface methodology

    Directory of Open Access Journals (Sweden)

    D.M.H. A.H. Farah

    2012-05-01

    Full Text Available Roasting of cocoa beans is a critical stage for the development of their desirable flavour, aroma and colour. Prior to roasting, cocoa beans may taste astringent, bitter, acidy, musty, unclean, nutty or even chocolate-like, depending on the bean sources and their preparation. After roasting, the beans possess a typical intense cocoa flavour. The Maillard or non-enzymatic browning reaction is a very important process for the development of cocoa flavour, which occurs primarily during the roasting process, and it has generally been agreed that the formation of the main flavour components, pyrazines, is associated with this reaction involving amino acids and reducing sugars. The effect of cocoa nib roasting conditions on the sensory properties and colour of cocoa beans was investigated in this study. Roasting conditions in terms of temperature, ranging from 110 to 160°C, and time, ranging from 15 to 40 min, were optimized by using Response Surface Methodology based on cocoa sensory characteristics including chocolate aroma, acidity, astringency, burnt taste and overall acceptability. The analyses used a 9-point hedonic scale with twelve trained panelists. The changes in colour due to the roasting conditions were also monitored using a chromameter. Results of this study showed that the sensory quality of cocoa liquor increased with increasing roasting time and temperature up to 160°C and 40 min, respectively. Based on the Response Surface Methodology, the optimised operating condition for the roaster was a temperature of 127°C and a time of 25 min. The proposed roasting conditions were able to produce superior quality cocoa beans that will be very useful for cocoa manufacturers. Key words: Cocoa, cocoa liquor, flavour, aroma, colour, sensory characteristic, response surface methodology.
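
    A minimal sketch of the response-surface step: fitting a second-order polynomial in roasting temperature and time to panel scores and locating the best point on a grid; the data are synthetic, not the study's measurements:

        import numpy as np

        rng = np.random.default_rng(2)
        T = rng.uniform(110, 160, 40)            # roasting temperature (deg C)
        t = rng.uniform(15, 40, 40)              # roasting time (min)
        # Synthetic overall-acceptability scores peaking near 127 deg C and 25 min
        y = 8 - 0.002 * (T - 127) ** 2 - 0.01 * (t - 25) ** 2 + rng.normal(0, 0.1, 40)

        # Second-order response surface: y ~ b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
        X = np.column_stack([np.ones_like(T), T, t, T ** 2, t ** 2, T * t])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        Tg, tg = np.meshgrid(np.linspace(110, 160, 101), np.linspace(15, 40, 101))
        Xg = np.column_stack([np.ones(Tg.size), Tg.ravel(), tg.ravel(),
                              Tg.ravel() ** 2, tg.ravel() ** 2, Tg.ravel() * tg.ravel()])
        pred = Xg @ beta
        i = np.argmax(pred)
        print(f"predicted optimum: {Tg.ravel()[i]:.0f} deg C, {tg.ravel()[i]:.1f} min")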

  16. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.

  17. Flow methodology for methanol determination in biodiesel exploiting membrane-based extraction

    International Nuclear Information System (INIS)

    A methodology based on flow analysis and membrane-based extraction has been applied to the determination of methanol in biodiesel samples. A hydrophilic membrane was used to perform the liquid-liquid extraction in the system, with the organic sample fed to the donor side of the membrane and the methanol transferred to an aqueous acceptor buffer solution. The quantification of the methanol was then achieved in aqueous solution by the combined use of immobilised alcohol oxidase (AOD), soluble peroxidase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS). The optimization of parameters such as the type of membrane, the groove volume and configuration of the membrane unit, the appropriate organic solvent, the sample injection volume, as well as the immobilised packed AOD reactor, was performed. Two dynamic analytical working ranges were achieved, up to 0.015% and up to 0.200% (m/m) methanol concentrations, just by changing the volume of the acceptor aqueous solution. Detection limits of 0.0002% (m/m) and 0.007% (m/m) methanol were estimated, respectively. The decision limit (CCα) and the detection capacity (CCβ) were 0.206 and 0.211% (m/m), respectively. The developed methodology showed good precision, with a relative standard deviation (R.S.D.) <5.0% (n = 10). Biodiesel samples from different sources were then directly analyzed without any sample pre-treatment. Statistical evaluation showed good agreement, at the 95% confidence level, between the results obtained with the flow system and those furnished by the gas chromatography reference method. The proposed methodology turns out to be more environmentally friendly and cost-effective than the reference method.
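
    A minimal sketch of how detection limits of this kind can be estimated from a linear calibration (LOD ≈ 3.3·s/slope, LOQ ≈ 10·s/slope); the calibration data are synthetic and the approach is a generic illustration, not the authors' procedure:

        import numpy as np

        # Synthetic calibration: methanol mass fraction (% m/m) vs instrument response
        conc = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
        resp = np.array([0.02, 0.26, 0.49, 0.76, 1.01])   # illustrative signals

        slope, intercept = np.polyfit(conc, resp, 1)
        residuals = resp - (slope * conc + intercept)
        s = residuals.std(ddof=2)                          # residual standard deviation

        lod = 3.3 * s / slope                              # detection limit, % (m/m)
        loq = 10.0 * s / slope                             # quantification limit
        print(f"LOD ~ {lod:.4f} % (m/m), LOQ ~ {loq:.4f} % (m/m)")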

  18. Flow methodology for methanol determination in biodiesel exploiting membrane-based extraction

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Andre R.T.S. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal); Saraiva, M. Lucia M.F.S. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal)], E-mail: lsaraiva@ff.up.pt; Lima, Jose L.F.C. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal); Korn, M. Gracas A. [Grupo de Pesquisa em Quimica Analitica, Instituto de Quimica, Universidade Federal da Bahia, Campus de Ondina, 40170-290 Salvador, Bahia (Brazil)

    2008-04-21

    A methodology based on flow analysis and membrane-based extraction has been applied to the determination of methanol in biodiesel samples. A hydrophilic membrane was used to perform the liquid-liquid extraction in the system, with the organic sample fed to the donor side of the membrane and the methanol transferred to an aqueous acceptor buffer solution. The quantification of the methanol was then achieved in aqueous solution by the combined use of immobilised alcohol oxidase (AOD), soluble peroxidase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS). The optimization of parameters such as the type of membrane, the groove volume and configuration of the membrane unit, the appropriate organic solvent, the sample injection volume, as well as the immobilised packed AOD reactor, was performed. Two dynamic analytical working ranges were achieved, up to 0.015% and up to 0.200% (m/m) methanol concentrations, just by changing the volume of the acceptor aqueous solution. Detection limits of 0.0002% (m/m) and 0.007% (m/m) methanol were estimated, respectively. The decision limit (CCα) and the detection capacity (CCβ) were 0.206 and 0.211% (m/m), respectively. The developed methodology showed good precision, with a relative standard deviation (R.S.D.) <5.0% (n = 10). Biodiesel samples from different sources were then directly analyzed without any sample pre-treatment. Statistical evaluation showed good agreement, at the 95% confidence level, between the results obtained with the flow system and those furnished by the gas chromatography reference method. The proposed methodology turns out to be more environmentally friendly and cost-effective than the reference method.

  19. A proposal on teaching methodology: cooperative learning by peer tutoring based on the case method

    Science.gov (United States)

    Pozo, Antonio M.; Durbán, Juan J.; Salas, Carlos; del Mar Lázaro, M.

    2014-07-01

    The European Higher Education Area (EHEA) proposes substantial changes in the teaching-learning model, moving from a model based mainly on the activity of teachers to a model in which the true protagonist is the student. This new framework requires that students develop new abilities and acquire specific skills. This also implies that the teacher should incorporate new methodologies in class. In this work, we present a proposal on teaching methodology based on cooperative learning and peer tutoring by case study. A noteworthy aspect of the case-study method is that it presents situations that can occur in real life. Therefore, students can acquire certain skills that will be useful in their future professional practice. An innovative aspect in the teaching methodology that we propose is to form work groups consisting of students from different levels in the same major. In our case, the teaching of four subjects would be involved: one subject of the 4th year, one subject of the 3rd year, and two subjects of the 2nd year of the Degree in Optics and Optometry of the University of Granada, Spain. Each work group would consist of a professor and a student of the 4th year, a professor and a student of the 3rd year, and two professors and two students of the 2nd year. Each work group would have a tutoring process from each professor for the corresponding student, and a 4th-year student providing peer tutoring for the students of the 2nd and 3rd year.

  20. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Licu, Tony [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: antonio.licu@eurocontrol.int; Cioran, Florin [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: florin.cioran@eurocontrol.int; Hayward, Brent [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: bhayward@dedale.net; Lowe, Andrew [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: alowe@dedale.net

    2007-09-15

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  1. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    Science.gov (United States)

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
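
    A minimal sketch of the moving least squares decoupling idea: a locally weighted linear fit maps measured magnetic-field components to the three force components; the calibration data are synthetic stand-ins, and the bandwidth and function names are illustrative:

        import numpy as np

        def mls_predict(B_query, B_cal, F_cal, h=0.3):
            """Moving least squares: weighted linear fit around each query point.

            B_cal : (n, 3) calibration magnetic-field samples
            F_cal : (n, 3) corresponding force samples (Fx, Fy, Fz)
            """
            d = np.linalg.norm(B_cal - B_query, axis=1)
            w = np.exp(-(d / h) ** 2)                      # Gaussian weights
            A = np.hstack([B_cal, np.ones((len(B_cal), 1))])
            W = np.diag(w)
            coef, *_ = np.linalg.lstsq(W @ A, W @ F_cal, rcond=None)
            return np.append(B_query, 1.0) @ coef

        # Synthetic calibration with a mild nonlinearity and cross-talk
        rng = np.random.default_rng(3)
        B_cal = rng.uniform(-1, 1, (400, 3))
        F_cal = np.column_stack([2 * B_cal[:, 0] + 0.1 * B_cal[:, 1] ** 2,
                                 1.5 * B_cal[:, 1],
                                 3 * B_cal[:, 2] + 0.2 * B_cal[:, 0] * B_cal[:, 2]])
        print(mls_predict(np.array([0.2, -0.1, 0.4]), B_cal, F_cal))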

  2. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  3. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    Directory of Open Access Journals (Sweden)

    Hongbo Wang

    2016-08-01

    Full Text Available Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.

  4. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    Science.gov (United States)

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  5. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  6. Measurement-based auralization methodology for the assessment of noise mitigation measures

    Science.gov (United States)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditive preview of noise abatement measures for road traffic noise, based on the direction dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the followed approaches.

  7. Attitudes toward simulation-based learning in nursing students: an application of Q methodology.

    Science.gov (United States)

    Yeun, Eun Ja; Bang, Ho Yoon; Ryoo, Eon Na; Ha, Eun-Ho

    2014-07-01

    SBL is a highly advanced educational method that promotes technical/non-technical skills, increases team competency, and increases health care team interaction in a safe health care environment with no potential for harm to the patient. Even though students may experience the same simulation, their reactions are not necessarily uniform. This study aims at identifying the diversely perceived attitudes of undergraduate nursing students toward simulation-based learning. The study used a Q-methodology design, which analyzes the subjectivity of each type of attitude. Data were collected from 22 undergraduate nursing students who had experience of simulation-based learning before going to the clinical setting. The 45 selected Q-statements from each of the 22 participants were classified into the shape of a normal distribution using a 9-point scale. The collected data were analyzed using the pc-QUANL program. The results revealed two discrete groups of students regarding simulation-based learning: 'adventurous immersion' and 'constructive criticism'. The findings revealed that teaching and learning strategies based on the two factors of attitudes could beneficially contribute to the customization of simulation-based learning. In nursing education and clinical practice, teaching and learning strategies based on types I and II can be used to refine an alternative learning approach that supports and complements clinical practice. Recommendations have been provided based on the findings. PMID:24629271

  8. Attitudes toward simulation-based learning in nursing students: an application of Q methodology.

    Science.gov (United States)

    Yeun, Eun Ja; Bang, Ho Yoon; Ryoo, Eon Na; Ha, Eun-Ho

    2014-07-01

    SBL is a highly advanced educational method that promotes technical/non-technical skills, increases team competency, and increases health care team interaction in a safe health care environment with no potential for harm to the patient. Even though students may experience the same simulation, their reactions are not necessarily uniform. This study aims at identifying the diversely perceived attitudes of undergraduate nursing students toward simulation-based learning. The study used a Q-methodology design, which analyzes the subjectivity of each type of attitude. Data were collected from 22 undergraduate nursing students who had experience of simulation-based learning before going to the clinical setting. The 45 selected Q-statements from each of the 22 participants were classified into the shape of a normal distribution using a 9-point scale. The collected data were analyzed using the pc-QUANL program. The results revealed two discrete groups of students regarding simulation-based learning: 'adventurous immersion' and 'constructive criticism'. The findings revealed that teaching and learning strategies based on the two factors of attitudes could beneficially contribute to the customization of simulation-based learning. In nursing education and clinical practice, teaching and learning strategies based on types I and II can be used to refine an alternative learning approach that supports and complements clinical practice. Recommendations have been provided based on the findings.

  9. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for neutronics and T-H coupled simulation of pressurized water reactors. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. It is a challenge to validate the depletion capability because of insufficient measured data. One indirect way to validate it is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.

  10. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    Energy Technology Data Exchange (ETDEWEB)

    Becker, N.M. [Los Alamos National Lab., NM (United States); Vanta, E.B. [Wright Laboratory Armament Directorate, Eglin Air Force Base, FL (United States)

    1995-05-01

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980s at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments.

  11. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    International Nuclear Information System (INIS)

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980's at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments

  12. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    Science.gov (United States)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  13. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    Science.gov (United States)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction with the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, environmental impacts etc. This requirement can be met by a decision support framework based on near-optimal "satisficing" as opposed to purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affect the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as

  14. Evaluation methodology based on physical security assessment results: a utility theory approach

    International Nuclear Information System (INIS)

    This report describes an evaluation methodology which aggregates physical security assessment results for nuclear facilities into an overall measure of adequacy. This methodology utilizes utility theory and conforms to a hierarchical structure developed by the NRC. Implementation of the methodology is illustrated by several examples. Recommendations for improvements in the evaluation process are given
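
    A minimal sketch of hierarchical weighted-utility aggregation of assessment scores into a single adequacy measure; the hierarchy, weights and scores are invented for illustration and do not reproduce the NRC structure:

        # Illustrative two-level hierarchy: each branch aggregates its leaves with a
        # weighted sum of utilities in [0, 1], and branches roll up to an overall measure.
        hierarchy = {
            "detection": (0.4, {"sensors": (0.6, 0.9), "assessment": (0.4, 0.8)}),
            "delay":     (0.3, {"barriers": (1.0, 0.7)}),
            "response":  (0.3, {"guard force": (0.5, 0.6), "communications": (0.5, 0.65)}),
        }

        def aggregate(node):
            # Each value is (weight, child); a child is either a sub-dict or a leaf utility
            return sum(w * (aggregate(child) if isinstance(child, dict) else child)
                       for w, child in node.values())

        print(f"overall adequacy measure: {aggregate(hierarchy):.2f}")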

  15. Ni-based Superalloy Development for VHTR - Methodology Using Design of Experiments and Thermodynamic Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Woo; Kim, Dong Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In this work, to develop novel structural materials for the IHX of a VHTR, a more systematic methodology using the design of experiments (DOE) and thermodynamic calculations was proposed. For 32 design sets of Ni-Cr-Co-Mo alloys with minor additions of W and Ta, the mass fraction of TCP phases and the mechanical properties were calculated, and the chemical composition was then optimized for further experimental studies by applying the proposed methodology. Highly efficient generation of electricity and massive production of hydrogen are possible using a very high temperature gas-cooled reactor (VHTR), one of the Generation IV nuclear power plants. The structural material for the intermediate heat exchanger (IHX), among numerous components, should endure temperatures of up to 950 °C during long-term operation. Impurities inevitably introduced into the helium coolant facilitate material degradation by corrosion at high temperature. This work concerns a methodology for developing a Ni-Cr-Co-Mo based superalloy for the VHTR using the design of experiments (DOE) and thermodynamic calculations.

  16. Methodology for definition of yellow fever priority areas, based on environmental variables and multiple correspondence analyses.

    Science.gov (United States)

    Moreno, Eduardo Stramandinoli; Barata, Rita de Cássia Barradas

    2012-01-01

    Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Paulo, Brazil. The municipalities were divided into two groups (affected and unaffected by YF) and compared based on environmental parameters related to the disease's eco-epidemiology. Bivariate analysis was used to identify statistically significant associations between the variables and virus circulation. Multiple correspondence analysis (MCA) was used to evaluate the relationship among the variables and their contribution to the dynamics of YF in Sao Paulo. The MCA generated a factor that was able to differentiate between affected and unaffected municipalities and was used to determine risk levels. This methodology can be replicated in other regions, standardized, and adapted to each context.

  17. Methodology for definition of yellow fever priority areas, based on environmental variables and multiple correspondence analyses.

    Directory of Open Access Journals (Sweden)

    Eduardo Stramandinoli Moreno

    Full Text Available Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Paulo, Brazil. The municipalities were divided into two groups (affected and unaffected by YF) and compared based on environmental parameters related to the disease's eco-epidemiology. Bivariate analysis was used to identify statistically significant associations between the variables and virus circulation. Multiple correspondence analysis (MCA) was used to evaluate the relationship among the variables and their contribution to the dynamics of YF in Sao Paulo. The MCA generated a factor that was able to differentiate between affected and unaffected municipalities and was used to determine risk levels. This methodology can be replicated in other regions, standardized, and adapted to each context.

  18. Assessment of bioenergy potential in Sicily: A GIS-based support methodology

    International Nuclear Information System (INIS)

    A Geographical Information System (GIS) supported methodology has been developed in order to assess the technical and economic potential of biomass exploitation for energy production in Sicily. The methodology was based on the use of agricultural, economic, climatic, and infrastructural data in a GIS. Data about land use, transportation facilities, urban cartography, regional territorial planning, terrain digital model, lithology, climatic types, and civil and industrial users have been stored in the GIS to define potential areas for gathering the residues coming from the pruning of olive groves, vineyards, and other agricultural crops, and to assess biomass available for energy cultivation. Further, it was possible to assess the potential of biodiesel production, supposing the cultivation of rapeseed in arable crop areas. For the biomass used for direct combustion purposes, the economic availability has been assessed assuming a price of the biomass and comparing it with other fuels. This assessment has shown the strong competitiveness of firewood in comparison with traditional fossil fuels when the collection system is implemented in an efficient way. Moreover, the economic potential of biodiesel was assessed considering the on-going financial regime for fuel. At the same time, the study has shown a significant competitiveness of the finished biomass (pellets), and good potential for a long-term development of this market. An important result was the determination of biofuel production potential in Sicily. An outcome of the study was to show the opportunities stemming from the harmonisation of Energy Policy with the Waste Management System and Rural Development Plan. (author)

  19. Rapid and Robust PCR-Based All-Recombinant Cloning Methodology.

    Science.gov (United States)

    Dubey, Abhishek Anil; Singh, Manika Indrajit; Jain, Vikas

    2016-01-01

    We report here a PCR-based cloning methodology that requires no post-PCR modifications such as restriction digestion and phosphorylation of the amplified DNA. The advantage of the present method is that it yields only recombinant clones, thus eliminating the need for screening. Two DNA amplification reactions by PCR are performed wherein the first reaction amplifies the gene of interest from a source template, and the second reaction fuses it with the designed expression vector fragments. These vector fragments carry the essential elements that are required for the fusion product selection. The entire process can be completed in less than 8 hours. Furthermore, ligation of the amplified DNA by a DNA ligase is not required before transformation, although the procedure yields a greater number of colonies upon transformation if ligation is carried out. As a proof-of-concept, we show the cloning and expression of GFP, adh, and rho genes. Using GFP production as an example, we further demonstrate that the E. coli T7 Express strain can be used directly in our methodology for protein expression immediately after PCR. The expressed protein carries a 6xHistidine tag at either terminus, or no tag at all, depending upon the chosen vector fragments. We believe that our method will find tremendous use in molecular and structural biology. PMID:27007922

  20. A GIS-Based Methodology for Land Suitability Evaluation in Veneto (NE Italy)

    Directory of Open Access Journals (Sweden)

    Alba Gallo

    2014-12-01

    Full Text Available For almost ten years, the Soil Science Research Group in Venice has been carrying out studies on the characterization of soils in the Veneto region and their suitability for specific uses. Several areas have been investigated with the aim of selecting the best land use for a sustainable environment. The scenarios taken into consideration range from the Alpine and pre-Alpine region to the alluvial plain. Attention has focused especially on land suitability for forestry, typical and niche crops, pasture, and vineyards. The land evaluation procedure has been applied through a GIS-based methodology. Today, GIS techniques are essential for correct and fast work in interpreting and processing soil data and displaying it in map form. By integrating this information with crop and soil requirements by means of "matching tables", it was possible to edit and manage land suitability maps for specific purposes, as illustrated in the sketch below. The applied methodology proved to be a useful and effective tool for sustainable land management.
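
    The "matching table" step can be pictured as a small table-driven rating. The soil attributes, thresholds, and suitability classes below are invented for illustration and do not reproduce the Veneto study's actual criteria.

```python
# Hypothetical matching-table evaluation: rate each mapping unit for vineyard suitability.
import pandas as pd

soils = pd.DataFrame({
    "unit":      ["A1", "A2", "B1"],
    "slope_pct": [5, 25, 12],
    "depth_cm":  [120, 40, 80],
    "drainage":  ["good", "poor", "moderate"],
})

def vineyard_class(row):
    """Return a FAO-style suitability class from invented threshold rules."""
    if row.slope_pct <= 15 and row.depth_cm >= 100 and row.drainage == "good":
        return "S1 (highly suitable)"
    if row.slope_pct <= 20 and row.depth_cm >= 60 and row.drainage != "poor":
        return "S2 (moderately suitable)"
    return "N (not suitable)"

soils["vineyard_suitability"] = soils.apply(vineyard_class, axis=1)
print(soils)
```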

  1. Ni-based Superalloy Development for VHTR - Methodology Using Design of Experiments and Thermodynamic Calculation

    International Nuclear Information System (INIS)

    In this work, a more systematic methodology using design of experiments (DOE) and thermodynamic calculations was proposed to develop novel structural materials for the IHX of a VHTR. For 32 sets of Ni-Cr-Co-Mo alloy designs with the minor elements W and Ta, the mass fraction of TCP phases and the mechanical properties were calculated, and the chemical composition was finally optimized for further experimental studies by applying the proposed methodology. Highly efficient electricity generation and massive hydrogen production are possible using a very high temperature gas-cooled reactor (VHTR) among the Generation IV nuclear power plants. The structural material for the intermediate heat exchanger (IHX), among numerous components, must endure temperatures of up to 950 °C during long-term operation. Impurities inevitably introduced into the helium coolant facilitate material degradation by corrosion at high temperature. This work concerns a methodology for developing a Ni-Cr-Co-Mo based superalloy for the VHTR using design of experiments (DOE) and thermodynamic calculations.
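
    The 32 alloy formulations can be read as a two-level design over the varied alloying elements. The sketch below simply enumerates a 2^5 factorial over assumed low/high contents of Cr, Co, Mo, W, and Ta (the composition bounds are hypothetical, not taken from the study); the resulting candidate set is what would then be screened by the TCP-phase and property calculations.

```python
# Generate 32 candidate compositions as a 2^5 two-level design (assumed wt.% bounds).
from itertools import product
import pandas as pd

levels = {            # hypothetical low/high contents in wt.%
    "Cr": (18.0, 24.0),
    "Co": (8.0, 14.0),
    "Mo": (6.0, 10.0),
    "W":  (0.0, 2.0),
    "Ta": (0.0, 1.0),
}

rows = []
for combo in product(*levels.values()):      # 2**5 = 32 combinations
    comp = dict(zip(levels.keys(), combo))
    comp["Ni"] = 100.0 - sum(comp.values())  # balance is nickel
    rows.append(comp)

designs = pd.DataFrame(rows)
print(len(designs), "candidate alloys")
print(designs.head())
```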

  2. THE METHODOLOGY OF STUDENTS’ SYNERGETIC WORLD OUTLOOK DEVELOPMENT BASED ON THE TRANS-DISCIPLINARY APPROACH

    Directory of Open Access Journals (Sweden)

    Y. A. Solodova

    2015-03-01

    Full Text Available The paper discusses the present stage of the world educational system development influenced by the fast increasing flow of information and knowledge. The situation requires the adequate pedagogical technologies for compressing the learning information; one of them is the trans-disciplinary technology based on the synergetic methodology identifying the order parameters and general conformities of organizing the academic content. The trans-disciplinary technologies incorporate the general laws of evolution, Bohr’s principle of complementarity, fundamental concepts of nonlinearity, fractality, actual and potential infinity, etc. As an illustration of the trans-disciplinary approach, the author analyzes the fundamental methodology principles of Aristotle and Newton’s mechanics. The author points out the equal importance of understanding the asymptotic adequacy principle by students of the natural sciences and humanities profiles; implementation of the trans-disciplinary approach being regarded as a way for the fundamental knowledge acquisition and the world outlook development. The research findings are addressed to the higher school academic staff for theoretical and practical applications.

  3. THE METHODOLOGY OF STUDENTS’ SYNERGETIC WORLD OUTLOOK DEVELOPMENT BASED ON THE TRANS-DISCIPLINARY APPROACH

    Directory of Open Access Journals (Sweden)

    Y. A. Solodova

    2014-01-01

    Full Text Available The paper discusses the present stage of the world educational system development influenced by the fast increasing flow of information and knowledge. The situation requires the adequate pedagogical technologies for compressing the learning information; one of them is the trans-disciplinary technology based on the synergetic methodology identifying the order parameters and general conformities of organizing the academic content. The trans-disciplinary technologies incorporate the general laws of evolution, Bohr’s principle of complementarity, fundamental concepts of nonlinearity, fractality, actual and potential infinity, etc. As an illustration of the trans-disciplinary approach, the author analyzes the fundamental methodology principles of Aristotle and Newton’s mechanics. The author points out the equal importance of understanding the asymptotic adequacy principle by students of the natural sciences and humanities profiles; implementation of the trans-disciplinary approach being regarded as a way for the fundamental knowledge acquisition and the world outlook development. The research findings are addressed to the higher school academic staff for theoretical and practical applications.

  4. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject-based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries was extracted using the Collection Evaluation application. The data was aggregated and filtered to assess how the sample’s titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample’s titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that distribution of the aggregated collection conformed to a Power Law distribution (80/20), so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories that were found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help to improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help to link subjective decision making with a scientifically based approach to managing knowledge.
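
    The reported 80/20 concentration can be checked with a few lines of code. The sketch below uses a synthetic, heavy-tailed distribution of title counts per subject category (a stand-in for the WorldCat data, which is not reproduced here) and computes how many categories are needed to cover 80% of the titles.

```python
# Check the reported 80/20 concentration of titles across subject categories.
import numpy as np
import pandas as pd

# Hypothetical title counts per Conspectus-style subject category.
rng = np.random.default_rng(0)
counts = pd.Series(rng.zipf(a=1.6, size=500), name="titles")

shares = counts.sort_values(ascending=False).cumsum() / counts.sum()
n_for_80 = int((shares < 0.80).sum()) + 1
print(f"{n_for_80} of {len(counts)} categories "
      f"({100 * n_for_80 / len(counts):.1f}%) hold 80% of the titles")
```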

  5. Stratospheric ozone depletion.

    Science.gov (United States)

    Rowland, F Sherwood

    2006-05-29

    Solar ultraviolet radiation creates an ozone layer in the atmosphere which in turn completely absorbs the most energetic fraction of this radiation. This process both warms the air, creating the stratosphere between 15 and 50 km altitude, and protects the biological activities at the Earth's surface from this damaging radiation. In the last half-century, the chemical mechanisms operating within the ozone layer have been shown to include very efficient catalytic chain reactions involving the chemical species HO, HO2, NO, NO2, Cl and ClO. The NOX and ClOX chains involve the emission at Earth's surface of stable molecules in very low concentration (N2O, CCl2F2, CCl3F, etc.) which wander in the atmosphere for as long as a century before absorbing ultraviolet radiation and decomposing to create NO and Cl in the middle of the stratospheric ozone layer. The growing emissions of synthetic chlorofluorocarbon molecules cause a significant diminution in the ozone content of the stratosphere, with the result that more solar ultraviolet-B radiation (290-320 nm wavelength) reaches the surface. This ozone loss occurs in the temperate zone latitudes in all seasons, and especially drastically since the early 1980s in the south polar springtime-the 'Antarctic ozone hole'. The chemical reactions causing this ozone depletion are primarily based on atomic Cl and ClO, the product of its reaction with ozone. The further manufacture of chlorofluorocarbons has been banned by the 1992 revisions of the 1987 Montreal Protocol of the United Nations. Atmospheric measurements have confirmed that the Protocol has been very successful in reducing further emissions of these molecules. Recovery of the stratosphere to the ozone conditions of the 1950s will occur slowly over the rest of the twenty-first century because of the long lifetime of the precursor molecules. PMID:16627294

  6. Shear-affected depletion interaction

    NARCIS (Netherlands)

    July, C.; Kleshchanok, D.; Lang, P.R.

    2012-01-01

    We investigate the influence of flow fields on the strength of the depletion interaction caused by disc-shaped depletants. At low mass concentration of discs, it is possible to continuously decrease the depth of the depletion potential by increasing the applied shear rate until the depletion force i

  7. Satellite based radar interferometry to estimate large-scale soil water depletion from clay shrinkage: possibilities and limitations

    NARCIS (Netherlands)

    Brake, te B.; Hanssen, R.F.; Ploeg, van der M.J.; Rooij, de G.H.

    2013-01-01

    Satellite-based radar interferometry is a technique capable of measuring small surface elevation changes at large scales and with a high resolution. In vadose zone hydrology, it has been recognized for a long time that surface elevation changes due to swell and shrinkage of clayey soils can serve as

  8. Design and Development of a Maintenance Knowledge-Base System Based on Common KADS Methodology

    OpenAIRE

    Arab Maki, Alireza; Shariat Zadeh, Navid

    2010-01-01

    The objective of this thesis is to design and develop a knowledge base model to support the maintenance system structure. The aim of this model is to identify the failure modes, which are the heart of a maintenance system, through functional analysis, then to serve as a decision support system to define the maintenance tasks, and finally to implement a preventive maintenance task. This knowledge base management system is suitable for designing and developing a maintenance system since it encompasses al...

  9. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan;

    2010-01-01

    In this work, a framework for the simultaneous solution of design and control problems is presented. Within this framework, two methodologies are presented, the integrated process design and controller design (IPDC) methodology and the process-group contribution (PGC) methodology. The concepts of attainable region (AR), driving force (DF), process-group (PG) and reverse simulation are used within these methodologies. The IPDC methodology is used to find the optimal design-control strategy of a process by locating the maximum point in the AR and DF diagrams for reactor and separator, respectively. The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production process.

  10. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    Full Text Available In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IPs/SoCs. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies that enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC level verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution that compiles on all tools.

  11. A Methodology for Assessing Skill-Based Educational Outcomes in a Pharmacy Course.

    Science.gov (United States)

    Alston, Gregory L; Griffiths, Carrie L

    2015-09-25

    Objective. To develop a methodology for assessing skill development in a course while providing objective evidence of success and actionable data to improve instructional effectiveness. Design. Course objectives were recast as skills to be demonstrated. Confidence in these skills was surveyed before and after the course. Student skills were demonstrated using 4 work products and a multiple-choice examination. Assessment. The change from precourse survey to postcourse survey was analyzed with a paired t test. Quality of the student work product was assessed using scoring guides. All students demonstrated skill mastery by scoring 70% or above on the work product, and 87/88 demonstrated individual progress on the surveyed skills during the 15-week course. Conclusion. This assessment strategy is based on sound design principles and provides robust multi-modal evidence of student achievement in skill development, which is not currently available using traditional student course evaluation surveys. PMID:27168618
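
    The pre/post comparison described above is a standard paired t test; a minimal sketch with made-up Likert-scale confidence scores is shown below.

```python
# Paired t test on hypothetical pre/post course confidence scores (1-5 Likert scale).
import numpy as np
from scipy import stats

pre  = np.array([2, 3, 2, 4, 3, 2, 3, 2, 3, 4])
post = np.array([4, 4, 3, 5, 4, 3, 4, 4, 4, 5])

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean gain = {np.mean(post - pre):.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```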

  12. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system, and fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
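
    The record's dualistic-contrast formulation is not reproduced here, but the basic idea of apportioning a system failure-rate target over the subsystems identified by the fault tree can be sketched as a simple proportional allocation for a series (top OR-gate) structure; the target rate and weights below are assumed values.

```python
# Generic failure-rate allocation for a series (top OR-gate) structure; numbers are assumed.
import numpy as np

lambda_target = 1.0e-5                 # required system failure rate [1/h], assumed
weights = np.array([0.5, 0.3, 0.2])    # relative allocation weights for three subsystems

lambda_alloc = lambda_target * weights / weights.sum()
for i, lam in enumerate(lambda_alloc, start=1):
    print(f"subsystem {i}: allocated failure rate = {lam:.2e} 1/h")
print("system rate (series sum):", lambda_alloc.sum())
```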

  13. Failure detection and isolation methodology based on the sequential analysis and extended Kalman filter technique

    International Nuclear Information System (INIS)

    Nuclear power plant operation relies on an accurate and precise response of the monitoring system in order to assure a safe operational standard during the most predictable operational transients. The signals from the sensors are in general contaminated with noise and random fluctuations, making a precise plant assessment uncertain and raising the possibility of erroneous operator decisions or even false alarm actuation. In practice, the noisy environment can even mask a sensor malfunction, so that the plant operational status is misread. In the present work, a new failure detection and isolation (FDI) algorithm has been developed based on sequential analysis and extended Kalman filter residual monitoring. The methodology has been applied both to highly redundant monitoring systems and to non-redundant systems where high signal reliability is required. (C.M.)
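
    As a rough illustration of residual-based monitoring combined with a sequential test, the sketch below runs a scalar (linear) Kalman filter on a synthetic sensor signal and feeds the normalised innovations into a one-sided CUSUM. It stands in for, and is much simpler than, the extended Kalman filter and sequential analysis scheme of the record; the noise levels, bias, and thresholds are assumptions.

```python
# Sketch: scalar Kalman filter residual monitoring with a CUSUM-style sequential test.
import numpy as np

rng = np.random.default_rng(1)
n = 200
truth = np.ones(n) * 10.0
meas = truth + rng.normal(0, 0.5, n)
meas[120:] += 2.0                      # injected sensor bias (the "failure")

# Scalar Kalman filter for a constant state.
x, P, Q, R = 10.0, 1.0, 1e-4, 0.25
cusum, drift, threshold = 0.0, 0.25, 8.0

for k, z in enumerate(meas):
    P += Q                             # predict
    S = P + R                          # innovation variance
    innov = z - x                      # residual (innovation)
    K = P / S                          # Kalman gain
    x += K * innov                     # update state
    P *= (1 - K)                       # update covariance

    # One-sided CUSUM on the normalised residual.
    cusum = max(0.0, cusum + innov / np.sqrt(S) - drift)
    if cusum > threshold:
        print(f"failure declared at sample {k}")
        break
```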

  14. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems starting from the requirement capture phase until the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  15. Residual stresses in coating-based systems, part Ⅱ: Optimal designing methodologies

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiancheng; WU Yixiong; XU Binshi; WANG Haidou

    2007-01-01

    In this part of the work, different cases are studied to illustrate the implementation of the analytical models developed in Part Ⅰ [Front. Mech. Eng. China, 2007, 2(1): 1-12]. Different topics are involved in the optimal design of coating-based systems. Some essential relations among the material properties and dimensions of the coating and substrate and the residual stress variations are highlighted. For multilayered coating-based systems, some optimal design methodologies are proposed, such as decreasing the stress discontinuity at the interface between adjacent layers, treating the curvature as a function of the coating thickness, exploiting the effect of an interlayer on the residual stress redistribution, and so on. For single-layered coating-based systems, some typical approximations that are often used to predict the residual stresses in the coating-based system or bilayer structure are corrected. A simplified model for predicting the quenching stress in a coating is also developed.

  16. An overview of case-based and problem-based learning methodologies for dental education.

    Science.gov (United States)

    Nadershahi, Nader A; Bender, Daniel J; Beck, Lynn; Lyon, Cindy; Blaseio, Alexander

    2013-10-01

    Dental education has undergone significant curriculum reform in response to the 1995 Institute of Medicine report Dental Education at the Crossroads and the series of white papers from the American Dental Education Association Commission on Change and Innovation in Dental Education (ADEA CCI) first published in the Journal of Dental Education and subsequently collected in a volume titled Beyond the Crossroads: Change and Innovation in Dental Education. An important element of this reform has been the introduction into academic dentistry of active learning strategies such as problem-based and case-based learning. As an aid to broadening understanding of these approaches in order to support their expansion in dental education, this article reviews the major characteristics of each approach, situates each in adult learning theory, and discusses the advantages of case-based learning in the development of a multidisciplinary, integrated predoctoral dental curriculum. PMID:24098033

  17. A Methodology for Developing Web-based CAD/CAM Systems: Case Studies on Gear Shaper Cutters

    OpenAIRE

    Malahova, Anna

    2014-01-01

    The research establishes a methodology for developing Web-based CAD/CAM software systems to industrial quality standards in a time and cost effective manner. The methodology defines the scope of applicability, outlines major considerations and key principles to follow when developing this kind of software, describes an approach to requirements elicitation, resource allocation and collaboration, establishes strategies for overcoming uncertainty and describes the design concerns fo...

  18. Standardization of formulations for the acute amino acid depletion and loading tests.

    Science.gov (United States)

    Badawy, Abdulla A-B; Dougherty, Donald M

    2015-04-01

    The acute tryptophan depletion and loading and the acute tyrosine plus phenylalanine depletion tests are powerful tools for studying the roles of cerebral monoamines in behaviour and symptoms related to various disorders. The tests use either amino acid mixtures or proteins. Current amino acid mixtures lack specificity in humans, but not in rodents, because of the faster disposal of branched-chain amino acids (BCAAs) by the latter. The high content of BCAA (30-60%) is responsible for the poor specificity in humans and we recommend, in a 50g dose, a control formulation with a lowered BCAA content (18%) as a common control for the above tests. With protein-based formulations, α-lactalbumin is specific for acute tryptophan loading, whereas gelatine is only partially effective for acute tryptophan depletion. We recommend the use of the whey protein fraction glycomacropeptide as an alternative protein. Its BCAA content is ideal for specificity and the absence of tryptophan, tyrosine and phenylalanine render it suitable as a template for seven formulations (separate and combined depletion or loading and a truly balanced control). We invite the research community to participate in standardization of the depletion and loading methodologies by using our recommended amino acid formulation and developing those based on glycomacropeptide.

  19. A GIS-based methodology for safe site selection of a building in a hilly region

    Directory of Open Access Journals (Sweden)

    Satish Kumar

    2016-03-01

    Full Text Available The importance of worker safety during construction is widely accepted, but the selection of safe sites for a building is generally not considered. Safe site selection (SSS) largely depends upon compiling, analyzing, and refining the information of an area where a building is likely to be located. The locational and topographical aspects of an area located in hilly regions play a major role in SSS, but are generally neglected in traditional and CAD-based systems used for site selection. Architects and engineers select a site based on their judgment, knowledge, and experience, but issues related to site safety are generally ignored. This study reviewed the existing literature on site selection techniques, building codes, and approaches of existing standards to identify various aspects crucial for SSS in hilly regions. A questionnaire survey was conducted to identify various aspects that construction professionals consider critical for SSS. This study explored the application of geographic information systems (GIS) in modeling the locational and topographical aspects to identify areas of suitability. A GIS-based methodology for locating a safe site that satisfies various spatial safety aspects was developed.

  20. Defatted flaxseed meal incorporated corn-rice flour blend based extruded product by response surface methodology.

    Science.gov (United States)

    Ganorkar, Pravin M; Patel, Jhanvi M; Shah, Vrushti; Rangrej, Vihang V

    2016-04-01

    Considering the evidence for the human health benefits of flaxseed and its defatted flaxseed meal (DFM), response surface methodology (RSM) based on a three-level, four-factor central composite rotatable design (CCRD) was employed for the development of a DFM incorporated corn-rice flour blend based extruded snack. The effects of DFM fortification (7.5-20 %), moisture content of feed (14-20 %, wb), extruder barrel temperature (115-135 °C) and screw speed (300-330 RPM) on expansion ratio (ER), breaking strength (BS), overall acceptability (OAA) score and water solubility index (WSI) of extrudates were investigated using the central composite rotatable design (CCRD). Significant regression models explained the effect of the considered variables on all responses. DFM incorporation level was found to be the most significant independent variable affecting extrudate characteristics, followed by extruder barrel temperature and then screw rpm. Feed moisture content did not affect extrudate characteristics. As DFM level increased (7.5 % to 20 %), ER and OAA values decreased. However, BS and WSI values were found to increase with increasing DFM level. Based on the defined criteria for numerical optimization, the combination for the production of a DFM incorporated extruded snack with desired sensory attributes was achieved by incorporating 10 % DFM (replacing rice flour in the flour blend) and by keeping 20 % moisture content, 312 screw rpm and 125 °C barrel temperature.
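
    The CCRD itself is easy to picture in coded units: 16 factorial points, 8 axial points at α = 2 (the rotatability condition for four factors), and a block of centre replicates. The sketch below generates such a design; the factor names mirror the record, while the number of centre points is an assumption.

```python
# Central composite rotatable design (4 factors): factorial, axial and centre points in coded units.
from itertools import product
import pandas as pd

factors = ["DFM_%", "moisture_%", "barrel_T_C", "screw_rpm"]
alpha = (2 ** 4) ** 0.25                      # rotatability: alpha = (n_factorial)**(1/4) = 2

factorial = list(product([-1, 1], repeat=4))  # 16 factorial points
axial = []
for i in range(4):                            # 8 axial (star) points
    for a in (-alpha, alpha):
        pt = [0.0] * 4
        pt[i] = a
        axial.append(tuple(pt))
center = [(0.0,) * 4] * 6                     # e.g. 6 centre replicates (assumed)

design = pd.DataFrame(factorial + axial + center, columns=factors)
print(design.shape)                           # (30, 4) coded runs
print(design.head())
```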

  1. Competency-based curriculum and active methodology: perceptions of nursing students.

    Science.gov (United States)

    Paranhos, Vania Daniele; Mendes, Maria Manuela Rino

    2010-01-01

    This study identifies the perceptions of undergraduate students at the University of São Paulo at Ribeirão Preto, Brazil, College of Nursing (EERP-USP) concerning the teaching-learning process in two courses: Integrated Seminar: Health-Disease/Care Process in Health Services Policies and Organization, which was offered to first-year students in 2005 and 2006, and Integrality in Health Care I and II, which was offered to second-year students in 2006. The courses' proposal was to adopt an active methodology and a competency-based curriculum. Data were collected from written tests submitted to 62 students at the end of the course, focusing on the tests' pertinence, development of performance, structure and pedagogical dynamics, organization and settings. Thematic analysis indicated that students enjoyed the courses, highlighted the role of the professor/facilitator at points of the pedagogical cycle, and valued the learning recorded in students' portfolios. Students valued their experience in the Primary Health Care setting, which has been based, since the beginning of the program, on the interlocution between theory and professional practice and on closeness to the principles of the Unified Health System (SUS). PMID:20428705

  2. Methodology of Ni-base Superalloy Development for VHTR using Design of Experiments and Thermodynamic Calculation

    International Nuclear Information System (INIS)

    This work concerns a methodology of Ni-base superalloy development for a very high temperature gas-cooled reactor (VHTR) using design of experiments (DOE) and thermodynamic calculations. A total of 32 sets of Ni-base superalloys with various chemical compositions were formulated based on a fractional factorial design of DOE, and the thermodynamic stability of topologically close-packed (TCP) phases of those alloys was calculated by using the THERMO-CALC software. From the statistical evaluation of the effect of the chemical composition on the formation of TCP phases up to a temperature of 950 °C, which should be suppressed for prolonged service life when used as a structural component of the VHTR, 16 sets were selected for further calculation of the mechanical properties. Considering the yield and ultimate tensile strengths of the selected alloys estimated by using the JMATPRO software, the optimized chemical composition of the alloys for VHTR application, especially the intermediate heat exchanger, was proposed for a succeeding experimental study.

  3. Grouting design based on characterization of the fractured rock. Presentation and demonstration of a methodology

    International Nuclear Information System (INIS)

    The design methodology presented in this document is based on an approach that considers the individual fractures. The observations and analyses made during production enable the design to adapt to the encountered conditions. The document is based on previously published material and overview flow charts are used to show the different steps. Parts of or the full methodology has been applied for a number of tunneling experiments and projects. SKB projects in the Aespoe tunnel include a pillar experiment and pre-grouting of a 70 meter long tunnel (TASQ). Further, for Hallandsas railway tunnel (Skaane south Sweden), a field pre-grouting experiment and design and post-grouting of a section of 133 meters have been made. For the Nygard railway tunnel (north of Goeteborg, Sweden), design and grouting of a section of 86 meters (pre-grouting) and 60 meters (post-grouting) have been performed. Finally, grouting work at the Tornskog tunnel (Stockholm, Sweden) included design and grouting along a 100 meter long section of one of the two tunnel tubes. Of importance to consider when doing a design and evaluating the result are: - The identification of the extent of the grouting needed based on inflow requirements and estimates of tunnel inflow before grouting. - The selection of grout and performance of grouting materials including penetration ability and length. The penetration length is important for the fan geometry design. - The ungrouted compared to the grouted and excavated rock mass conditions: estimates of tunnel inflow and (if available) measured inflows after grouting and excavation. Identify if possible explanations for deviations. For the Hallandsas, Nygard and Tornskog tunnel sections, the use of a Pareto distribution and the estimate of tunnel inflow identified a need for sealing small aperture fractures (< 50 - 100 μm) to meet the inflow requirements. The tunneling projects show that using the hydraulic aperture as a basis for selection of grout is a good

  4. A methodology for capability-based technology evaluation for systems-of-systems

    Science.gov (United States)

    Biltgen, Patrick Thomas

    2007-12-01

    Post-Cold War military conflicts have highlighted the need for a flexible, agile joint force responsive to emerging crises around the globe. The 2005 Joint Capabilities Integration and Development System (JCIDS) acquisition policy document mandates a shift away from stove-piped threat-based acquisition to a capability-based model focused on the multiple ways and means of achieving an effect. This shift requires a greater emphasis on scenarios, tactics, and operational concepts during the conceptual phase of design and structured processes for technology evaluation to support this transition are lacking. In this work, a methodology for quantitative technology evaluation for systems-of-systems is defined. Physics-based models of an aircraft system are exercised within a hierarchical, object-oriented constructive simulation to quantify technology potential in the context of a relevant scenario. A major technical challenge to this approach is the lack of resources to support real-time human-in-the-loop tactical decision making and technology analysis. An approach that uses intelligent agents to create a "Meta-General" capable of forecasting strategic and tactical decisions based on technology inputs is used. To demonstrate the synergy between new technologies and tactics, surrogate models are utilized to provide intelligence to individual agents within the framework and develop a set of tactics that appropriately exploit new technologies. To address the long run-times associated with constructive military simulations, neural network surrogate models are implemented around the forecasting environment to enable rapid trade studies. Probabilistic techniques are used to quantify uncertainty and richly populate the design space with technology-infused alternatives. Since a large amount of data is produced in the analysis of systems-of-systems, dynamic, interactive visualization techniques are used to enable "what-if" games on assumptions, systems, technologies, tactics, and

  5. Research on Part Experssion Methodology of Part Library Based on ISO13584

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    ISO13584 (PLIB) is an international standard, which was established to realize computer identification of data expression and data exchange for part libraries. In this international standard, the part expression methodology of the part library is an important characteristic, which distinguishes itself from STEP. So, the methodology is a focus in the research of part libraries. This article describes the principles of part information expression of a part library based on ISO13584, and the research results of the methodology in detail.

  6. A Probabilistic Transmission Expansion Planning Methodology based on Roulette Wheel Selection and Social Welfare

    OpenAIRE

    Gupta, Neeraj; Shekhar, Rajiv; Kalra, Prem Kumar

    2012-01-01

    A new probabilistic methodology for transmission expansion planning (TEP) that does not require a priori specification of new/additional transmission capacities and uses the concept of social welfare has been proposed. Two new concepts have been introduced in this paper: (i) roulette wheel methodology has been used to calculate the capacity of new transmission lines and (ii) load flow analysis has been used to calculate expected demand not served (EDNS). The overall methodology has been imple...
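
    Roulette wheel (fitness-proportionate) selection, the first of the two concepts introduced above, can be sketched in a few lines; the candidate line capacities and their welfare scores below are hypothetical.

```python
# Roulette wheel (fitness-proportionate) selection of a candidate line capacity.
import numpy as np

rng = np.random.default_rng(42)
capacities = np.array([100, 200, 300, 400])   # candidate MW additions (hypothetical)
fitness = np.array([0.9, 1.8, 2.5, 1.2])      # e.g. social-welfare scores (hypothetical)

probs = fitness / fitness.sum()
cum = np.cumsum(probs)
pick = capacities[np.searchsorted(cum, rng.random())]
print("selected capacity:", pick, "MW")
```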

  7. Miedema model based methodology to predict amorphous-forming-composition range in binary and ternary systems

    Energy Technology Data Exchange (ETDEWEB)

    Das, N., E-mail: nirupamd@barc.gov.in [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Mittra, J. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Murty, B.S. [Department of Metallurgical and Materials Engineering, IIT Madras, Chennai 600 036 (India); Pabi, S.K. [Department of Metallurgical and Materials Engineering, IIT Kharagpur, Kharagpur 721 302 (India); Kulkarni, U.D.; Dey, G.K. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India)

    2013-02-15

    Highlights: • A methodology was proposed to predict amorphous forming compositions (AFCs). • Chemical contribution to the enthalpy of mixing ∝ enthalpy of the amorphous phase for AFCs. • Accuracy in the prediction of the AFC range was noticed in the Al-Ni-Ti system. • Mechanical alloying (MA) results of Al-Ni-Ti followed the predicted AFC range. • Earlier MA results of Al-Ni-Ti also conformed to the predicted AFC range. - Abstract: From the earlier works on the prediction of the amorphous forming composition range (AFCR) using the Miedema based model and also on mechanical alloying experiments, it has been observed that all amorphous forming compositions of a given alloy system fall within a linear band when the chemical contribution to the enthalpy of the solid solution (ΔH^ss) is plotted against the enthalpy of mixing in the amorphous phase (ΔH^amor). On the basis of this observation, a methodology has been proposed in this article to identify the AFCR of a ternary system that is likely to be more precise than what can be obtained using the ΔH^amor − ΔH^ss < 0 criterion. MA experiments on various compositions of the Al-Ni-Ti system, producing amorphous, crystalline, and mixtures of amorphous plus crystalline phases, have been carried out and the phases have been characterized using X-ray diffraction and transmission electron microscopy techniques. Data from the present MA experiments and also from the literature have been used to validate the proposed approach. Also, the proximity of compositions producing a mixture of amorphous and crystalline phases to the boundary of the AFCR in the Al-Ni-Ti ternary has been found useful to validate the effectiveness of the prediction.
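
    The classical screening criterion mentioned in the abstract (ΔH^amor − ΔH^ss < 0) is simple to apply once the two enthalpy terms are available. The sketch below applies it to placeholder values; computing the enthalpies themselves from the Miedema model, and the refined linear-band criterion proposed in the record, are not reproduced here.

```python
# Screen candidate compositions with the dH_amor - dH_ss < 0 criterion.
# Enthalpy values (kJ/mol) are placeholders, not Miedema-model outputs.
compositions = {
    "Al50Ni25Ti25": {"dH_ss": -38.0, "dH_amor": -45.0},
    "Al80Ni10Ti10": {"dH_ss": -20.0, "dH_amor": -18.0},
    "Al25Ni50Ti25": {"dH_ss": -42.0, "dH_amor": -51.0},
}

for name, h in compositions.items():
    diff = h["dH_amor"] - h["dH_ss"]
    label = "amorphous-forming" if diff < 0.0 else "crystalline"
    print(f"{name}: dH_amor - dH_ss = {diff:+.1f} kJ/mol -> {label}")
```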

  8. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    Science.gov (United States)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically rely on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on notions from fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities, resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour with the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.

  9. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  10. Artificial Neural Network and Response Surface Methodology Modeling in Ionic Conductivity Predictions of Phthaloylchitosan-Based Gel Polymer Electrolyte

    Directory of Open Access Journals (Sweden)

    Ahmad Danial Azzahari

    2016-01-01

    Full Text Available A gel polymer electrolyte system based on phthaloylchitosan was prepared. The effects of process variables, such as lithium iodide, caesium iodide, and 1-butyl-3-methylimidazolium iodide, were investigated using a distance-based ternary mixture experimental design. A comparative approach was made between response surface methodology (RSM) and artificial neural network (ANN) to predict the ionic conductivity. The predictive capabilities of the two methodologies were compared in terms of the coefficient of determination R2 based on the validation data set. It was shown that the developed ANN model had a better predictive outcome compared to the RSM model.
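
    The R2-on-validation-data comparison can be sketched as follows, using synthetic ternary mixture data in place of the measured conductivities; the quadratic fit stands in for the RSM model and a small multilayer perceptron for the ANN. All model settings are assumptions for illustration.

```python
# Compare a quadratic (RSM-style) fit and an ANN by validation R^2 on synthetic mixture data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.dirichlet(alpha=[1, 1, 1], size=200)   # ternary mixture fractions (LiI, CsI, BmImI)
y = 3*X[:, 0] + 5*X[:, 1] + 2*X[:, 2] + 4*X[:, 0]*X[:, 1] + rng.normal(0, 0.05, 200)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=1)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=1).fit(X_tr, y_tr)

print("RSM R^2:", r2_score(y_val, rsm.predict(X_val)))
print("ANN R^2:", r2_score(y_val, ann.predict(X_val)))
```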

  11. Testing fully depleted CCD

    Science.gov (United States)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted, and back-illuminated with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs exactly in the same position, so our guess is that the pattern is due to electrical fields. Finally, and just in two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
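
    The photon transfer curve test mentioned above boils down to fitting the variance-versus-mean relation of flat-field frames, whose slope gives the inverse gain. The sketch below does this on synthetic flats with an assumed gain and read noise; it is an illustration of the technique, not the PAU calibration code.

```python
# Photon transfer curve: gain (e-/ADU) from the variance-vs-mean slope of synthetic flat fields.
import numpy as np

rng = np.random.default_rng(3)
gain_true = 2.0            # electrons per ADU (assumed)
read_noise_adu = 5.0       # read noise in ADU (assumed)

means, variances = [], []
for exposure_e in [500, 1_000, 5_000, 10_000, 50_000]:   # mean signal in electrons
    flat_e = rng.poisson(exposure_e, size=(512, 512))
    flat_adu = flat_e / gain_true + rng.normal(0, read_noise_adu, flat_e.shape)
    means.append(flat_adu.mean())
    variances.append(flat_adu.var())

slope, intercept = np.polyfit(means, variances, 1)        # variance = mean/gain + rn^2
print(f"estimated gain = {1.0 / slope:.2f} e-/ADU (true {gain_true})")
```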

  12. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for elec- trolytic capacitors is presented. This methodology adopts a Kalman filter approach in conjunction with an...

  13. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predict- ing the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  14. Depletion of intense fields

    CERN Document Server

    Seipt, D; Marklund, M; Bulanov, S S

    2016-01-01

    The interaction of charged particles and photons with intense electromagnetic fields gives rise to multi-photon Compton and Breit-Wheeler processes. These are usually described in the framework of the external field approximation, where the electromagnetic field is assumed to have infinite energy. However, the multi-photon nature of these processes implies the absorption of a significant number of photons, which scales as the external field amplitude cubed. As a result, the interaction of a highly charged electron bunch with an intense laser pulse can lead to significant depletion of the laser pulse energy, thus rendering the external field approximation invalid. We provide relevant estimates for this depletion and find it to become important in the interaction between fields of amplitude $a_0 \\sim 10^3$ and electron bunches with charges of the order of nC.

  15. Wheeling Charges Methodology for Deregulated Electricity Markets using Tracing-based Postage Stamp Methods

    Directory of Open Access Journals (Sweden)

    M.Y. Hassan

    2011-12-01

    Full Text Available MW-mile and Postage-stamp methods are traditionally used by electric utilities to determine a fixed transmission cost among users of firm transmission service. The MW-mile method charges users by determining the actual paths the power follows through the network. However, this method is not sufficient to recover the total transmission system cost. To recover the total transmission system cost, the Postage Stamp Method is adopted. This method is simple but its main drawback is that the charges paid by each user do not reflect the actual use of the network but are based on the average usage of the entire network. This paper proposes a new wheeling charges methodology using tracing-based postage stamp methods. The proposed method allocates transmission costs among the generators in proportion to the total power delivered to the load through transmission lines. The proposed method incorporates generalised generation distribution factors to trace the contribution of each generator to the line flow. One unique feature of the proposed method is the consideration of the local load on the power flow allocation. Two case studies of 3-bus and IEEE 14-bus systems are used to illustrate the proposed method. Results show that the proposed method provides fair and equitable wheeling charges to generators reflecting the actual usage of the transmission system.
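
    The postage-stamp component of such a scheme is a simple pro-rata allocation of the revenue requirement; a minimal sketch with invented numbers follows. The record's tracing-based refinement, which weights users by their traced contribution to line flows, is not reproduced here.

```python
# Postage-stamp allocation of the transmission revenue requirement (illustrative numbers).
total_cost = 1_000_000.0          # annual transmission cost to recover [$], assumed
loads_mw = {"utility_A": 400.0, "utility_B": 250.0, "utility_C": 150.0}

system_peak = sum(loads_mw.values())
charges = {user: total_cost * p / system_peak for user, p in loads_mw.items()}
for user, charge in charges.items():
    print(f"{user}: ${charge:,.0f}")
```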

  16. Affinity-based methodologies and ligands for antibody purification: advances and perspectives.

    Science.gov (United States)

    Roque, Ana C A; Silva, Cláudia S O; Taipa, M Angela

    2007-08-10

    Many successful, recent therapies for life-threatening diseases such as cancer and rheumatoid arthritis are based on the recognition between native or genetically engineered antibodies and cell-surface receptors. Although antibodies are naturally produced by the immune system, the need for antibodies with unique specificities, designed for a single application, has encouraged the search for novel antibody purification strategies. The availability of these products to the end-consumer is strictly related to manufacturing costs, particularly those attributed to downstream processing. Over the last decades, academia and industry have developed different types of interactions and separation techniques for antibody purification, affinity-based strategies being the most common and efficient methodologies. The affinity ligands utilized range from biological to synthetically designed molecules with enhanced resistance and stability. Despite the successes achieved, the purification "paradigm" still moves interests and efforts in the continuous demand for improved separation performances. This review will focus on recent advances and perspectives in antibody purification by affinity interactions using different techniques, with particular emphasis on affinity chromatography. PMID:17618635

  17. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul S [Los Alamos National Laboratory; Morgan, Keith S [Los Alamos National Laboratory; Caffrey, Michael P [Los Alamos National Laboratory

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.

  18. Three-dimensional design methodologies for tree-based FPGA architecture

    CERN Document Server

    Pangracious, Vinod; Mehrez, Habib

    2015-01-01

    This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support design activities, such as design for 3D, to manufacture high-performance 3D integrated circuits and reconfigurable FPGA-based systems. This book was written as a text that covers the foundations of 3D integrated system design and FPGA architecture design. It was written for use in an elective or core course at the graduate level in the fields of Electrical Engineering, Computer Engineering and Doctoral Research programs. No previous background in 3D integration is required; nevertheless, a fundamental understanding of 2D CMOS VLSI design is required. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI design, Digital System Design and Microelectronics Circuits being the most important. It is accessible for self-study by both senior students and profe...

  19. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Based upon all techniques, the highest accuracies are achieved by models (5), which use Tmax - Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
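
    A hedged sketch of the SVR-rbf approach is shown below: it trains an RBF-kernel support vector regressor on Tmax - Tmin and Tmax, the input pair the record found most accurate, but on synthetic data generated from a Hargreaves-like toy relation rather than the Iranian measurements; the hyperparameters are assumptions.

```python
# SVR with an RBF kernel estimating daily global radiation from Tmax and Tmax - Tmin (synthetic data).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
t_max = rng.uniform(15, 45, 365)
t_min = t_max - rng.uniform(5, 20, 365)
dhgsr = 0.16 * np.sqrt(t_max - t_min) * 30 + rng.normal(0, 1.5, 365)   # Hargreaves-like toy target

X = np.column_stack([t_max - t_min, t_max])
X_tr, X_te, y_tr, y_te = train_test_split(X, dhgsr, random_state=0)

model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"validation RMSE = {rmse:.2f} MJ/m^2")
```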

  20. A NOVEL METHODOLOGY FOR CONSTRUCTING RULE-BASED NAÏVE BAYESIAN CLASSIFIERS

    Directory of Open Access Journals (Sweden)

    Abdallah Alashqur

    2015-02-01

    Full Text Available Classification is an important data mining technique that is used by many applications. Several types of classifiers have been described in the research literature. Example classifiers are decision tree classifiers, rule-based classifiers, and neural network classifiers. Another popular classification technique is naïve Bayesian classification. Naïve Bayesian classification is a probabilistic classification approach that uses Bayes' theorem to predict the classes of unclassified records. A drawback of naïve Bayesian classification is that every time a new data record is to be classified, the entire dataset needs to be scanned in order to apply a set of equations that perform the classification. Scanning the dataset is normally a very costly step, especially if the dataset is very large. To alleviate this problem, a new approach for using naïve Bayesian classification is introduced in this study. In this approach, a set of classification rules is constructed on top of a naïve Bayesian classifier. Hence we call this approach Rule-based Naïve Bayesian Classifier (RNBC). In RNBC, the dataset is scanned only once, off-line, at the time of building the classification rule set. Subsequent scanning of the dataset is avoided. Furthermore, this study introduces a simple three-step methodology for constructing the classification rule set.
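
    The single-scan idea behind RNBC can be sketched directly: fit a naïve Bayes model once, then enumerate every attribute-value combination and cache its predicted class as a rule, so later classifications become lookups. The toy attributes and classes below are hypothetical, and the paper's three-step construction is not reproduced.

```python
# Rule-based use of a naive Bayes classifier: scan the data once, then cache
# the predicted class for every attribute-value combination as a rule set.
from itertools import product
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Toy categorical training data (attributes and classes are hypothetical).
X_raw = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"], ["sunny", "hot"]]
y = ["no", "yes", "yes", "no", "no"]

enc = OrdinalEncoder()
X = enc.fit_transform(X_raw).astype(int)
nb = CategoricalNB().fit(X, y)

# Enumerate all attribute-value combinations once and store the rules.
rules = {}
for combo in product(*enc.categories_):
    encoded = enc.transform([list(combo)]).astype(int)
    rules[tuple(combo)] = nb.predict(encoded)[0]

print(rules[("rain", "mild")])   # later classifications are a dictionary lookup, no data scan
```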

  1. Affinity-based methodologies and ligands for antibody purification: advances and perspectives.

    Science.gov (United States)

    Roque, Ana C A; Silva, Cláudia S O; Taipa, M Angela

    2007-08-10

    Many successful, recent therapies for life-threatening diseases such as cancer and rheumatoid arthritis are based on the recognition between native or genetically engineered antibodies and cell-surface receptors. Although naturally produced by the immune system, the need for antibodies with unique specificities and designed for single application, has encouraged the search for novel antibody purification strategies. The availability of these products to the end-consumer is strictly related to manufacture costs, particularly those attributed to downstream processing. Over the last decades, academia and industry have developed different types of interactions and separation techniques for antibody purification, affinity-based strategies being the most common and efficient methodologies. The affinity ligands utilized range from biological to synthetic designed molecules with enhanced resistance and stability. Despite the successes achieved, the purification "paradigm" still moves interests and efforts in the continuous demand for improved separation performances. This review will focus on recent advances and perspectives in antibody purification by affinity interactions using different techniques, with particular emphasis on affinity chromatography.

  2. Methodology for Assessing Daylighting Design Strategies in Classroom with a Climate-Based Method

    Directory of Open Access Journals (Sweden)

    María Beatriz Piderit Moreno

    2015-01-01

    Full Text Available Considering the importance of daylight for the performance and well-being of students, an investigation has been carried out in daylit classrooms. The objective was to apply a methodology that integrates daylight variations in order to characterise the annual lighting performance of typologies resulting from passive design strategies and to compare their performance. The context of the study was three zones of Chile: north, center and south. The study was done with a climate-based daylight modelling method using the Radiance software. The interior illuminance was evaluated in relation to a target illuminance value (goal-oriented assessment), for which five intervals are defined: too low, low, in range, high and too high. The aim of the goal-oriented approach is to set a target range of values and to assess the percentage of time over the year in which each range is reached, as well as the percentage of spaces in a temporal map that are within range during the year. As a compliance or non-compliance indicator, a category is proposed that considers the average annual illuminance "in range" over the year, identifying which typology is the most efficient. Finally, it is concluded that the information obtained is based on target ranges, which allows design decisions to be guided while effectively recognizing annual performance.
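
    As a rough sketch of the goal-oriented assessment described above, the snippet below bins a synthetic annual series of hourly illuminance values into five intervals and reports the share of hours in each; the band edges and the target range are invented, since the abstract does not list them.

```python
# Sketch: goal-oriented daylight assessment -- classify hourly illuminance values
# into five intervals and report the share of time spent in each. The band limits
# (lux) are placeholders, not the ones used in the study.
import numpy as np

rng = np.random.default_rng(1)
illuminance = rng.lognormal(mean=6.0, sigma=0.6, size=8760)  # synthetic annual series (lux)

bands = {                    # assumed band edges around a 300-3000 lux target range
    "too low": (0, 100),
    "low": (100, 300),
    "in range": (300, 3000),
    "high": (3000, 6000),
    "too high": (6000, np.inf),
}

for name, (lo, hi) in bands.items():
    share = np.mean((illuminance >= lo) & (illuminance < hi)) * 100
    print(f"{name:9s}: {share:5.1f}% of hours")
```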

  3. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

    Full Text Available The complexity of today's embedded electronic systems, as well as their demanding performance and reliability requirements, is such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system-level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  4. Capital expenditure and depletion

    Energy Technology Data Exchange (ETDEWEB)

    Rech, O.; Saniere, A

    2003-07-01

    In the future, the increase in oil demand will be covered for the most part by non conventional oils, but conventional sources will continue to represent a preponderant share of the world oil supply. Their depletion represents a complex challenge involving technological, economic and political factors. At the same time, there is reason for concern about the decrease in exploration budgets at the major oil companies. (author)

  5. Capital expenditure and depletion

    International Nuclear Information System (INIS)

    In the future, the increase in oil demand will be covered for the most part by non conventional oils, but conventional sources will continue to represent a preponderant share of the world oil supply. Their depletion represents a complex challenge involving technological, economic and political factors. At the same time, there is reason for concern about the decrease in exploration budgets at the major oil companies. (author)

  6. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    Science.gov (United States)

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations. PMID:24386186

  7. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    Directory of Open Access Journals (Sweden)

    Maria Davidich

    Full Text Available Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.

  8. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia;

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are p...

  9. Progressive design methodology for complex engineering systems based on multiobjective genetic algorithms and linguistic decision making

    NARCIS (Netherlands)

    Kumar, P.; Bauer, P.

    2008-01-01

    This work focuses on a design methodology that aids in design and development of complex engineering systems. This design methodology consists of simulation, optimization and decision making. Within this work a framework is presented in which modelling, multi-objective optimization and multi criteri

  10. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    Science.gov (United States)

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  11. Environmental external effects from wind power based on the EU ExternE methodology

    DEFF Research Database (Denmark)

    Ibsen, Liselotte Schleisner; Nielsen, Per Sieverts

    1998-01-01

    The European Commission has launched a major study project, ExternE, to develop a methodology to quantify externalities. A “National Implementation Phase”, was started under the Joule II programme with the purpose of implementing the ExternE methodology in all member states. The main objective...

  12. Fully Depleted Charge-Coupled Devices

    International Nuclear Information System (INIS)

    We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200–300 µm, with correspondingly high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications.

  13. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and

  14. The use of depleted uranium ammunition under contemporary international law: is there a need for a treaty-based ban on DU weapons?

    Science.gov (United States)

    Borrmann, Robin

    2010-01-01

    This article examines whether the use of Depleted Uranium (DU) munitions can be considered illegal under current public international law. The analysis covers the law of arms control and focuses in particular on international humanitarian law. The article argues that DU ammunition cannot be addressed adequately under existing treaty based weapon bans, such as the Chemical Weapons Convention, due to the fact that DU does not meet the criteria required to trigger the applicability of those treaties. Furthermore, it is argued that continuing uncertainties regarding the effects of DU munitions impedes a reliable review of the legality of their use under various principles of international law, including the prohibition on employing indiscriminate weapons; the prohibition on weapons that are intended, or may be expected, to cause widespread, long-term and severe damage to the natural environment; and the prohibition on causing unnecessary suffering or superfluous injury. All of these principles require complete knowledge of the effects of the weapon in question. Nevertheless, the author argues that the same uncertainty places restrictions on the use of DU under the precautionary principle. The paper concludes with an examination of whether or not there is a need for--and if so whether there is a possibility of achieving--a Convention that comprehensively outlaws the use, transfer and stockpiling of DU weapons, as proposed by some non-governmental organisations (NGOs).

  15. DESIGN METHODOLOGY OF NETWORKED SOFTWARE EVOLUTION GROWTH BASED ON SOFTWARE PATTERNS

    Institute of Scientific and Technical Information of China (English)

    Keqing HE; Rong PENG; Jing LIU; Fei HE; Peng LIANG; Bing LI

    2006-01-01

    Recently, some new characteristics of complex networks have attracted the attention of scientists in different fields and led to many emerging research directions. So far, most of the research work has been limited to the discovery of complex network characteristics through structural analysis of large-scale software systems. This paper presents the theoretical basis, design method, algorithms and experimental results of the research. It first emphasizes the significance of a design method of evolution growth for the network topology of Object-Oriented (OO) software systems, and argues that the selection and modulation of network models with various topology characteristics have a non-negligible effect on the design and implementation of OO software systems. Then we analyze the similar discipline of "negation of negation and compromise" between the evolution of network models with different topology characteristics and the development of software modelling methods. Based on an analysis of the growth features of software patterns, we propose an object-oriented software network evolution growth method and its algorithms. In addition, we propose parameter systems for OO software system metrics based on complex network theory. Based on these parameter systems, one can analyze the features of various nodes, links and local-worlds, modulate the network topology and guide the software metrics. All of this can be helpful for detailed design, implementation and performance analysis. Finally, we focus on the application of the evolution algorithms and demonstrate it with a case study. Comparing the results from our early experiments with methodologies in empirical software engineering, we believe that the proposed software engineering design method is a computational software engineering approach based on complex network theory. We argue that this method should be greatly beneficial for the design, implementation, modulation and metrics of

  16. Ozone-depleting Substances (ODS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This site includes all of the ozone-depleting substances (ODS) recognized by the Montreal Protocol. The data include ozone depletion potentials (ODP), global...

  17. Development of flaxseed fortified rice - corn flour blend based extruded product by response surface methodology.

    Science.gov (United States)

    Ganorkar, P M; Jain, R K

    2015-08-01

    Flaxseed has shown evidence of health benefits in human beings. Response surface methodology (RSM) was employed to develop a flaxseed-fortified rice - corn flour blend based extruded product using a twin screw extruder. The effects of roasted flaxseed flour (RFF) fortification (15-25 %), moisture content of feed (12-16 %, wb), extruder barrel temperature (120-140 °C) and screw speed (300-330 RPM) on expansion ratio (ER), breaking strength (BS), bulk density (BD) and overall acceptability (OAA) score of extrudates were investigated using a central composite rotatable design (CCRD). Increased RFF level significantly decreased the ER and OAA score while increasing the BS and BD of extrudates. Moisture content of the extruder feed was positively related to ER, extruder barrel temperature was negatively related to ER and OAA, and screw speed was positively related to ER. The optimized conditions, at 120 °C extruder barrel temperature and 330 RPM screw speed, gave a product of high desirability with corresponding responses of 3.08 ER, 0.53 kgf BS, 0.106 g.cm(-3) BD and 7.86 OAA.
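
    A minimal sketch of the response-surface step, assuming synthetic coded data: a second-order polynomial is fitted to designed-experiment observations and the fitted surface is searched for the factor settings that maximise one response (ER). The factor names follow the abstract, but the data, model coefficients and optimum are invented.

```python
# Sketch: fit a second-order response surface to designed-experiment data and
# search it for a desirable optimum. The observations below are synthetic.
import numpy as np
from itertools import combinations

def quad_features(X):
    """Intercept, linear, squared and two-factor interaction terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(2)
# Coded factor levels for RFF %, feed moisture, barrel temperature, screw speed
X = rng.uniform(-1, 1, size=(30, 4))
er = 3.0 - 0.4 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] ** 2 + rng.normal(0, 0.05, 30)

beta, *_ = np.linalg.lstsq(quad_features(X), er, rcond=None)

# Grid-search the fitted surface for the settings that maximise expansion ratio
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 11)] * 4)).reshape(4, -1).T
pred = quad_features(grid) @ beta
best = grid[np.argmax(pred)]
print("Predicted optimum (coded levels):", np.round(best, 2), "ER =", round(pred.max(), 2))
```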

  18. Developing More Insights on Sustainable Consumption in China Based on Q Methodology

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-10-01

    Full Text Available Being an important aspect of sustainable development, sustainable consumption has attracted great attention among Chinese politicians and academia, and Chinese governments have established policies that encourage sustainable consumption behaviors. However, unsustainable consumption behavior still remains predominant in China. This paper aims to classify consumers with similar traits, in terms of the characteristics of practicing sustainable consumption, into one group, so that their traits can be clearly understood and governments can establish targeted policies for different groups of consumers. Q methodology, generally used to reveal the subjectivity of human beings involved in any situation, is applied in this paper to classify Chinese consumers based on Q sample design and data collection and analysis. Next, the traits of each group are analyzed in detail and comparison analyses are conducted to identify the common and differentiating factors among the three groups. The results show that Chinese consumers can be classified into three groups: sustainable (Group 1), potential sustainable (Group 2) and unsustainable consumers (Group 3), according to their values and attitudes towards sustainable consumption. Group 1 cares for the environment and has strong environmental values; its members understand sustainable consumption and its functions. Group 2 needs more enlightenment and external stimuli to motivate them to consume sustainably. Group 3 needs to be informed about and educated on sustainable consumption to enable them to change their consumption behavior from unsustainable to sustainable. Suggestions and implications for encouraging each group of consumers to engage in sustainable consumption are also provided.

  19. ANN-GA hybrid methodology based optimization study for microbial production of CoQ10

    Directory of Open Access Journals (Sweden)

    Shruti Bajpai

    2015-01-01

    Full Text Available Ubiquinone-10, also known as CoQ10, is a potent antioxidant found in the membrane-bound electron transport system and has a wide range of therapeutic uses. Purpose: The purpose of this study was to optimize the fermentation process for production of CoQ10 by Pseudomonas diminuta NCIM 2865. Methods: Significant medium components with respect to CoQ10 production were identified using a Plackett-Burman design, and their interactions were studied using response surface methodology (RSM). CoQ10 production increased considerably, from 10.8 to 18.57 mg/l, when fermentation was carried out in the RSM-optimised medium. Further, production of CoQ10 was increased by using the predictive results of an ANN-GA (artificial neural network and genetic algorithm) hybrid method. Results and Conclusions: This increased the yield of CoQ10 from 18.57 to 27.9 mg/l. An experimental run using the ANN-GA based optimized medium in the presence of carrot juice as a precursor for CoQ10 production gave a yield of 34.4 mg/l, considerably higher than in the earlier studies.
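
    A compact sketch of the ANN-GA idea under stated assumptions: a small scikit-learn neural network stands in for the ANN yield model, and a tiny real-coded genetic algorithm searches the (scaled) medium-composition space for the predicted maximum. The training data, variable meanings, bounds and GA settings are all placeholders, not those of the study.

```python
# Sketch of an ANN-GA hybrid: train a surrogate yield model, then let a simple
# genetic algorithm search the medium-composition space it defines.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(200, 3))          # e.g. carbon, nitrogen, precursor levels (scaled)
y = 20 - 30 * (X[:, 0] - 0.6) ** 2 - 20 * (X[:, 1] - 0.4) ** 2 + 5 * X[:, 2] \
    + rng.normal(0, 0.5, 200)                 # synthetic CoQ10 yield (mg/l)

ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0).fit(X, y)

def ga_maximise(fitness, n_var, pop=40, gens=60, mut=0.1):
    """Tiny real-coded GA: tournament selection, arithmetic crossover, gaussian mutation."""
    P = rng.uniform(0, 1, size=(pop, n_var))
    for _ in range(gens):
        f = fitness(P)
        # tournament selection: pick the better of two random individuals, pop times
        a, b = rng.integers(pop, size=pop), rng.integers(pop, size=pop)
        parents = P[np.where(f[a] >= f[b], a, b)]
        # arithmetic crossover between pairs of parents
        w = rng.random((pop, 1))
        children = w * parents + (1 - w) * parents[::-1]
        # gaussian mutation, clipped to the search bounds
        children += rng.normal(0, mut, children.shape)
        P = np.clip(children, 0, 1)
    f = fitness(P)
    return P[np.argmax(f)], f.max()

best_medium, best_yield = ga_maximise(lambda P: ann.predict(P), 3)
print("Predicted optimal medium (scaled):", np.round(best_medium, 2))
print("Predicted CoQ10 yield (mg/l):", round(best_yield, 1))
```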

  20. IIR filtering based adaptive active vibration control methodology with online secondary path modeling using PZT actuators

    Science.gov (United States)

    Boz, Utku; Basdogan, Ipek

    2015-12-01

    Structural vibration is a major cause of noise problems, discomfort and mechanical failures in aerospace, automotive and marine systems, which are mainly composed of plate-like structures. Active vibration control (AVC) is an effective approach for reducing structural vibrations in these structures. Adaptive filtering methodologies are preferred in AVC due to their ability to adjust themselves to the varying dynamics of the structure during operation. The filtered-X LMS (FXLMS) algorithm is a simple adaptive filtering algorithm widely implemented in active control applications. Proper implementation of FXLMS requires the availability of a reference signal that mimics the disturbance and a model of the dynamics between the control actuator and the error sensor, namely the secondary path. However, the controller output can interfere with the reference signal, and the secondary path dynamics may change during operation. The interference problem can be resolved by using an infinite impulse response (IIR) filter, which feeds one or more previous control signals back to the controller output, and the changing secondary path dynamics can be updated using an online modeling technique. In this paper, the IIR-filtering based filtered-U LMS (FULMS) controller is combined with an online secondary path modeling algorithm to suppress the vibrations of a plate-like structure. The results are validated through numerical and experimental studies, and show that the FULMS with online secondary path modeling approach provides greater vibration rejection with a higher convergence rate than its FXLMS counterpart.
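
    For orientation, the sketch below implements a plain FxLMS controller with a fixed, pre-identified secondary-path estimate; the FULMS variant in the paper additionally uses IIR (feedback) filtering and online secondary-path modelling, which are not reproduced here. The primary and secondary path models, filter length and step size are illustrative assumptions.

```python
# Sketch of a filtered-X LMS (FxLMS) active vibration controller with a known
# secondary-path estimate. Plant models and gains below are illustrative only.
import numpy as np

rng = np.random.default_rng(4)
N = 20000
x = np.sin(2 * np.pi * 0.01 * np.arange(N)) + 0.1 * rng.standard_normal(N)  # reference (disturbance)

P = np.array([0.0, 0.4, 0.25, 0.1])    # primary path (disturbance -> error sensor), assumed FIR
S = np.array([0.0, 0.3, 0.2])          # secondary path (actuator -> error sensor), assumed FIR
S_hat = S.copy()                        # offline estimate of the secondary path

L = 16                                  # control (FIR) filter length
w = np.zeros(L)
mu = 0.01
x_buf = np.zeros(L)                     # recent reference samples
xf_buf = np.zeros(L)                    # recent filtered-reference samples
y_buf = np.zeros(len(S))                # recent actuator outputs
xs_buf = np.zeros(len(S_hat))
e = np.zeros(N)

for n in range(N):
    x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
    # actuator output and its effect through the secondary path
    y = w @ x_buf
    y_buf = np.roll(y_buf, 1); y_buf[0] = y
    d = P @ x[max(0, n - len(P) + 1):n + 1][::-1] if n >= len(P) - 1 else 0.0
    e[n] = d + S @ y_buf                # error-sensor signal to be minimised
    # reference filtered by the secondary-path estimate (the "filtered-X" signal)
    xs_buf = np.roll(xs_buf, 1); xs_buf[0] = x[n]
    xf_buf = np.roll(xf_buf, 1); xf_buf[0] = S_hat @ xs_buf
    # LMS weight update
    w -= mu * e[n] * xf_buf

print("mean |e| first 1000 samples:", np.abs(e[:1000]).mean())
print("mean |e| last 1000 samples :", np.abs(e[-1000:]).mean())
```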

  1. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  2. Development of a risk monitoring system for nuclear power plants based on GO-FLOW methodology

    International Nuclear Information System (INIS)

    Highlights: • A method for developing Living PSA is proposed. • Living PSA is easy to update with online modification to system model file. • A risk monitoring system is designed and developed using the GO-FLOW. • The risk monitoring system is useful for plant daily operation risk management. - Abstract: The paper presents a risk monitoring system developed based on GO-FLOW methodology which is a success-oriented system reliability modeling technique for phased mission as well as time-dependent problems analysis. The risk monitoring system is designed to receive information on plant configuration changes either from equipment failures, operator interventions, or maintenance activities, then update the Living PSA model with online modification to the system GO-FLOW model file which contains all the functional modes of equipment represented by a proposed generalized GO-FLOW modeling structure, and display risk values graphically. The risk monitoring system can be used to assist safety engineers and plant operators in their maintenance management and daily operation risk management at NPPs

  3. METHODOLOGICAL BASES OF THE OPTIMIZATION OF ORGANIZATIONAL MANAGEMENT STRUCTURE AT IMPLEMENTING THE MAJOR CONSTRUCTION ENTERPRISE STRATEGY

    Directory of Open Access Journals (Sweden)

    Rodionova Svetlana Vladimirovna

    2015-09-01

    Full Text Available Planning and implementation of innovations at the micro level of management and at higher levels is a process of implementing a portfolio of innovative projects. Project management is aimed at a goal; therefore, defining the mission and aims of implementation is of primary importance. These are part of the notion of a development strategy of an enterprise. Creating a strategy for big construction holding companies is complicated by the necessity to account for the different factors affecting each business block and subsidiary company. The authors specify an algorithm for developing and implementing the activity strategy of a big construction enterprise. The special importance of the correspondence between the organizational management structure and the implemented strategy is shown, and the innovative character of organizational structure change is justified. The authors offer methods to optimize the organizational management structure based on a communication approach with the use of elements of graph theory. The proposed methodological provisions are tested on the example of the Russian JSC "RZhDstroy".

  4. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    Science.gov (United States)

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a Study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment. PMID:20131973

  5. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

    Full Text Available In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection and inconclusive or difficult-to-interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping the methods currently used for assessing the effectiveness of DGBL. Results showed that, at present, comparing results across studies, and thus looking at the effectiveness of DGBL on a more general level, is problematic due to diversity in and suboptimal study designs. Variety in study design relates to three issues, namely the different activities implemented in the control groups, the different measures used for assessing the effectiveness of DGBL and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements added to the game as part of the educational intervention (e.g., required reading or a debriefing session), instructor influences and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on the study design impedes replication of studies and thus falsification of study results.

  6. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations involve such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity to uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by analogy with human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  7. Looking for phase-space structures in star-forming regions: an MST-based methodology

    Science.gov (United States)

    Alfaro, Emilio J.; González, Marta

    2016-03-01

    We present a method for analysing the phase space of star-forming regions. In particular we search for clumpy structures in the 3D sub-space formed by two position coordinates and radial velocity. The aim of the method is the detection of kinematically segregated radial velocity groups, that is, radial velocity intervals whose associated stars are spatially concentrated. To this end we define a kinematic segregation index, Λ̃(RV), based on the Minimum Spanning Tree (MST) graph algorithm, which is estimated for a set of radial velocity intervals in the region. When Λ̃(RV) is significantly greater than 1 we consider that this bin represents a grouping in the phase space. We split a star-forming region into radial velocity bins and calculate the kinematic segregation index for each bin, and then we obtain the spectrum of kinematic groupings, which enables a quick visualization of the kinematic behaviour of the region under study. We carried out numerical models of different configurations in this sub-space of the phase space, formed by the position coordinates and the radial velocity, which various case studies illustrate. The analysis of the test cases demonstrates the potential of the new methodology for detecting different kinds of groupings in phase space.
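
    The abstract does not spell out the estimator behind Λ̃(RV), so the sketch below assumes an index of the kind used in mass-segregation studies of young clusters: the mean MST length of random samples of stars divided by the MST length of the stars in a given radial-velocity bin, with values well above 1 flagging a spatially concentrated velocity group. The synthetic region and bin widths are invented.

```python
# Sketch of an MST-based kinematic segregation spectrum: for each radial-velocity
# bin, compare the MST length of its members with that of random samples of the
# same size. The index definition here is an assumption; the paper's estimator may differ.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(5)

def mst_length(xy):
    """Total edge length of the Euclidean minimum spanning tree of 2D points."""
    return minimum_spanning_tree(squareform(pdist(xy))).sum()

# Synthetic star-forming region: background stars plus a compact group at RV ~ +5 km/s
xy_bg = rng.uniform(0, 10, size=(200, 2))
rv_bg = rng.normal(0, 3, 200)
xy_grp = rng.normal([3, 3], 0.3, size=(30, 2))
rv_grp = rng.normal(5, 0.3, 30)
xy = np.vstack([xy_bg, xy_grp]); rv = np.concatenate([rv_bg, rv_grp])

bins = np.arange(-9, 10, 2)
for lo, hi in zip(bins[:-1], bins[1:]):
    members = xy[(rv >= lo) & (rv < hi)]
    if len(members) < 5:
        continue
    # mean MST length of many random samples of the same size as the bin
    random_lengths = [mst_length(xy[rng.choice(len(xy), len(members), replace=False)])
                      for _ in range(50)]
    index = np.mean(random_lengths) / mst_length(members)
    print(f"RV bin [{lo:+3d}, {hi:+3d}) km/s: segregation index = {index:.2f}")
```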

  8. Measuring resource inequalities. The concepts and methodology for an area-based Gini coefficient

    International Nuclear Information System (INIS)

    Although inequalities in income and expenditure are relatively well researched, comparatively little attention has been paid, to date, to inequalities in resource use. This is clearly a shortcoming when it comes to developing informed policies for sustainable consumption and social justice. This paper describes an indicator of inequality in resource use called the AR-Gini. The AR-Gini is an area-based measure of resource inequality that estimates inequalities between neighbourhoods with regard to the consumption of specific consumer goods. It is also capable of estimating inequalities in the emissions resulting from resource use, such as carbon dioxide emissions from energy use and solid waste arisings from material resource use. The indicator is designed to be used as a basis for broadening the discussion concerning 'food deserts' to inequalities in other types of resource use. By estimating the AR-Gini for a wide range of goods and services we aim to enhance our understanding of resource inequalities and their drivers, to identify which resources show the highest inequalities, and to explore trends in inequalities. The paper describes the concepts underlying the construction of the AR-Gini and its methodology. Its use is illustrated by pilot applications (specifically, men's and boys' clothing, carpets, refrigerators/freezers and clothes washers/driers). The results illustrate that different levels of inequality are associated with different commodities. The paper concludes with a brief discussion of some possible policy implications of the AR-Gini. (author)
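
    The abstract describes the AR-Gini conceptually rather than giving its formula; the sketch below simply computes an ordinary Gini coefficient over per-neighbourhood consumption figures, which is the core calculation that any area-based inequality index of this kind rests on. The area-level consumption values are synthetic.

```python
# Sketch: a Gini coefficient computed over area-level resource consumption,
# the core operation behind an area-based resource-inequality index.
import numpy as np

def gini(values):
    """Gini coefficient of a 1D array of non-negative values (0 = perfect equality)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = len(v)
    cum = np.cumsum(v)
    # Lorenz-curve based formula
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

rng = np.random.default_rng(6)
# Annual consumption of some good per neighbourhood (arbitrary units, synthetic)
consumption_per_area = rng.lognormal(mean=3.0, sigma=0.8, size=500)
print("AR-Gini-style inequality:", round(gini(consumption_per_area), 3))
print("Uniform consumption     :", round(gini(np.full(500, 10.0)), 3))
```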

  9. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    Science.gov (United States)

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a Study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment.

  10. A Preisach-Based Nonequilibrium Methodology for Simulating Performance of Hysteretic Magnetic Refrigeration Cycles

    Science.gov (United States)

    Brown, Timothy D.; Bruno, Nickolaus M.; Chen, Jing-Han; Karaman, Ibrahim; Ross, Joseph H.; Shamberger, Patrick J.

    2015-09-01

    In giant magnetocaloric effect (GMCE) materials a large entropy change couples to a magnetostructural first-order phase transition, potentially providing a basis for magnetic refrigeration cycles. However, hysteresis loss greatly reduces the availability of refrigeration work in such cycles. Here, we present a methodology combining a Preisach model for rate-independent hysteresis with a thermodynamic analysis of nonequilibrium phase transformations which, for GMCE materials exhibiting hysteresis, allows an evaluation of refrigeration work and efficiency terms for an arbitrary cycle. Using simplified but physically meaningful descriptors for the magnetic and thermal properties of a Ni45Co5Mn36.6In13.4 at.% single-crystal alloy, we relate these work/efficiency terms to fundamental material properties, demonstrating the method's use as a materials design tool. Following a simple two-parameter model for the alloy's hysteresis properties, we compute and interpret the effect of each parameter on the cyclic refrigeration work and efficiency terms. We show that hysteresis loss is a critical concern in cycles based on GMCE systems, since the resultant lost work can reduce the refrigeration work to zero; however, we also find that the lost work may be mitigated by modifying other aspects of the transition, such as the width over which the one-way transformation occurs.
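
    To illustrate the rate-independent hysteresis ingredient of the methodology, the sketch below implements a toy classical Preisach model: each relay hysteron switches up at its α threshold and down at its β threshold, and the output is the average of the hysteron states. The hysteron distribution and the input programme are arbitrary, not the parameters fitted to the Ni-Co-Mn-In alloy.

```python
# Toy classical Preisach model: a collection of relay hysterons with switching
# thresholds beta <= alpha; the output is the mean of their states.
import numpy as np

rng = np.random.default_rng(7)
n_hyst = 2000
alpha = rng.uniform(-1, 1, n_hyst)
beta = rng.uniform(-1, 1, n_hyst)
alpha, beta = np.maximum(alpha, beta), np.minimum(alpha, beta)  # enforce beta <= alpha
state = -np.ones(n_hyst)                                        # start fully "down"

def preisach_step(u, state):
    """Update hysteron states for input u (rate-independent switching rule)."""
    state = state.copy()
    state[u >= alpha] = +1.0   # switch up once the input exceeds alpha
    state[u <= beta] = -1.0    # switch down once the input drops below beta
    return state

# Drive the model around a major loop and a partial (minor) loop
inputs = np.concatenate([np.linspace(-1, 1, 100), np.linspace(1, -0.2, 60),
                         np.linspace(-0.2, 1, 60)])
outputs = []
for u in inputs:
    state = preisach_step(u, state)
    outputs.append(state.mean())

# The ascending and descending branches differ at the same input value -> hysteresis
asc = np.argmin(np.abs(inputs[:100]))            # ascending-branch sample nearest u = 0
desc = 100 + np.argmin(np.abs(inputs[100:160]))  # descending-branch sample nearest u = 0
print("output near u=0, ascending :", round(outputs[asc], 3))
print("output near u=0, descending:", round(outputs[desc], 3))
```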

  11. Low-Cost Fault Tolerant Methodology for Real Time MPSoC Based Embedded System

    Directory of Open Access Journals (Sweden)

    Mohsin Amin

    2014-01-01

    Full Text Available We propose a design methodology for a fault-tolerant homogeneous MPSoC with additional design objectives that include low hardware overhead and high performance. We have implemented three different FT methodologies on MPSoCs and compared them against the defined constraints. The comparison of these FT methodologies is carried out by modelling their architectures in VHDL-RTL on a Spartan-3 FPGA. The results obtained through simulations helped us to identify the most suitable scheme in terms of the given design constraints.

  12. A Probabilistic Transmission Expansion Planning Methodology based on Roulette Wheel Selection and Social Welfare

    CERN Document Server

    Gupta, Neeraj; Kalra, Prem Kumar

    2012-01-01

    A new probabilistic methodology for transmission expansion planning (TEP) that does not require a priori specification of new/additional transmission capacities and uses the concept of social welfare has been proposed. Two new concepts have been introduced in this paper: (i) roulette wheel methodology has been used to calculate the capacity of new transmission lines and (ii) load flow analysis has been used to calculate expected demand not served (EDNS). The overall methodology has been implemented on a modified IEEE 5-bus test system. Simulations show an important result: addition of only new transmission lines is not sufficient to minimize EDNS.
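
    A minimal sketch of the roulette-wheel step mentioned in (i), under the assumption that each candidate line capacity carries a fitness score (e.g. derived from social welfare) and is selected with probability proportional to that score; the candidate capacities and scores are invented.

```python
# Sketch: roulette-wheel selection of a candidate transmission-line capacity,
# with selection probability proportional to an (invented) fitness score.
import numpy as np

rng = np.random.default_rng(8)
candidate_capacities_mw = np.array([100, 200, 300, 400, 500])
fitness = np.array([0.10, 0.35, 0.30, 0.15, 0.10])   # e.g. social-welfare based scores

def roulette_wheel(values, fitness, n_draws=1):
    """Draw values with probability proportional to fitness (the 'wheel')."""
    p = np.asarray(fitness, dtype=float)
    p = p / p.sum()
    # spin the wheel: find where each uniform draw lands on the cumulative fitness
    spins = rng.random(n_draws)
    idx = np.searchsorted(np.cumsum(p), spins)
    return values[idx]

print("Selected capacities:", roulette_wheel(candidate_capacities_mw, fitness, n_draws=5))
```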

  13. Methodology To Define Drought Management Scenarios Based On Accumulated Future Projections Of Risk

    Science.gov (United States)

    Haro-Monteagudo, David; Solera-Solera, Abel; Andreu-Álvarez, Joaquín

    2014-05-01

    Drought is a serious threat to many water resources systems in the world, especially to those in which the equilibrium between resource availability and water use is very fragile, so that a deviation below normality compromises the capacity of the system to cope with all demands and environmental requirements. Since droughts are not isolated events but develop through time in what can be considered a creeping behaviour, it is very difficult to determine when an episode starts and how long it will last. Because this is a major concern for water managers and society in general, scientific research has striven to develop indices that allow the risk of a drought event to be evaluated. These indices are often based on previous and current state variables of the system which, taken together, supply decision makers with an indication of the risk of being in a drought situation, normally through the definition of a drought scenario. While this way of proceeding has been found to be effective in many systems, there are cases in which indicator systems fail to identify the appropriate ongoing drought scenario early enough to start measures that would minimize the possible impacts. This is the case, for example, of systems with high seasonal precipitation variability. The use of risk assessment models to evaluate possible future states of the system is handy in cases like this, although it is not limited to such systems. We present a method to refine the drought scenario definition within a water resources system. To implement this methodology, we use a risk assessment model generalized to water resources systems, based on the stochastic generation of multiple possible future streamflow series and the simulation of the system with a Monte-Carlo approach. We carry out this assessment every month of the year, up to the end of the hydrologic year that normally corresponds with the end of the irrigation
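
    A highly simplified sketch of the risk-assessment idea: many synthetic monthly inflow traces are routed through a toy single-reservoir balance against a fixed demand, and the fraction of traces in which storage falls below an alert threshold is reported month by month. The inflow statistics, reservoir size, demand and threshold are placeholders, not the generalized model of the paper.

```python
# Sketch of a Monte-Carlo drought-risk assessment with a toy reservoir balance.
# Inflow statistics, reservoir size, demand and threshold are all placeholders.
import numpy as np

rng = np.random.default_rng(9)
n_traces, n_months = 1000, 12
mean_inflow = np.array([40, 45, 60, 70, 50, 30, 15, 10, 12, 20, 30, 35])  # hm3/month
inflows = rng.gamma(shape=4.0, scale=mean_inflow / 4.0, size=(n_traces, n_months))

capacity, demand, s0, threshold = 300.0, 35.0, 150.0, 60.0  # hm3
storage = np.full(n_traces, s0)
risk = []
for m in range(n_months):
    storage = np.clip(storage + inflows[:, m] - demand, 0.0, capacity)
    risk.append(np.mean(storage < threshold))   # fraction of traces below the alert level

for m, r in enumerate(risk, start=1):
    print(f"month {m:2d}: P(storage < {threshold:.0f} hm3) = {r:.2f}")
```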

  14. A study of polar ozone depletion based on sequential assimilation of satellite data from the ENVISAT/MIPAS and Odin/SMR instruments

    Directory of Open Access Journals (Sweden)

    J. D. Rösevall

    2007-01-01

    Full Text Available The objective of this study is to demonstrate how polar ozone depletion can be mapped and quantified by assimilating ozone data from satellites into the wind-driven transport model DIAMOND (Dynamical Isentropic Assimilation Model for OdiN Data). By assimilating a large set of satellite data into a transport model, ozone fields can be built up that are less noisy than the individual satellite ozone profiles. The transported fields can subsequently be compared to later sets of incoming satellite data so that the rates and geographical distribution of ozone depletion can be determined. By tracing the amounts of solar irradiation received by different air parcels in a transport model it is furthermore possible to study the photolytic reactions that destroy ozone. In this study, destruction of ozone that took place in the Antarctic winter of 2003 and in the Arctic winter of 2002/2003 has been examined by assimilating ozone data from the ENVISAT/MIPAS and Odin/SMR satellite instruments. Large-scale depletion of ozone was observed in the Antarctic polar vortex of 2003 when sunlight returned after the polar night. By mid October, ENVISAT/MIPAS data indicate vortex ozone depletion in the ranges 80–100% and 70–90% on the 425 and 475 K potential temperature levels respectively, while the Odin/SMR data indicate depletion in the ranges 70–90% and 50–70%. The discrepancy between the two instruments has been attributed to systematic errors in the Odin/SMR data. Assimilated fields of ENVISAT/MIPAS data indicate ozone depletion in the range 10–20% on the 475 K potential temperature level (~19 km altitude) in the central regions of the 2002/2003 Arctic polar vortex. Assimilated fields of Odin/SMR data, on the other hand, indicate ozone depletion in the range 20–30%.

  15. An AFM-based methodology for measuring axial and radial error motions of spindles

    International Nuclear Information System (INIS)

    This paper presents a novel atomic force microscopy (AFM)-based methodology for the measurement of axial and radial error motions of a high-precision spindle. Based on a modified commercial AFM system, the AFM tip is employed as a cutting tool with which nano-grooves are scratched on a flat surface as the spindle rotates. By extracting the radial motion data of the spindle from the scratched nano-grooves, the radial error motion of the spindle can be calculated after subtracting the tilting errors from the original measurement data. By recording the variation of the PZT displacement in the Z direction in AFM tapping mode during spindle rotation, the axial error motion of the spindle can be obtained. Moreover, the effects of the nano-scratching parameters on the scratched grooves, the tilting-error removal method for both conditions, and the method of extracting data from the scratched groove depth are studied in detail. An axial error motion of 124 nm and a radial error motion of 279 nm were obtained for a commercial high-precision air-bearing spindle by this novel method, values which are comparable with those provided by the manufacturer, verifying the method. This approach does not need an expensive standard part, as most conventional measurement approaches do. Moreover, the axial and radial error motions of the spindle can both be obtained, indicating that this is a potential means of measuring the error motions of the high-precision moving parts of ultra-precision machine tools in the future. (paper)

  16. Methodological Bases for Ranking the European Union Countries in Terms of Macroeconomic Security

    Directory of Open Access Journals (Sweden)

    Tymoshenko Olena V.

    2015-11-01

    Full Text Available The fundamental contradictions of existing methodical approaches to assessing the level of state economic security are substantiated, and proposals are developed for introducing a unified assessment methodology that would be acceptable for use at the international level or for a specific cluster of countries. The research shows that there are no unified criteria for such a classification of countries. To determine the most significant coefficients and critical values of the indicators of economic security, it is appropriate that countries be grouped by level of economic development, as proposed by the UN Commission and the IMF. The analysis of the economic security level has been conducted for the member countries of the European Union as a separate cluster of countries, using macroeconomic security indicators as an example. Based on the evaluation it has been found that the proposed list of indicators and their critical values is economically sound and built on the principles of adequacy, representativeness and comprehensiveness. In 2004 the most secure EU countries, meeting the macroeconomic security standards, were Austria, Denmark, Sweden and Finland; by 2014 the percentage of absolutely secure countries had decreased from 14.3 to 7.1%, with only Denmark and Sweden remaining in the ranking. During the analyzed period Bulgaria and Croatia moved into the risk zone, while Estonia, Lithuania, Latvia and Romania were in the danger zone. In 2014 Ukraine, in terms of its macroeconomic security, was in a critical state, which testified to serious structural and systemic imbalances in its development.

  17. Stratospheric ozone depletion

    International Nuclear Information System (INIS)

    The amount of stratospheric ozone and the reduction of the ozone layer vary with season and latitude. At present, total and vertical ozone is monitored over all of Austria. The mean monthly ozone levels between 1994 and 2000 are presented. Data on stratospheric ozone and UV-B radiation are published daily on the home page http: www.lebesministerium.at. An account is given of the use of ozone-depleting substances such as chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs), as well as the national measures taken to reduce their use. Figs. 2, Tables 2. (nevyjel)

  18. Towards a common methodology to simulate tree mortality based on ring-width data

    Science.gov (United States)

    Cailleret, Maxime; Bigler, Christof; Bugmann, Harald; Davi, Hendrik; Minunno, Francesco; Peltoniemi, Mikko; Martínez-Vilalta, Jordi

    2015-04-01

    Individual mortality is a key process of population and community dynamics, especially for long-lived species such as trees. As rates of background vegetation mortality and of massive diebacks have accelerated during the last decades and are expected to continue to do so under rising temperature and increasing drought, there is a growing demand for early warning signals that announce that the likelihood of death is very high. Although physiological indicators have a high potential to predict tree mortality, their development requires intensive tree monitoring, which currently cannot be done on a representative sample of a population or across several species. An easier approach is to use radial growth data such as tree-ring width measurements. During the last decades, an increasing number of studies have aimed to derive such growth-mortality functions. However, because they followed different approaches concerning the choice of the sampling strategy (number of dead and living trees), the type of growth explanatory variables (growth level, growth trend variables, etc.), and the length of the time window (number of rings before death) used to calculate them, it is difficult to compare results among studies and to draw biological interpretations. We detail a new methodology for assessing reliable tree-ring based growth-mortality relationships using binomial logistic regression models. As examples we used published tree-ring datasets from Abies alba growing at 13 different sites, and from Nothofagus dombeyi and Quercus petraea each located at a single site. Our first approach, based on constant samplings, aims to (1) assess the dependency of growth-mortality relationships on the statistical sampling scheme used; (2) determine the best length of the time window used to calculate each growth variable; and (3) reveal the presence of intra-specific shifts in growth-mortality relationships. We also followed a Bayesian approach to build the best multi-variable logistic model considering
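
    A small sketch of a tree-ring based growth-mortality model of the kind described above: a binomial logistic regression on a growth-level variable (mean ring width over the last k years) and a growth-trend variable (slope over the same window), fitted to synthetic living and dead trees. The window length, predictors and data are placeholder choices.

```python
# Sketch: binomial logistic regression linking tree status (dead/alive) to
# ring-width based growth level and growth trend variables. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(10)
n_trees, n_years, window = 400, 50, 10

rings = rng.gamma(shape=4.0, scale=0.4, size=(n_trees, n_years))   # ring widths (mm)
# Growth level: mean ring width over the last `window` years before sampling/death
level = rings[:, -window:].mean(axis=1)
# Growth trend: slope of a straight line fitted to those last `window` ring widths
years = np.arange(window)
trend = np.array([np.polyfit(years, r, 1)[0] for r in rings[:, -window:]])

# Synthetic mortality: low and declining growth raises the probability of death
logit = -1.0 - 2.0 * (level - level.mean()) - 8.0 * trend
dead = rng.random(n_trees) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([level, trend]), dead)
print("coefficients (level, trend):", np.round(model.coef_[0], 2))
print("P(death) for a slow, declining tree:",
      round(model.predict_proba([[level.min(), trend.min()]])[0, 1], 2))
```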

  19. Revised Design-Based Research Methodology for College Course Improvement and Application to Education Courses in Japan

    Science.gov (United States)

    Akahori, Kanji

    2011-01-01

    The author describes a research methodology for college course improvement, and applies the results to education courses. In Japan, it is usually difficult to carry out research on college course improvement, because faculty cannot introduce experimental design approaches based on control and treatment groupings of students in actual classroom…

  20. Prevalence of cluster headache in the Republic of Georgia: results of a population-based study and methodological considerations

    DEFF Research Database (Denmark)

    Katsarava, Z; Dzagnidze, A; Kukava, M;

    2009-01-01

    We present a study of the general-population prevalence of cluster headache in the Republic of Georgia and discuss the advantages and challenges of different methodological approaches. In a community-based survey, specially trained medical residents visited 500 adjacent households in the capital...

  1. Study on the Optimization of Bio-emulsifier Production by Geobacillus sp.XS2 Based on Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim was to study the optimization of bio-emulsifier production by Geobacillus sp. XS2 based on response surface methodology. [Method] Firstly, a single-factor experiment was conducted to find the main medium components influencing bio-emulsifier production by Geobacillus sp. XS2, and then a response surface model was established using response surface methodology and Design-Expert 7.0, so as to optimize the fermentation medium for bio-emulsifier production by Geobacillus sp. XS2. [Result] Glucose...

  2. A methodological approach to characterise Landslide Periods based on historical series of rainfall and landslide damage

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-10-01

    Full Text Available Landslide Periods (LPs) are defined as periods, shorter than a hydrological year, during which one or more landslide damage events occur in one or more sectors of a study area. In this work, we present a methodological approach, based on the comparative analysis of historical series of landslide damage and daily rainfall data, aiming to characterise the main types of LPs affecting selected areas. Cumulative rainfall preceding landslide activation is assessed for short (1, 2, 3, and 5 days), medium (7, 10, and 30 days) and long (60, 90, and 180 days) durations, and the corresponding Return Periods (RPs) are assessed and ranked into three classes (Class 1: RP=5-10 years; Class 2: RP=11-15 years; Class 3: RP>15 years). To assess landslide damage, the Simplified Damage Index (SDI) is introduced. This represents classified landslide losses and is obtained by multiplying the value of the damaged element by the percentage of damage affecting it. The comparison of the RP of rainfall and the SDI allows us to identify the different types of LPs that affected the study area in the past and that could affect it again in the future.

    The results of this activity can be used for practical purposes: to define scenarios and strategies for risk management, to suggest priorities in policies for disaster mitigation and preparedness, and to prepare defensive measures and civil protection plans ranked according to the types of LPs that must be managed.

    We present an application performed for a 39-year series of rainfall/landslide damage data concerning a study area located in NE Calabria (Italy); in this case study, we identify four main types of LPs, which are ranked according to damage severity.
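
    A brief sketch of the two classification steps described above: the return period of an antecedent cumulative rainfall total is estimated and ranked into the three RP classes, and a Simplified Damage Index is obtained by multiplying the value of a damaged element by its fraction of damage. The Gumbel fit used for the return period and all numbers are illustrative assumptions.

```python
# Sketch of the two ingredients of a Landslide Period characterisation:
# (1) return-period class of antecedent cumulative rainfall, and
# (2) a Simplified Damage Index = element value x fraction of damage.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
annual_max_30d_rain = rng.gumbel(loc=180, scale=40, size=39)     # 39-year synthetic record (mm)
loc, scale = stats.gumbel_r.fit(annual_max_30d_rain)             # fitted Gumbel parameters

def rp_class(cum_rain_mm):
    """Return period of an observed 30-day rainfall total and its class (1-3)."""
    exceed_prob = 1.0 - stats.gumbel_r.cdf(cum_rain_mm, loc, scale)
    rp = np.inf if exceed_prob == 0 else 1.0 / exceed_prob
    if rp < 5:
        return rp, None          # below the lowest class considered
    if rp <= 10:
        return rp, 1
    if rp <= 15:
        return rp, 2
    return rp, 3

def sdi(element_value, damage_fraction):
    """Simplified Damage Index: classified value of the element times damage share."""
    return element_value * damage_fraction

rp, cls = rp_class(290.0)
print(f"30-day rainfall of 290 mm: RP = {rp:.0f} years, class {cls}")
print("SDI of a heavily damaged road (value 3, 80% damaged):", sdi(3, 0.8))
```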

  3. A grey-based group decision-making methodology for the selection of hydrogen technologies in Life Cycle Sustainability perspective

    DEFF Research Database (Denmark)

    Manzardo, Alessandro; Ren, Jingzheng; Mazzi, Anna;

    2012-01-01

    The objective of this research is to develop a grey-based group decision-making methodology for the selection of the best renewable energy technology (including hydrogen) using a life cycle sustainability perspective. The traditional grey relational analysis has been modified to better address the issue of uncertainty. The proposed methodology allows multiple decision-makers to participate in the decision-making process and to give linguistic evaluations of the weights of the criteria and of the performance of the alternative technologies. In this paper, twelve hydrogen production technologies have been assessed using the proposed methodology; electrolysis of water powered by hydropower has been considered to be the best technology for hydrogen production according to the decision-making group.
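
    A minimal Python sketch of classical grey relational analysis for ranking alternatives against weighted criteria; the paper's modified, multi-person grey variant is not reproduced, and the decision matrix and weights below are invented.

        # Minimal sketch of classical grey relational analysis; the decision matrix and
        # criteria weights are invented and only illustrate the underlying ranking idea.
        import numpy as np

        # Rows: hydrogen-production alternatives, columns: benefit-type criteria.
        scores = np.array([[0.7, 0.9, 0.6],
                           [0.8, 0.5, 0.9],
                           [0.6, 0.8, 0.8]], dtype=float)
        weights = np.array([0.5, 0.3, 0.2])

        # Normalise each criterion to [0, 1] (larger-is-better form).
        norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
        reference = norm.max(axis=0)          # ideal (reference) sequence
        delta = np.abs(reference - norm)      # deviation from the reference
        rho = 0.5                             # distinguishing coefficient
        coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        grades = coeff @ weights              # weighted grey relational grade
        print("grey relational grades:", np.round(grades, 3))
        print("best alternative index:", int(grades.argmax()))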

  4. Novel methodology for 3D reconstruction of carotid arteries and plaque characterization based upon magnetic resonance imaging carotid angiography data.

    Science.gov (United States)

    Sakellarios, Antonis I; Stefanou, Kostas; Siogkas, Panagiotis; Tsakanikas, Vasilis D; Bourantas, Christos V; Athanasiou, Lambros; Exarchos, Themis P; Fotiou, Evangelos; Naka, Katerina K; Papafaklis, Michail I; Patterson, Andrew J; Young, Victoria E L; Gillard, Jonathan H; Michalis, Lampros K; Fotiadis, Dimitrios I

    2012-10-01

    In this study, we present a novel methodology that allows reliable segmentation of the magnetic resonance images (MRIs) for accurate fully automated three-dimensional (3D) reconstruction of the carotid arteries and semiautomated characterization of plaque type. Our approach uses active contours to detect the luminal borders in the time-of-flight images and the outer vessel wall borders in the T(1)-weighted images. The methodology incorporates the connecting components theory for the automated identification of the bifurcation region and a knowledge-based algorithm for the accurate characterization of the plaque components. The proposed segmentation method was validated in randomly selected MRI frames analyzed offline by two expert observers. The interobserver variability of the method for the lumen and outer vessel wall was -1.60%±6.70% and 0.56%±6.28%, respectively, while the Williams Index for all metrics was close to unity. The methodology implemented to identify the composition of the plaque was also validated in 591 images acquired from 24 patients. The obtained Cohen's k was 0.68 (0.60-0.76) for lipid plaques, while the time needed to process an MRI sequence for 3D reconstruction was only 30 s. The obtained results indicate that the proposed methodology allows reliable and automated detection of the luminal and vessel wall borders and fast and accurate characterization of plaque type in carotid MRI sequences. These features render the currently presented methodology a useful tool in the clinical and research arena.
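
    A minimal Python sketch, with invented label sequences, of how agreement between the automated plaque characterization and an expert reading could be summarised with Cohen's kappa, the statistic quoted above (k = 0.68 for lipid plaques).

        # Minimal sketch of Cohen's kappa between automated plaque labels and an expert
        # reading; the label sequences below are invented placeholders.
        from collections import Counter

        auto   = ["lipid", "lipid", "calcified", "fibrous", "lipid", "fibrous", "calcified", "lipid"]
        expert = ["lipid", "fibrous", "calcified", "fibrous", "lipid", "fibrous", "calcified", "calcified"]

        n = len(auto)
        p_observed = sum(a == e for a, e in zip(auto, expert)) / n

        # Expected agreement under chance, from the marginal label frequencies.
        ca, ce = Counter(auto), Counter(expert)
        p_expected = sum(ca[k] * ce[k] for k in set(ca) | set(ce)) / (n * n)

        kappa = (p_observed - p_expected) / (1 - p_expected)
        print(f"observed agreement = {p_observed:.2f}, kappa = {kappa:.2f}")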

  5. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

    Full Text Available Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basis of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on-campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  6. Study of A Multi-criteria Evaluation Methodology for Nuclear Fuel Cycle System Based on Sustainability

    Institute of Scientific and Technical Information of China (English)

    Liu Jingquan; Hidekazu Yoshikawa; OuYang Jun; Zhou Yangping

    2006-01-01

    This paper presents a multi-criteria evaluation methodology for nuclear fuel cycle options in terms of energy sustainability. Starting from the general sustainability concept and a public acceptance questionnaire, a set of indicators reflecting specific criteria for the evaluation of nuclear fuel cycle options is defined. Particular attention is devoted to resource utility efficiency, environmental effects, human health hazards and economic effects, which represent the concerns of different stakeholders. The methodology also integrates a special mathematical processing approach, the Extenics Evaluation Method, which quantifies subjective human perception to provide intuitionistic judgement and comparison of different options. The once-through and reprocessing options of the nuclear fuel cycle are examined using the proposed methodology. The assessment process and results can give some guidance for nuclear fuel cycle evaluation under the constraint of limited data.

  7. Validation of 2DH hydrodynamic and morphological mathematical models. A methodology based on SAR imaging

    Science.gov (United States)

    Canelas, Ricardo; Heleno, Sandra; Pestana, Rita; Ferreira, Rui M. L.

    2014-05-01

    The objective of the present work is to devise a methodology to validate 2DH shallow-water models suitable to simulate flow hydrodynamics and channel morphology. For this purpose, a 2DH mathematical model, assembled at CEHIDRO, IST, is employed to model Tagus river floods over a 70 km reach, and Synthetic Aperture Radar (SAR) images are collected to retrieve planar inundation extents. The model is suited for highly unsteady discontinuous flows over complex, time-evolving geometries, employing a finite-volume discretization scheme based on a flux-splitting technique incorporating a revised version of the Roe Riemann solver. Novel closure terms for the non-equilibrium sediment transport model are included. New boundary conditions are employed, based on the Riemann variables associated with the outgoing characteristic fields, coping with the provided hydrographs in a mathematically coherent manner. A high resolution Digital Elevation Model (DEM) is used and levee structures are considered as fully erodible elements. Spatially heterogeneous roughness characteristics are derived from land-use databases such as CORINE LandCover 2006. SAR satellite imagery of the floods is available and is used to validate the simulation results, with particular emphasis on the 2000/2001 flood. The delimited areas from the satellite and simulations are superimposed. The quality of the adjustment depends on the calibration of roughness coefficients and on the spatial discretization of small structures, with lengths of the order of the grid spacing. Flow depths and registered discharges are recovered from the simulation and compared with data from a measuring station in the domain, with the comparison revealing remarkably high accuracy, both in terms of amplitudes and phase. Further inclusion of topographical detail should improve the comparison of flood extents regarding satellite data. The validated model was then employed to simulate 100-year floods in the same reach. The
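
    A minimal Python sketch of the extent-comparison step, assuming two synthetic boolean rasters standing in for the simulated and SAR-derived inundation maps; F below is the common flood-extent fit measure hits/(hits + misses + false alarms), which the paper does not necessarily use in this exact form.

        # Minimal sketch of comparing a simulated inundation raster with a SAR-derived one.
        # The two boolean grids are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        sar = rng.random((100, 100)) > 0.6        # "observed" wet cells (synthetic)
        sim = sar.copy()
        sim ^= rng.random((100, 100)) > 0.9       # perturb some cells to mimic model error

        hits = np.sum(sim & sar)
        misses = np.sum(~sim & sar)
        false_alarms = np.sum(sim & ~sar)
        fit = hits / (hits + misses + false_alarms)
        print(f"flood-extent fit F = {fit:.2f}")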

  8. Assessing local planning to control groundwater depletion: California as a microcosm of global issues

    Science.gov (United States)

    Nelson, Rebecca L.

    2012-01-01

    Groundwater pumping has caused excessive groundwater depletion around the world, yet regulating pumping remains a profound challenge. California uses more groundwater than any other U.S. state, and serves as a microcosm of the adverse effects of pumping felt worldwide—land subsidence, impaired water quality, and damaged ecosystems, all against the looming threat of climate change. The state largely entrusts the control of depletion to the local level. This study uses internationally accepted water resources planning theories systematically to investigate three key aspects of controlling groundwater depletion in California, with an emphasis on local-level action: (a) making decisions and engaging stakeholders; (b) monitoring groundwater; and (c) using mandatory, fee-based and voluntary approaches to control groundwater depletion (e.g., pumping restrictions, pumping fees, and education about water conservation, respectively). The methodology used is the social science-derived technique of content analysis, which involves using a coding scheme to record these three elements in local rules and plans, and State legislation, then analyzing patterns and trends. The study finds that Californian local groundwater managers rarely use, or plan to use, mandatory and fee-based measures to control groundwater depletion. Most use only voluntary approaches or infrastructure to attempt to reduce depletion, regardless of whether they have more severe groundwater problems, or problems which are more likely to have irreversible adverse effects. The study suggests legal reforms to the local groundwater planning system, drawing upon its empirical findings. Considering the content of these recommendations may also benefit other jurisdictions that use a local groundwater management planning paradigm.

  9. Parametric Identification of Solar Series based on an Adaptive Parallel Methodology

    Indian Academy of Sciences (India)

    Juan A. Gómez Pulido; Miguel A. Vega Rodríguez; Juan M. Sánchez Pérez

    2005-03-01

    In this work we present an adaptive parallel methodology to optimize the identification of time series through parametric models, applying it to the case of sunspot series. We employ high precision computation of system identification algorithms, and use recursive least squares processing and ARMAX (AutoRegressive Moving Average with eXogenous inputs) parametric modelling. This methodology could be very useful when high precision mathematical modelling of complex dynamic systems is required. After explaining the proposed heuristics and the tuning of its parameters, we show the results we have found for several solar series using different implementations. Thus, we demonstrate how the result precision improves.
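
    A minimal Python sketch of recursive least squares identifying a simple AR(2) model of a synthetic series; the full ARMAX structure and the adaptive parallel heuristic of the paper are not reproduced.

        # Minimal sketch: recursive least squares (RLS) estimating AR(2) coefficients of a
        # synthetic series; the series and true coefficients are invented placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        true_a = np.array([1.5, -0.7])                 # true AR coefficients
        y = np.zeros(500)
        for t in range(2, 500):
            y[t] = true_a[0] * y[t - 1] + true_a[1] * y[t - 2] + 0.1 * rng.standard_normal()

        theta = np.zeros(2)                            # parameter estimate [a1, a2]
        P = np.eye(2) * 1000.0                         # covariance of the estimate
        lam = 0.99                                     # forgetting factor
        for t in range(2, 500):
            phi = np.array([y[t - 1], y[t - 2]])       # regressor
            k = P @ phi / (lam + phi @ P @ phi)        # gain
            theta = theta + k * (y[t] - phi @ theta)   # update estimate
            P = (P - np.outer(k, phi @ P)) / lam       # update covariance
        print("estimated AR coefficients:", np.round(theta, 3))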

  10. Riddle of depleted uranium

    International Nuclear Information System (INIS)

    Depleted Uranium (DU) is the waste product of uranium enrichment from the manufacturing of fuel rods for nuclear reactors in nuclear power plants and nuclear power ships. DU may also result from the reprocessing of spent nuclear reactor fuel. Potentially DU has both chemical and radiological toxicity, with two important target organs being the kidney and the lungs. DU is made into a metal and, due to its availability, low price, high specific weight, density and melting point, as well as its pyrophoricity, it has a wide range of civilian and military applications. Following the use of DU in recent years, reports have appeared in the press on health hazards that are alleged to be due to DU. In this paper the properties, applications, and potential environmental and health effects of DU are briefly reviewed

  11. Depleted uranium: Metabolic disruptor?

    International Nuclear Information System (INIS)

    The presence of uranium in the environment can lead to long-term contamination of the food chain and of water intended for human consumption and thus raises many questions about the scientific and societal consequences of this exposure on population health. Although the biological effects of chronic low-level exposure are poorly understood, results of various recent studies show that contamination by depleted uranium (DU) induces subtle but significant biological effects at the molecular level in organs including the brain, liver, kidneys and testicles. For the first time, it has been demonstrated that DU induces effects on several metabolic pathways, including those metabolizing vitamin D, cholesterol, steroid hormones, acetylcholine and xenobiotics. This evidence strongly suggests that DU might well interfere with many metabolic pathways. It might thus contribute, together with other man-made substances in the environment, to increased health risks in some regions. (authors)

  12. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    Directory of Open Access Journals (Sweden)

    Górny Z.

    2015-04-01

    Full Text Available Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of the rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the methodology and the results of exploratory research. It also discloses the methodology used in the creation of a knowledge base.
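
    A minimal Python sketch of rough-set lower and upper approximations over a toy decision table; the attributes, values and target class are invented and only illustrate the mechanism behind a rule-based knowledge base of the kind described above.

        # Minimal sketch of rough-set lower/upper approximations for a toy decision table.
        # Each row: (condition attributes) -> decision (required property achieved?).
        table = [
            ({"temp": "high", "time": "short"}, "yes"),
            ({"temp": "high", "time": "short"}, "no"),    # inconsistent with the row above
            ({"temp": "high", "time": "long"},  "yes"),
            ({"temp": "low",  "time": "long"},  "no"),
            ({"temp": "low",  "time": "short"}, "no"),
        ]

        def blocks(rows):
            """Group object indices into indiscernibility classes over the conditions."""
            out = {}
            for i, (cond, _) in enumerate(rows):
                out.setdefault(tuple(sorted(cond.items())), []).append(i)
            return out.values()

        target = {i for i, (_, d) in enumerate(table) if d == "yes"}
        lower, upper = set(), set()
        for block in blocks(table):
            block = set(block)
            if block <= target:
                lower |= block          # certainly "yes": block lies fully inside the target
            if block & target:
                upper |= block          # possibly "yes": block overlaps the target
        print("lower approximation:", sorted(lower))
        print("upper approximation:", sorted(upper))
        print("boundary (uncertain) objects:", sorted(upper - lower))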

  13. The New MCNP6 Depletion Capability

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael Lorne [Los Alamos National Laboratory; James, Michael R. [Los Alamos National Laboratory; Hendricks, John S. [Los Alamos National Laboratory; Goorley, John T. [Los Alamos National Laboratory

    2012-06-19

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.

  14. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  15. The Case in Case-Based Design of Educational Software: A Methodological Interrogation

    Science.gov (United States)

    Khan, S.

    2008-01-01

    This research assessed the value of case study methodology in the design of an educational computer simulation. Three sources of knowledge were compared to assess the value of case study: practitioner and programmer knowledge, disciplinary knowledge, and knowledge obtained from a case study of teacher practice. A retrospective analysis revealed…

  16. Methodology for Evaluating an Adaptation of Evidence-Based Drug Abuse Prevention in Alternative Schools

    Science.gov (United States)

    Hopson, Laura M.; Steiker, Lori K. H.

    2008-01-01

    The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youths in alternative schools. This study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods…

  17. Review of Seismic Evaluation Methodologies for Nuclear Power Plants Based on a Benchmark Exercise

    International Nuclear Information System (INIS)

    quantification of the effect of different analytical approaches on the response of the piping system under single and multi-support input motions), the spent fuel pool (to estimate the sloshing frequencies, maximum wave height and spilled water amount, and predict free surface evolution), and the pure water tank (to predict the observed buckling modes of the pure water tank). Analyses of the main results include comparison between different computational models, variability of results among participants, and comparison of analysis results with recorded ones. This publication addresses state of the practice for seismic evaluation and margin assessment methodologies for SSCs in NPPs based on the KARISMA benchmark exercise. As such, it supports and complements other IAEA publications with respect to seismic safety of new and existing nuclear installations. It was developed within the framework of International Seismic Safety Centre activities. It provides detailed guidance on seismic analysis, seismic design and seismic safety re-evaluation of nuclear installations and will be of value to researchers, operating organizations, regulatory authorities, vendors and technical support organizations

  18. The Toxicity of Depleted Uranium

    OpenAIRE

    Wayne Briner

    2010-01-01

    Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a c...

  19. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    Science.gov (United States)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology is constituted by four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists in a GIS-based overlap, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island) which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on the user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.

  20. Depleted zinc: Properties, application, production

    International Nuclear Information System (INIS)

    The addition of ZnO, depleted in the Zn-64 isotope, to the water of boiling water nuclear reactors lessens the accumulation of Co-60 on the reactor interior surfaces, reduces radioactive wastes and increases the reactor service-life because of the inhibitory action of zinc on inter-granular stress corrosion cracking. To the same effect depleted zinc in the form of acetate dihydrate is used in pressurized water reactors. Gas centrifuge isotope separation method is applied for production of depleted zinc on the industrial scale. More than 20 years of depleted zinc application history demonstrates its benefits for reduction of NPP personnel radiation exposure and combating construction materials corrosion.

  1. Depleted zinc: Properties, application, production.

    Science.gov (United States)

    Borisevich, V D; Pavlov, A V; Okhotina, I A

    2009-01-01

    The addition of ZnO, depleted in the Zn-64 isotope, to the water of boiling water nuclear reactors lessens the accumulation of Co-60 on the reactor interior surfaces, reduces radioactive wastes and increases the reactor service-life because of the inhibitory action of zinc on inter-granular stress corrosion cracking. To the same effect depleted zinc in the form of acetate dihydrate is used in pressurized water reactors. Gas centrifuge isotope separation method is applied for production of depleted zinc on the industrial scale. More than 20 years of depleted zinc application history demonstrates its benefits for reduction of NPP personnel radiation exposure and combating construction materials corrosion.

  2. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  3. A Novel Depletion-Mode MOS Gated Emitter Shorted Thyristor

    Institute of Scientific and Technical Information of China (English)

    张鹤鸣; 戴显英; 张义门; 马晓华; 林大松

    2000-01-01

    A novel MOS-gated thyristor, the depletion-mode MOS gated emitter shorted thyristor (DMST), and its two structures are proposed. In the DMST, the channel of the depletion-mode MOS makes the thyristor emitter-base junction inherently shorted. The operation of the device is controlled by the interruption and recovery of the depletion-mode MOS P channel. Its favourable properties have been demonstrated by 2-D numerical simulations and by tests on the fabricated chips.

  4. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    Science.gov (United States)

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. It is therefore of significant importance to evaluate the reliability of the available data when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).
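
    A minimal Python sketch of a flat, weighted reliability score for a single ecotoxicological study; the criteria, weights, scores and class boundaries are invented, and the paper's hierarchical fuzzy aggregation is not reproduced.

        # Minimal sketch of a weighted multi-criteria reliability score for one study;
        # criteria, weights, scores and class boundaries are invented placeholders.
        criteria = {
            # criterion: (expert weight, normalised score in [0, 1] for this study)
            "test organism documented": (0.25, 1.0),
            "exposure duration reported": (0.20, 0.5),
            "control response acceptable": (0.30, 1.0),
            "chemical purity reported": (0.25, 0.0),
        }

        total_weight = sum(w for w, _ in criteria.values())
        reliability = sum(w * s for w, s in criteria.values()) / total_weight

        # Assumed class boundaries mimicking a reliable / usable / unreliable split.
        if reliability >= 0.8:
            label = "reliable"
        elif reliability >= 0.5:
            label = "usable with restrictions"
        else:
            label = "unreliable"
        print(f"reliability score = {reliability:.2f} -> {label}")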

  5. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    Science.gov (United States)

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.

  6. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have used the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  7. A contract-based methodology for aircraft electric power system design

    OpenAIRE

    Nuzzo, P; H. Xu; Ozay, N; Finn, JB; Sangiovanni-Vincentelli, AL; Murray, RM; Donzé, A; Seshia, SA

    2014-01-01

    In an aircraft electric power system, one or more supervisory control units actuate a set of electromechanical switches to dynamically distribute power from generators to loads, while satisfying safety, reliability, and real-time performance requirements. To reduce expensive redesign steps, this control problem is generally addressed by minor incremental changes on top of consolidated solutions. A more systematic approach is hindered by a lack of rigorous design methodologies that allow estim...

  8. Risk-based methodology for parameter calibration of a reservoir flood control model

    OpenAIRE

    Bianucci, P.; A. Sordo-Ward; Pérez, J. I.; J. García-Palacios; L. Mediero; L. Garrote

    2013-01-01

    Flash floods are of major relevance in natural disaster management in the Mediterranean region. In many cases, the damaging effects of flash floods can be mitigated by adequate management of flood control reservoirs. This requires the development of suitable models for optimal operation of reservoirs. A probabilistic methodology for calibrating the parameters of a reservoir flood control model (RFCM) that takes into account the stochastic variability of flood events is presented. This study a...

  9. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimizing model for concurrent parameter and tolerance design is proposed in which the response mean equals its target under the target-is-best criterion. The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is illustrated in order to compare the differences between the developed model and parameter design with minimum variance. The results show ...

  10. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    The right selection of and investment in PLM components increase business advantages. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and medium enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. A PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  11. Groundwater Interactive: Interdisciplinary Web-Based Software Incorporating New Learning Methodologies and Technologies

    OpenAIRE

    Mendez, Eduardo

    2002-01-01

    Groundwater related courses are offered through several colleges at Virginia Tech. These classes enroll a diverse group of students with varied academic backgrounds and educational levels. Though these classes emphasize different aspects of groundwater resources, they lack a unified approach in instructional materials and learning methodologies for knowledge they do share. The goals of this research are to lessen the impact of variable student backgrounds and to better integrate the course...

  12. Open Space Evaluation Methodology and Three Dimensional Evaluation Model as a Base for Sustainable Development Tracking

    Directory of Open Access Journals (Sweden)

    Melita Rozman Cafuta

    2015-10-01

    Full Text Available Sustainable development, as a concept of controlled development, is a management characteristic. Adaptation to progress is important to achieve sustainability. The research focus here is on developing an evaluation methodology for determining the characteristics of urban open space. A method was designed for use in the comparative analysis of environmental perception evaluation between different time sequences. It allows us to compare results before and after spatial interventions or to track spatial development over time. The newly-developed SEC model (suitable for everyone, environmentally-accepted, and cost-effective) was an essential element used in the research methodology. The model was designed using the systematic principle, the top-down approach, and the decomposition method. Three basic dimensions were divided into six factors. Factors were divided into eighteen indicators that are either quantitatively or qualitatively assessed. Indicators were divided into several aspects. An instrument (questionnaire) was developed to support the evaluation methodology of urban open space characteristics. Every aspect corresponds to a question in the questionnaire. The applicability of the SEC model was demonstrated in two case studies. Evaluation took place during two different time sequences, once during the day-time and once during the night. The obtained results provide useful information about the current spatial situation needed for preparing a sustainable development strategy.

  13. Methodology based on Geographic Information Systems for biomass logistics and transport optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Perpina, C.; Alfonso, D.; Perez-Navarro, A.; Penalvo, E.; Vargas, C.; Cardenas, R. [Instituto de Ingenieria Energetica, Universidad Politecnica de Valencia, Camino de Vera s/n Edificio 8E, 2a, 46022 Valencia (Spain)

    2009-03-15

    The aim of this study is to contribute by outlining a procedure for achieving an optimal use of agricultural and forest residue biomass. In this regard, it develops and applies a methodology focused on logistics and transport strategies that can be used to locate a network of bioenergy plants around the region. This methodology was developed using a Geographic Information System and it provides information on the spatial distribution of biomass residues. This is accomplished by taking into consideration the amount of residue left within a rectangle with an area of 1 km², and making a regular grid overlap for the region under consideration. The centroid of each square will be evaluated and classified as 'origin' (source of biomass collection) or 'destination' (potential location of the bioenergy plant) depending on technical, economic, environmental and social constraints. The study focuses on mapping potential sites for tapping biomass energy and optimal locations for bioenergy plants. To identify and map optimal locations it is necessary to evaluate the time, distance and transport costs involved in the road transportation of biomass by means of a network analysis. The methodology was applied in the Valencian Community because the intense agricultural, agro-alimentary and timber activity in the region means there is a high potential for biomass. (author)

  14. Optimization design of the stratospheric airship's power system based on the methodology of orthogonal experiment

    Institute of Scientific and Technical Information of China (English)

    Jian LIU; Quan-bao WANG; Hai-tao ZHAO; Ji-an CHEN; Ye QIU; Deng-ping DUAN

    2013-01-01

    The optimization design of the power system is essential for stratospheric airships with paradoxical requirements of high reliability and low weight. The methodology of orthogonal experiment is presented to deal with the problem of the optimization design of the airship's power system. Mathematical models of the solar array, regenerative fuel cell, and power management subsystem (PMS) are presented. The basic theory of the method of orthogonal experiment is discussed, and the selection of factors and levels of the experiment and the choice of the evaluation function are also revealed. The proposed methodology is validated in the optimization design of the power system of the ZhiYuan-2 stratospheric airship. Results show that the optimal configuration is easily obtained through this methodology. Furthermore, the optimal configuration and three sub-optimal configurations are in the Pareto frontier of the design space. Sensitivity analyses for the weight and reliability of the airship's power system are presented.
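
    A minimal Python sketch of orthogonal-experiment range analysis on a standard L9(3^4) array; the four factors, their levels and the evaluation-function values below are invented placeholders, not the airship data.

        # Minimal sketch of range analysis over a standard L9(3^4) orthogonal array;
        # the response values per run are invented (smaller = better, e.g. system weight).
        import numpy as np

        # Standard L9(3^4) orthogonal array: 9 runs x 4 three-level factors (levels 0-2).
        L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                       [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                       [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
        y = np.array([8.2, 7.5, 7.9, 7.1, 6.8, 7.4, 7.0, 7.2, 6.9])

        for f in range(L9.shape[1]):
            level_means = [y[L9[:, f] == lvl].mean() for lvl in range(3)]
            best = int(np.argmin(level_means))          # smaller-is-better response
            spread = max(level_means) - min(level_means)  # factor influence (range)
            print(f"factor {f}: level means {np.round(level_means, 2)}, "
                  f"best level {best}, range {spread:.2f}")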

  15. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available Uniaxial compressive strength (UCS) deals with materials' ability to withstand axially-directed pushing forces and is especially considered to be among rock materials' most important mechanical properties. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations were thus proposed for predicting UCS as a function of rocks' index properties. An analytical hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data-sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and the index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative to quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing


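    A minimal Python sketch of the regression step alone: UCS is fitted as a linear function of two index properties. The predictor names and all numbers are invented, and the AHP weighting stage of the methodology is not reproduced.

        # Minimal sketch: multiple linear regression of UCS on two hypothetical index
        # properties (point-load index Is50 and porosity); all values are invented.
        import numpy as np

        point_load = np.array([2.1, 3.4, 4.0, 2.8, 5.1, 3.9, 4.6])   # MPa (hypothetical)
        porosity   = np.array([8.0, 5.5, 4.2, 7.1, 2.9, 4.8, 3.5])   # %   (hypothetical)
        ucs        = np.array([48., 72., 85., 60., 110., 80., 95.])  # MPa (hypothetical)

        A = np.column_stack([np.ones_like(ucs), point_load, porosity])
        coef, *_ = np.linalg.lstsq(A, ucs, rcond=None)
        pred = A @ coef
        r = np.corrcoef(pred, ucs)[0, 1]
        print("UCS ≈ {:.1f} + {:.1f}*Is50 {:+.1f}*porosity".format(*coef))
        print(f"correlation between predicted and measured UCS: r = {r:.2f}")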

  16. A GIS-based methodology to delineate potential areas for groundwater development: a case study from Kathmandu Valley, Nepal

    Science.gov (United States)

    Pandey, Vishnu P.; Shrestha, Sangam; Kazama, Futaba

    2013-06-01

    For effective planning of activities aimed at recovering aquifer depletion and maintaining the health of the groundwater ecosystem, estimates of the spatial distribution of groundwater storage volume would be useful. The estimated volume, if analyzed together with other hydrogeologic characteristics, may help delineate potential areas for groundwater development. This study proposes a GIS-based ARC model to delineate potential areas for groundwater development, where 'A' stands for groundwater availability, 'R' for groundwater release potential of the soil matrix, and 'C' for cost of groundwater development. The model is illustrated with a case of the Kathmandu Valley in Central Nepal, where active discussions are going on to develop and implement groundwater management strategies. The study results show that shallow aquifers have high groundwater storage potential (compared to the deep) and favorable areas for groundwater development are concentrated in some particular areas in the shallow and deep aquifers. The distribution of groundwater storage and potential areas for groundwater development are then mapped using GIS.

  17. Post flood damage data collection and assessment in Albania based on DesInventar methodology

    Science.gov (United States)

    Toto, Emanuela; Massabo, Marco; Deda, Miranda; Rossello, Laura

    2015-04-01

    In 2013 a collection of disaster loss data based on DesInventar was implemented in Albania. The DesInventar system consists of a methodology and software tool that lead to the systematic collection, documentation and analysis of loss data on disasters. The main sources of information about disasters used for the Albanian database were the Albanian Ministry of Internal Affairs, the National Library and the State archive. Specifically for floods, the database created contains nearly 900 datasets covering a period of 148 years (from 1865 to 2013). The data are georeferenced on the administrative units of Albania: Regions, Provinces and Municipalities. The datasets describe the events by reporting the date of occurrence, the duration, the localization in administrative units and the cause. Additional information regards the effects and damage that the event caused on people (deaths, injured, missing, affected, relocated, evacuated, victims) and on houses (houses damaged or destroyed). Other quantitative indicators are the losses in local currency or US dollars, the damage on roads, the crops affected, the lost cattle and the involvement of social elements over the territory such as education and health centers. Qualitative indicators simply register the sectors (e.g. transportation, communications, relief, agriculture, water supply, sewerage, power and energy, industries, education, health sector, other sectors) that were affected. Through queries and analysis of the data collected it was possible to identify the most affected areas, the economic loss, the damage in agriculture, the houses and people affected and many other variables. The most vulnerable Regions for past floods in Albania were identified, as well as the rivers that cause the most damage in the country. Other analyses help to estimate the damage and losses during the main flood events of recent years, which occurred in 2010 and 2011, and to recognize the most affected sectors. The database was

  18. The Toxicity of Depleted Uranium

    Directory of Open Access Journals (Sweden)

    Wayne Briner

    2010-01-01

    Full Text Available Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a clear and defined set of symptoms. Chronic low-dose, or subacute, exposure to depleted uranium alters the appearance of milestones in developing organisms. Adult animals that were exposed to depleted uranium during development display persistent alterations in behavior, even after cessation of depleted uranium exposure. Adult animals exposed to depleted uranium demonstrate altered behaviors and a variety of alterations to brain chemistry. Despite its reduced level of radioactivity, evidence continues to accumulate that depleted uranium, if ingested, may pose a radiologic hazard. The current state of knowledge concerning DU is discussed.

  19. Depletable resources and the economy.

    NARCIS (Netherlands)

    Heijman, W.J.M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts in relation with resource depletion are discussed: steady state, ti

  20. Methodology for Definition of Yellow Fever Priority Areas, Based on Environmental Variables and Multiple Correspondence Analyses

    OpenAIRE

    Eduardo Stramandinoli Moreno; Rita de Cássia Barradas Barata

    2012-01-01

    Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Pau...

  1. A Methodology for Platform Based High—Level System—on—Chip Verification

    Institute of Scientific and Technical Information of China (English)

    GAOFeng; LIUPeng; YAOQingdong

    2003-01-01

    The time-to-market challenge has increased the need for shortening the co-verification time in system-on-chip development. In this article, a new methodology for high-level hardware/software co-verification is introduced. With the help of the real-time operating system, the application program can easily be migrated from the software simulator to the hardware emulation board. The hierarchical architecture can be used to separate the application program from the implementation of the platform during the verification process. The high-level verification platform has been successfully used in developing an HDTV decoding chip.

  2. A Methodological Review for the Analysis of Divide and Conquer Based Sorting/ Searching Algorithms

    Directory of Open Access Journals (Sweden)

    Deepak Abhyankar

    2011-09-01

    Full Text Available This paper develops a practical methodology for the analysis of sorting/searching algorithms. To achieve this objective, an analytical study of Quicksort and of the searching problem was undertaken. This work explains that asymptotic analysis can be misleading if applied carelessly. The study provides a fresh insight into the working of Quicksort and binary search, and also presents an exact analysis of Quicksort. Our study finds that asymptotic analysis is a kind of approximation and may hide many useful facts. It was shown that infinitely many inefficient algorithms can be placed in the same asymptotic class as a few efficient algorithms when the asymptotic approach is used.
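
    A minimal Python sketch in the spirit of the entry above: it counts the comparisons that Quicksort and binary search actually perform on a random input and sets them beside the usual asymptotic yardsticks.

        # Minimal sketch: count comparisons made by Quicksort and binary search and
        # compare them with the asymptotic yardsticks n*log2(n) and log2(n).
        import math
        import random

        def quicksort(a, counter):
            if len(a) <= 1:
                return a
            pivot, rest = a[0], a[1:]
            left, right = [], []
            for x in rest:
                counter[0] += 1                       # one comparison of x with the pivot
                (left if x < pivot else right).append(x)
            return quicksort(left, counter) + [pivot] + quicksort(right, counter)

        def binary_search(a, key, counter):
            lo, hi = 0, len(a) - 1
            while lo <= hi:
                mid = (lo + hi) // 2
                counter[0] += 1                       # one probe comparison per iteration
                if a[mid] == key:
                    return mid
                if a[mid] < key:
                    lo = mid + 1
                else:
                    hi = mid - 1
            return -1

        n = 1024
        data = random.sample(range(10 * n), n)
        qs_cmp = [0]
        sorted_data = quicksort(data, qs_cmp)
        bs_cmp = [0]
        binary_search(sorted_data, sorted_data[n // 3], bs_cmp)

        print(f"Quicksort:     {qs_cmp[0]} comparisons vs n*log2(n) ≈ {n * math.log2(n):.0f}")
        print(f"Binary search: {bs_cmp[0]} comparisons vs log2(n) = {math.log2(n):.0f}")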

  3. Methodological issues in analyzing small populations using CCHS cycles based on the official language minority studies.

    Science.gov (United States)

    Makvandi, Ewa; Bouchard, Louise; Bergeron, Pierre-Jerôme; Sedigh, Golnaz

    2013-01-01

    Statistical analyses for small populations or small domains of interest can be challenging. To obtain reliable estimates, only very large surveys such as the Canadian Community Health Survey can be considered. However, despite its good geographical and temporal coverage, the analysis of small populations in smaller regions (e.g., health regions) and in regards to specific health issues remains challenging. We will look at the methodological issues in analysis of small populations in relation to sampling and non-sampling survey errors that affect the precision and accuracy of the estimates. Francophone minorities in Canada will be used to illustrate the issues throughout the paper. PMID:24300323

  4. A problem-based approach to teaching research methodology to medical graduates in Iran

    Directory of Open Access Journals (Sweden)

    Mehrdad Jalalian Hosseini

    2009-08-01

    Full Text Available Physicians are reticent to participate in research projects for a variety of reasons. Facilitating the active involvement of doctors in research projects is a high priority for the Iranian Blood Transfusion Organization (IBTO). A one-month training course on research methodology was conducted for a group of physicians in Mashhad, in northeast Iran. The participants were divided in ten groups. They prepared a research proposal under the guidance of a workshop leader. The quality of the research proposals, which were prepared by all participants, went beyond our expectations. All of the research proposals were relevant to blood safety. In this brief report we describe our approach.

  5. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  6. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    Science.gov (United States)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  7. A Methodology for Mapping Meanings in Text-Based Sustainability Communication

    Directory of Open Access Journals (Sweden)

    Mark Brown

    2013-06-01

    Full Text Available In moving society towards more sustainable forms of consumption and production, social learning must play an important role. Making the assumption that it occurs as a consequence of changes in understanding, this article presents a methodology for mapping meanings in sustainability communication texts. The methodology uses techniques from corpus linguistics and framing theory. Two large databases of text were constructed by copying material down from the websites of two different groups of social actors: (i) environmental NGOs and (ii) British green business, and saving it as .txt files. The findings on individual words show that the NGOs and business use them very differently. Focusing on words expressing concern for the natural environment, it is proposed that the two actors also conceptualize their concern differently. Green business's cognitive system of concern has two well-developed frames: good intentions and risk management. However, three frames (concern for the natural environment, perception of the damage, and responsibility) are light on detail. In contrast, within the NGOs' system of concern, the frames of concern for the natural environment, perception of the damage and responsibility contain words making detailed representations.

  8. Preparation of guides, on mechanics of fluids, for the physics teaching based on the investigatory methodology

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Loreto Mora; Buzzo, Ricardo; Martinez-Mardones, Javier; Romero, Angel [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso Av. Brasil 2950, Valparaiso (Chile)], E-mail: jmartine@ucv.cl

    2008-11-01

    The challenges in the present educational reform emphasize the professional character of the teacher in the planning of classes, the execution of activities and the evaluation of learning. A set of planned activities no more makes a science class than a pile of bricks makes a house: if the students' preconceptions, the daily realities they face and the expectations they bring to the science class are not taken into account, the proposed objectives cannot be achieved. The well-known investigatory method applied to science education approaches conceptual content through easily reproduced practical activities that are effectively guided towards the topics to be taught. The OPPS guides (Operation Primary Physics Science) from Louisiana University are excellent examples of the application of this methodology, both in the material intended for the students and in the material guiding the learning activities (the so-called class Leader). This international experience, within the framework of the Plans and Programs of the Ministry of Education of Chile (MINEDUC), is the main axis of this work, which considers the production of guides with this methodology covering the contents of a unit of the common physics plan for the third grade of high school.

  9. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    In this work, a Particle Swarm Optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology, which allows the use of flexible intervals between maintenance interventions instead of the usual fixed periods, allows a better adaptation of scheduling in order to deal with the failure rates of components under aging. Because of this flexibility, however, the planning of preventive maintenance becomes a difficult task. Motivated by the fact that PSO has proved to be very competitive compared to other optimization tools, this work investigates the use of PSO as an alternative optimization tool. Considering that PSO works in a real and continuous space, it is a challenge to use it for discrete optimization, in which the schedule may comprise a variable number of maintenance interventions. The PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as: the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves, the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding the optimum preventive maintenance policies for the HPIS. (author)
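
    A minimal Python sketch of a continuous-space particle swarm optimizer on a toy cost function; in the paper each particle position encodes a flexible preventive-maintenance schedule for the HPIS and the cost combines maintenance cost and reliability, which is not reproduced here.

        # Minimal sketch of a particle swarm optimizer; the cost function is a toy
        # stand-in for "maintenance cost + unreliability penalty".
        import numpy as np

        rng = np.random.default_rng(42)

        def cost(x):
            return np.sum((x - 3.0) ** 2, axis=-1)

        n_particles, dim, iters = 30, 5, 200
        w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration constants

        pos = rng.uniform(0.0, 10.0, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = cost(pbest)
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 0.0, 10.0)       # keep positions inside allowed bounds
            val = cost(pos)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], val[improved]
            gbest = pbest[pbest_val.argmin()].copy()

        print("best position found:", np.round(gbest, 3))
        print("best cost:", float(pbest_val.min()))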

  10. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    Science.gov (United States)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.

  11. Artificial Neural Network based γ-hadron segregation methodology for TACTIC telescope

    Energy Technology Data Exchange (ETDEWEB)

    Dhar, V.K., E-mail: veer@barc.gov.in [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Tickoo, A.K.; Koul, M.K.; Koul, R. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Dubey, B.P. [Electronics and Instrumentation Services Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Rannot, R.C.; Yadav, K.K.; Chandra, P.; Kothari, M.; Chanchalani, K.; Venugopal, K. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India)

    2013-04-21

    The sensitivity of a Cherenkov imaging telescope depends strongly on the rejection of cosmic-ray background events. The methods that have been used to segregate the γ-rays coming from the source from the cosmic-ray background include Supercuts/Dynamic Supercuts, maximum likelihood classifiers, kernel methods, fractals, wavelets and random forests. While the segregation potential of the neural network classifier has been investigated in the past with modest results, the main purpose of this paper is to study the γ/hadron segregation potential of various ANN algorithms, some of which are expected to be more powerful in terms of better convergence and lower error than the commonly used backpropagation algorithm. The results obtained suggest that the Levenberg–Marquardt method outperforms all other methods in the ANN domain. Applying this ANN algorithm to ∼101.44 h of Crab Nebula data collected by the TACTIC telescope during November 10, 2005–January 30, 2006 yields an excess of ∼(1141±106) events with a statistical significance of ∼11.07σ, as against an excess of ∼(928±100) events with a statistical significance of ∼9.40σ obtained with the Dynamic Supercuts selection methodology. The main advantage accruing from the ANN methodology is that it is more effective at higher energies, which has allowed us to re-determine the Crab Nebula energy spectrum in the energy range ∼1–24 TeV.
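    As a rough illustration of the classification step, the sketch below trains a small multilayer perceptron on synthetic stand-ins for Cherenkov image parameters. scikit-learn offers no Levenberg–Marquardt training, so the 'lbfgs' solver is used here purely as a second-order stand-in, and the data are simulated, not TACTIC observations.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for image parameters (e.g. length, width, distance)
n = 5000
gamma  = rng.normal([0.25, 0.12, 0.60], [0.05, 0.03, 0.15], (n, 3))
hadron = rng.normal([0.35, 0.20, 0.70], [0.10, 0.08, 0.25], (n, 3))
X = np.vstack([gamma, hadron])
y = np.hstack([np.ones(n), np.zeros(n)])      # 1 = gamma, 0 = hadron

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs",
                    max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
```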

  12. Health monitoring methodology based on exergetic analysis for building mechanical systems

    International Nuclear Information System (INIS)

    Exergetic analysis is not often performed in the context of retrocommissioning (RCX); this research provides insight into the benefits of incorporating this approach. Data collected from a previously developed RCX test for an air handling unit (AHU) on a college campus are used in an advanced thermodynamic analysis. The operating data is analyzed using the first and second laws and retrofit design solutions are recommended for improved system performance; the second law analysis is particularly helpful because it requires few additional calculations or data collections. The thermodynamic methodology is extended to a building's cooling plant, which uses a vapor compression refrigeration cycle (VCRC) chiller. Existing chiller data collected for the design of automated fault detection and diagnosis methodology is used. As with the AHU analysis, the second law analysis locates irreversibilities that would not be determined from a first law analysis alone. Plant data representing both normal and faulty operation is used to develop a chiller model for assessing performance and health monitoring. Data is analyzed to determine the viability of health monitoring by performing an exergy analysis on existing data. Conclusions are drawn about the usefulness of exergetic analysis for improving system operations of energy intensive building mechanical systems.
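    A minimal sketch of the second-law bookkeeping involved: the specific flow exergy of an air stream relative to a dead state, assuming ideal-gas air with constant specific heats. The AHU temperatures and flow are illustrative, not the study's data.

```python
import math

def flow_exergy_air(T, p, T0=298.15, p0=101.325):
    """Specific flow exergy of air as an ideal gas [kJ/kg],
    relative to a dead state at T0 [K] and p0 [kPa]."""
    cp, R = 1.005, 0.287                      # kJ/(kg K)
    return cp * (T - T0) - T0 * (cp * math.log(T / T0) - R * math.log(p / p0))

m_dot = 2.5                                   # assumed supply-air mass flow, kg/s
ex_supply = flow_exergy_air(T=286.15, p=101.325)   # 13 C supply air
ex_return = flow_exergy_air(T=297.15, p=101.325)   # 24 C return air

print(f"supply-air specific exergy: {ex_supply:.3f} kJ/kg")
print(f"return-air specific exergy: {ex_return:.3f} kJ/kg")
print(f"exergy carried by the supply stream: {m_dot * ex_supply:.3f} kW")
```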

  13. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    International Nuclear Information System (INIS)

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen, which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different Countries of the E.C.

  14. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    International Nuclear Information System (INIS)

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30 Bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is used to determine the retail electricity price for the end users. The factor Risk Adjusted Return on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting an electricity tariff proposal to the regulatory authority. (author)
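    A minimal sketch of the two quantities involved, using common textbook definitions of CAPM and RAROC; the retailer figures are illustrative and not taken from the paper.

```python
def capm_return(r_f, beta, r_m):
    """Required return on capital from CAPM: r_f + beta * (r_m - r_f)."""
    return r_f + beta * (r_m - r_f)

def raroc(revenue, cost, expected_loss, economic_capital, r_f=0.06):
    """Risk-adjusted return on capital: risk-adjusted net income (including the
    risk-free return on the capital held) divided by the economic capital."""
    return (revenue - cost - expected_loss + r_f * economic_capital) / economic_capital

required = capm_return(r_f=0.06, beta=1.2, r_m=0.11)
achieved = raroc(revenue=12.0e6, cost=10.8e6, expected_loss=0.3e6, economic_capital=5.0e6)

print(f"CAPM required return: {required:.1%}")
print(f"RAROC achieved:       {achieved:.1%}")
print("tariff adequate" if achieved >= required else "tariff too low for the risk taken")
```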

  15. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimizing model of concurrent parameter and tolerance design is proposed in which the response mean equals its target (the target-is-best case). The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is presented to compare the developed model with parameter design based on minimum variance. The results show that the proposed method not only achieves robustness, but also greatly reduces cost. The objectives of high quality and low cost of product and process can be achieved simultaneously by the application of six sigma concurrent parameter and tolerance design.

  16. SYNTHESIS METHODOLOGY FOR ACTIVE ELEMENT USING IMPEDANCE-BASED BOND GRAPH METHODS

    Directory of Open Access Journals (Sweden)

    Nasta TANASESCU

    2004-12-01

    Full Text Available This paper introduces a synthesis methodology for active elements within systems that uses frequency response functions as a basis for describing the required behavior. The method is applicable in the design of a new system or in the retrofit of an existing system in which an active element is required or desired. The two basic principles of bond graph modeling are the "reticulation principle", which decomposes a physical system into elemental physical laws represented as network elements interacting through ports, and the "power postulate", which assembles the complete model through a network of power flows representing the exchange of energy between the elements. Moreover, the bond graph model leads to a rigorous definition of the structure of the associated dynamical equations.

  17. An Effective Approach Based on Response Surface Methodology for Predicting Friction Welding Parameters

    Science.gov (United States)

    Celik, Sare; Deniz Karaoglan, Aslan; Ersozlu, Ismail

    2016-03-01

    The joining of dissimilar metals is one of the essential needs of industry. Joints between alloy steel and plain carbon steel are used in production because they reduce raw material cost. The friction welding process parameters such as friction pressure, friction time, upset pressure, upset time and rotating speed play the major roles in determining the strength and microstructure of the joints. In this study, response surface methodology (RSM), a well-known design-of-experiments approach, is used to model the mathematical relation between the responses (tensile strength and maximum temperature) and the friction welding parameters with a minimum number of experiments. The results show that RSM is an effective method for developing models and making predictions for this type of problem.
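    As a sketch of the RSM step, the code below fits a full quadratic response surface to a hypothetical coded central-composite-style data set and predicts tensile strength at a new setting; the design points and strength values are invented for illustration.

```python
import numpy as np

# Coded settings: friction pressure, friction time, rotating speed (hypothetical runs)
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [-1.5, 0, 0], [ 1.5, 0, 0], [ 0, -1.5, 0], [ 0, 1.5, 0],
    [ 0, 0, -1.5], [ 0, 0, 1.5], [ 0, 0, 0],  [ 0, 0, 0],
], dtype=float)
y = np.array([480, 512, 495, 540, 470, 505, 488, 531,
              472, 535, 498, 528, 515, 500, 520, 518], dtype=float)   # tensile strength, MPa

def design_matrix(X):
    """Full quadratic RSM model: intercept, linear, two-way interaction and square terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
new_point = np.array([[0.5, 0.5, 0.0]])
pred = design_matrix(new_point) @ beta

print("fitted coefficients:", np.round(beta, 1))
print("predicted strength at (0.5, 0.5, 0.0):", round(float(pred[0]), 1), "MPa")
```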

  18. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Karandikar, R.G.; Khaparde, S.A.; Kulkarni, S.V. [Electrical Engineering Department, Indian Institute of Technology Bombay, Mumbai 400 076 (India)

    2007-12-15

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30 Bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is used to determine the retail electricity price for the end users. The factor Risk Adjusted Return on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting an electricity tariff proposal to the regulatory authority. (author)

  19. Towards a Tool-based Development Methodology for Pervasive Computing Applications

    CERN Document Server

    Cassou, Damien; Consel, Charles; Balland, Emilie; 10.1109/TSE.2011.107

    2012-01-01

    Despite much progress, developing a pervasive computing application remains a challenge because of a lack of conceptual frameworks and supporting tools. This challenge involves coping with heterogeneous devices, overcoming the intricacies of distributed systems technologies, working out an architecture for the application, encoding it in a program, writing specific code to test the application, and finally deploying it. This paper presents a design language and a tool suite covering the development life-cycle of a pervasive computing application. The design language allows developers to define a taxonomy of area-specific building blocks, abstracting over their heterogeneity. This language also includes a layer to define the architecture of an application, following an architectural pattern commonly used in the pervasive computing domain. Our underlying methodology assigns roles to the stakeholders, providing separation of concerns. Our tool suite includes a compiler that takes design artifacts written in our language as...

  20. Challenges in implementing a Planetary Boundaries based Life-Cycle Impact Assessment methodology

    DEFF Research Database (Denmark)

    Ryberg, Morten; Owsianiak, Mikolaj; Richardson, Katherine;

    2016-01-01

    Impacts on the environment from human activities are now threatening to exceed thresholds for central Earth System processes, potentially moving the Earth System out of the Holocene state. To avoid such consequences, the concept of Planetary Boundaries was defined in 2009, and updated in 2015, for a number of processes which are essential for maintaining the Earth System in its present state. Life-Cycle Assessment was identified as a suitable tool for linking human activities to the Planetary Boundaries. However, to facilitate proper use of Life-Cycle Assessment for non-global environmental... of resolving the challenges and developing such methodology is discussed. The challenges are related to technical issues, i.e., modelling and including the Earth System processes and their control variables as impact categories in Life-Cycle Impact Assessment, and to theoretical considerations with respect...

  1. [Nursing care systematization according to the nurses' view: a methodological approach based on grounded theory].

    Science.gov (United States)

    de Medeiros, Ana Lúcia; dos Santos, Sérgio Ribeiro; de Cabral, Rômulo Wanderley Lima

    2012-09-01

    This study was aimed at understanding, from the nurses' perspective, the experience of going through the Systematization of Nursing Care (SNC) in an obstetric service unit. We used grounded theory as the theoretical and methodological framework. The subjects of this study were thirteen nurses from a public hospital in the city of João Pessoa, in the state of Paraíba. The data analysis resulted in the following phenomenon: "perceiving SNC as a working method that organizes, directs and improves the quality of care by bringing visibility and providing security for the nursing staff". The nurses expressed the extent of their knowledge about the SNC as experienced in obstetrics, and considered the nursing process a decision-making process that guides nurses' reasoning in planning nursing care in obstetrics. It was concluded that nurses perceive the SNC as an instrument of theoretical-practical articulation leading to personalized assistance.

  2. DTC Based Induction Motor Speed Control Using 10-Sector Methodology for Torque Ripple Reduction

    Science.gov (United States)

    Pavithra, S.; Dinesh Krishna, A. S.; Shridharan, S.

    2014-09-01

    A direct torque control (DTC) drive allows direct and independent control of flux linkage and electromagnetic torque by the selection of optimum inverter switching modes. It is a simple signal-processing method that gives excellent dynamic performance, and neither coordinate transformation nor voltage decoupling is required. However, the discrete inverter switching vectors available cannot always generate the exact stator voltage required to obtain the demanded electromagnetic torque and flux linkage. This results in ripples in the torque as well as the flux waveforms. In the present paper a torque ripple reduction methodology is proposed, in which the circular locus of the flux phasor is divided into 10 sectors instead of the six-sector division of the conventional DTC method. The basic DTC scheme and the 10-sector method are simulated and their performance compared. An analysis with increasing sector counts shows how the torque ripple varies as the number of sectors is increased.
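    A minimal sketch of the sector-selection idea, assuming sectors centred on the flux-angle origin as in the usual six-sector DTC lookup table; the 10-sector case simply uses narrower sectors, which lets the switching table pick voltage vectors that track the flux locus more closely.

```python
import math

def sector_index(flux_angle_rad, n_sectors):
    """Map the stator-flux angle to a sector index in 0 .. n_sectors-1.
    Sector 0 is assumed to be centred on 0 rad."""
    width = 2 * math.pi / n_sectors
    shifted = (flux_angle_rad + width / 2) % (2 * math.pi)
    return int(shifted // width)

angle = math.radians(95.0)                    # example stator-flux position
print("6-sector DTC  -> sector", sector_index(angle, 6))
print("10-sector DTC -> sector", sector_index(angle, 10))
```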

  3. A Combined Heuristic and Indicator-based Methodology for Design of Sustainable Chemical Process Plants

    DEFF Research Database (Denmark)

    Halim, Iskandar; Carvalho, Ana; Srinivasan, Rajagopalan;

    2011-01-01

    The current emphasis on sustainable production has prompted chemical plants to minimize raw material and energy usage without compromising on economics. While computer tools are available to assist in sustainability assessment, their applications are constrained to a specific domain of the design...... synthesis problem. This paper outlines a design synthesis strategy that integrates two computer methodologies – ENVOPExpert and SustainPro – for simultaneous generation, analysis, evaluation, and optimization of sustainable process alternatives. ENVOPExpert diagnoses waste sources, identifies alternatives......, and highlights trade-offs between environmental and economic objectives. This is complemented by SustainPro which evaluates the alternatives and screens them in-depth through indicators for profit and energy, water, and raw material usage. This results in accurate identification of the root causes...

  4. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud

    2010-10-04

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 pC/N, respectively. The d33 value is in agreement with that obtained from the Berlincourt method, which gave a d33 value of 500 pC/N. In addition, the d31 value is in agreement with the value obtained from the optical method, which gave a d31 value of 223 pC/V. These results suggest that the proposed method is a viable way to quickly estimate piezoelectric coefficients of bulk unclamped samples. © 2010 The Electrochemical Society.

  5. A methodology for defining shock tests based on shock response spectra and temporal moments

    Energy Technology Data Exchange (ETDEWEB)

    Cap, J.S.; Smallwood, D.O. [Sandia National Labs., Albuquerque, NM (United States). Mechanical and Thermal Environments Dept.

    1997-11-01

    Defining acceptable tolerances for a shock test has always been a problem due in large part to the use of Shock Response Spectra (SRS) as the sole description of the shock. While SRS do contain a wealth of information if one knows what to look for, it is commonly accepted that different agencies can generate vastly different time domain test inputs whose SRS all satisfy the test requirements within a stated tolerance band. At an even more basic level, the laboratory test specifications often fail to resemble the field environment even though the SRS appear to be similar. A concise means of bounding the time domain inputs would be of great benefit in reducing the variation in the resulting shock tests. This paper describes a methodology that uses temporal moments to improve the repeatability of shock test specifications.
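    A minimal sketch of temporal moments computed from a synthetic decaying-sinusoid shock record. The moment definition used here (time-weighted integrals of the squared signal, giving an energy integral, a central time and an RMS duration) is a common convention; the signal is illustrative, not a test specification.

```python
import numpy as np

def temporal_moments(x, dt):
    """Temporal moments m_i = sum(t**i * x(t)**2) * dt of a shock time history."""
    t = np.arange(len(x)) * dt
    m0, m1, m2 = (np.sum(t**i * x**2) * dt for i in range(3))
    tau = m1 / m0                             # central time (centroid)
    duration = np.sqrt(m2 / m0 - tau**2)      # RMS duration about the centroid
    return m0, tau, duration

dt = 1e-4
t = np.arange(0.0, 0.1, dt)
accel = 50.0 * np.exp(-40.0 * t) * np.sin(2 * np.pi * 200.0 * t)   # synthetic shock

energy, tau, dur = temporal_moments(accel, dt)
print(f"energy integral: {energy:.4f}  centroid: {tau*1e3:.2f} ms  RMS duration: {dur*1e3:.2f} ms")
```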

  6. A chemical status predictor. A methodology based on World-Wide sediment samples.

    Science.gov (United States)

    Gredilla, A; Fdez-Ortiz de Vallejuelo, S; de Diego, A; Arana, G; Stoichev, T; Amigo, J M; Wasserman, J C; Botello, A V; Sarkar, S K; Schäfer, J; Moreno, C; de la Guardia, M; Madariaga, J M

    2015-09-15

    As a consequence of the limited resources of underdeveloped countries and the limited interest of the developed ones, the assessment of the chemical quality of entire water bodies around the world is a utopia in the near future. The methodology described here may serve as a first approach for the fast identification of water bodies that do not meet the good chemical status demanded by the European Water Framework Directive (WFD). It also allows estimating the natural background (or reference values of concentration) of the areas under study using a simple criterion. The starting point is the calculation of the World-Wide Natural Background Levels (WWNBLs) and World-Wide Threshold Values (WWTVs), two indexes that depend on the concentration of seven elements present in sediments. These elements, As, Cd, Cr, Cu, Ni, Pb and Zn, have been selected taking into account the recommendations of the UNEP (United Nations Environment Programme) and USEPA (United States Environmental Protection Agency), which describe them as elements of concern with respect to environmental toxicity. The methodology has been exemplified in a case study that includes 134 sediment samples collected in 11 transitional water bodies from 7 different countries and 4 different continents. Six of the water bodies considered met the good chemical status demanded by the WFD. The rest of them exceeded the reference WWTVs for at least one of the elements. The estuaries of the Nerbioi-Ibaizabal (Basque Country) and Cavado (Portugal), the sea inlet of Río San Pedro (Spain), the Sepetiba Bay (Brazil) and the Yucateco lagoon (Mexico) belong to that group. PMID:26143082

  7. Expanding the Parameters for Excellence in Patient Assignments: Is Leveraging an Evidence-Data-Based Acuity Methodology Realistic?

    Science.gov (United States)

    Gray, Joel; Kerfoot, Karlene

    2016-01-01

    Finding the balance of equitable assignments continues to be a challenge for health care organizations seeking to leverage evidence-based leadership practices. Ratios and subjective acuity strategies for nurse-patient staffing continue to be the dominant approach in health care organizations. In addition to ratio-based assignments and acuity-based assignment models driven by financial targets, more emphasis on using evidence-based leadership strategies to manage and create science for effective staffing is needed. In particular, nurse leaders are challenged to increase the sophistication of management of patient turnover (admissions, discharges, and transfers) and integrate tools from Lean methodologies and quality management strategies to determine the effectiveness of nurse-patient staffing. PMID:26636229

  8. Application of backtracking algorithm to depletion calculations

    International Nuclear Information System (INIS)

    Based on the theory of the linear chain method for analytical depletion calculations, the burn-up matrix is decoupled by a divide-and-conquer strategy and linear chains with Markov characteristics are formed. The density, activity and decay heat of every nuclide in a chain can then be calculated from analytical solutions. Every possible reaction path of a nuclide must be considered during the linear chain construction process. To ensure calculation precision and efficiency, an algorithm is needed that can cover all the reaction paths of a nuclide and search the paths automatically according to the problem description and precision restrictions. Through analysis and comparison of several search algorithms, the backtracking algorithm was selected to search and calculate the linear chains using the Depth First Search (DFS) method. The depletion program can thus solve the depletion problem adaptively and with high fidelity. The solution space and time complexity of the program were analyzed. The newly developed depletion program was coupled with the Monte Carlo program MCMG-II to calculate the benchmark burn-up problem of the first core of the China Experimental Fast Reactor (CEFR). Initial verification and validation of the program were performed through this calculation. (author)
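    As an illustration of the chain-building idea, the sketch below enumerates linear chains from a simplified, hypothetical transmutation map by depth-first search with backtracking; the nuclides and transitions are placeholders rather than evaluated nuclear data, and no analytical (Bateman-type) solution is attempted.

```python
# Hypothetical transition map: nuclide -> nuclides it can turn into by decay or
# neutron reactions (placeholder data, heavily simplified).
TRANSITIONS = {
    "U238":  ["U239"],
    "U239":  ["Np239"],
    "Np239": ["Pu239"],
    "Pu239": ["Pu240", "Pu238"],   # capture branch and (n,2n) branch
    "Pu240": ["Pu241"],
    "Pu241": ["Am241"],
    "Pu238": [],
    "Am241": [],
}

def linear_chains(start, transitions, max_len=8):
    """Depth-first enumeration of all linear chains rooted at 'start'.
    Each branch of the transmutation graph becomes its own linear chain,
    which could then be solved with an analytical (Bateman-type) solution."""
    chains, stack = [], []

    def dfs(nuclide):
        stack.append(nuclide)
        children = transitions.get(nuclide, [])
        if not children or len(stack) == max_len:
            chains.append(list(stack))        # leaf or depth limit: record this chain
        else:
            for child in children:
                dfs(child)                    # descend, then backtrack
        stack.pop()

    dfs(start)
    return chains

for chain in linear_chains("U238", TRANSITIONS):
    print(" -> ".join(chain))
```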

  9. Methodology for qualification of wood-based ash according to REACH - prestudy

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeblom, Rolf (Tekedo AB, Nykoeping (Sweden)); Tivegaard, Anna-Maria (SSAB Merox AB, Oxeloesund (Sweden))

    2010-02-15

    The new European Union framework directive on waste is to be implemented during the year 2010. According to this directive, much of what today is regarded as waste will instead be assessed as by-products and will in many cases fall under the new European Union regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). REACH applies in conjunction with the new European Union regulation CLP (Classification, Labelling and Packaging of substances and mixtures). There are introductory periods for both of these regulations; in the case of CLP this concerns the transition from the present and previous rules under the dangerous substances and dangerous preparations directives (DSD and DPD, respectively). Similarly, the new framework directive on waste supersedes the previous directive and some other statements. There is a connection between the waste directives and the rules for classification and labelling, in that the classification of waste (into the categories hazardous and non-hazardous) builds on (but is not identical to) the rules for labelling. Similarly, the national Swedish rules for acceptance of recycled material (waste) for use in geotechnical constructions relate to the provisions in REACH on assessment of chemical safety, in that both request that the risk be assessed to be small and that the same or similar methodologies can be applied to verify this. There is a 'reference alternative' in REACH that implies substantial testing prior to registration. Registration is the key to use of a substance, whether the substance is used as such, in a mixture, or to be released from an article. However, REACH as well as CLP contain a number of provisions for using literature data, data on similar chemicals etc. in order to avoid unnecessary testing. This especially applies to testing on humans and vertebrate animals. Vaermeforsk, through its Programme on Environmentally Friendly Use of Non-Coal Ashes has developed methodologies and

  10. A Platform-Based Design Methodology With Contracts and Related Tools for the Design of Cyber-Physical Systems

    OpenAIRE

    Nuzzo, P; Sangiovanni-Vincentelli, AL; Bresolin, D; Geretti, L; Villa, T

    2015-01-01

    © 2015 IEEE. We introduce a platform-based design methodology that uses contracts to specify and abstract the components of a cyber-physical system (CPS), and provide formal support to the entire CPS design flow. The design is carried out as a sequence of refinement steps from a high-level specification to an implementation built out of a library of components at the lower level. We review formalisms and tools that can be used to specify, analyze, or synthesize the design at different levels ...

  11. On-line new event detection and clustering using the concepts of the cover coefficient-based clustering methodology

    OpenAIRE

    Vural, Ahmet

    2002-01-01

    Cataloged from PDF version of article. In this study, we use the concepts of the cover coefficient-based clustering methodology (C3M) for on-line new event detection and event clustering. The main idea of the study is to use the seed selection process of the C3M algorithm for the purpose of detecting new events. Since C3M works in a retrospective manner, we modify the algorithm to work in an on-line environment. Furthermore, in order to prevent producing oversize...

  12. A methodology for obtaining the control rods patterns in a BWR using systems based on ants colonies

    International Nuclear Information System (INIS)

    This work presents the AZCATL-PBC system, based on an ant colony technique, for the search of control rod patterns for the reactors of the Laguna Verde Nuclear Power Station (CNLV). The technique was applied to a transition cycle and to an equilibrium cycle. For both cycles, the keff values obtained with a Haling calculation were compared with those of the control rod pattern proposed by AZCATL-PBC at a fixed burnup. It was found that the methodology is able to extend the length of the cycle with respect to the Haling prediction while keeping the reactor safe. (Author)

  13. Estimation of internal radiation dose in human based on animal data. Application of methodology in drug metabolism and pharmacokinetics

    International Nuclear Information System (INIS)

    Before conducting a human study of a radiolabeled drug, the internal radiation dose is evaluated on the basis of animal data. Generally, however, species differences in the elimination of radioactivity, mostly through hepatic metabolism, are ignored. A correction methodology is described for drugs that are eliminated mostly by hepatic metabolism. We show the validity of a method in which the hepatic clearances in animal and human are constructed from the hepatic blood flow, the protein-unbound fraction and the in vitro metabolic rate, and the calculated internal radiation exposure is corrected by the animal/human ratio of the hepatic clearance. (author)
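    A minimal sketch of the well-stirred liver model and a purely illustrative residence-time correction by the animal/human clearance ratio; all parameter values are invented and no body-size normalisation is included.

```python
def hepatic_clearance(q_h, fu, cl_int):
    """Well-stirred liver model: CL_h = Q_h * fu * CL_int / (Q_h + fu * CL_int)."""
    return q_h * fu * cl_int / (q_h + fu * cl_int)

# Illustrative values (L/h): hepatic blood flow, unbound fraction, in-vitro intrinsic clearance
cl_animal = hepatic_clearance(q_h=3.3,  fu=0.15, cl_int=12.0)
cl_human  = hepatic_clearance(q_h=90.0, fu=0.10, cl_int=45.0)

# Crude correction: a lower human clearance implies a longer residence time,
# and hence a higher internal-dose estimate, than the uncorrected animal value.
animal_residence_h = 4.0
human_residence_h = animal_residence_h * (cl_animal / cl_human)

print(f"CL_h animal: {cl_animal:.2f} L/h, CL_h human: {cl_human:.2f} L/h")
print(f"clearance-corrected human residence time: {human_residence_h:.2f} h")
```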

  14. Benchmarking Public Policy : Methodological Insights from Measurement of School Based Management

    OpenAIRE

    Parandekar, Suhas D.

    2014-01-01

    This working paper presents a benchmarking analysis of School Based Management (SBM) using empirical data from the Philippines. School based management is widely used as a policy tool in many countries that seek to improve the quality of service delivery through decentralization. School based management typically takes many years to have an impact on educational outcomes, but policy makers...

  15. Depletable resources and the economy.

    OpenAIRE

    Heijman, W. J. M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts related to resource depletion are discussed: steady state, time preference and efficiency. For the steady state, three variants are distinguished: the stationary state, the physical steady state and the state of steady growth. It is concluded that the so-call...

  16. Visual methodologies and participatory action research: Performing women's community-based health promotion in post-Katrina New Orleans.

    Science.gov (United States)

    Lykes, M Brinton; Scheib, Holly

    2016-01-01

    Recovery from disaster and displacement involves multiple challenges including accompanying survivors, documenting effects, and rethreading community. This paper demonstrates how African-American and Latina community health promoters and white university-based researchers engaged visual methodologies and participatory action research (photoPAR) as resources in cross-community praxis in the wake of Hurricane Katrina and the flooding of New Orleans. Visual techniques, including but not limited to photonarratives, facilitated the health promoters': (1) care for themselves and each other as survivors of and responders to the post-disaster context; (2) critical interrogation of New Orleans' entrenched pre- and post-Katrina structural racism as contributing to the racialised effects of and responses to Katrina; and (3) meaning-making and performances of women's community-based, cross-community health promotion within this post-disaster context. This feminist antiracist participatory action research project demonstrates how visual methodologies contributed to the co-researchers' cross-community self- and other caring, critical bifocality, and collaborative construction of a contextually and culturally responsive model for women's community-based health promotion post 'unnatural disaster'. Selected limitations as well as the potential for future cross-community antiracist feminist photoPAR in post-disaster contexts are discussed. PMID:27080253

  17. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    Science.gov (United States)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management program (IVHM) is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
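    A minimal sketch of the vibrational side of such a method: closed-form natural frequencies of a uniform Euler-Bernoulli cantilever, with a crude global stiffness reduction standing in for a crack. The beam dimensions and the 5% stiffness loss are illustrative assumptions.

```python
import numpy as np

def cantilever_frequencies(E, I, rho, A, L, n_modes=3):
    """Natural frequencies [Hz] of a uniform Euler-Bernoulli cantilever:
    f_n = (beta_n L)^2 / (2 pi L^2) * sqrt(E I / (rho A))."""
    beta_L = np.array([1.8751, 4.6941, 7.8548])[:n_modes]   # roots of 1 + cos(bL)cosh(bL) = 0
    return beta_L**2 / (2.0 * np.pi * L**2) * np.sqrt(E * I / (rho * A))

# Aluminium strip as a wing-like stand-in: 0.5 m long, 30 mm x 3 mm cross-section
E, rho = 70e9, 2700.0
b, h, L = 0.03, 0.003, 0.5
A, I = b * h, b * h**3 / 12.0

baseline = cantilever_frequencies(E, I, rho, A, L)
damaged  = cantilever_frequencies(0.95 * E, I, rho, A, L)   # 5% stiffness loss as a crack proxy

print("baseline frequencies [Hz]   :", np.round(baseline, 2))
print("with 5% stiffness loss [Hz] :", np.round(damaged, 2))
```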

  18. A methodological proposal for the development of an HPC-based antenna array scheduler

    Science.gov (United States)

    Bonvallet, Roberto; Hoffstadt, Arturo; Herrera, Diego; López, Daniela; Gregorio, Rodrigo; Almuna, Manuel; Hiriart, Rafael; Solar, Mauricio

    2010-07-01

    As new astronomy projects choose interferometry to improve angular resolution and to minimize costs, preparing and optimizing schedules for an antenna array becomes an increasingly critical task. This problem shares similarities with the job-shop problem, which is known to be NP-hard, making a complete approach infeasible. In the case of ALMA, 18000 projects per season are expected, and the best schedule must be found in the order of minutes. The problem imposes severe difficulties: the large domain of observation projects to be taken into account; a complex objective function, composed of several abstract, environmental, and hardware constraints; the number of restrictions imposed and the dynamic nature of the problem, as weather is an ever-changing variable. A solution can benefit from the use of High-Performance Computing for the final implementation to be deployed, but also for the development process. Our research group proposes the use of both metaheuristic search and statistical learning algorithms, in order to create schedules in a reasonable time. How these techniques will be applied is yet to be determined as part of the ongoing research. Several algorithms need to be implemented, tested and evaluated by the team. This work presents the methodology proposed to lead the development of the scheduler. The basic functionality is encapsulated into software components implemented on parallel architectures. These components expose a domain-level interface to the researchers, enabling them to develop early prototypes for evaluating and comparing their proposed techniques.

  19. Study on an ISO 15926 based data modeling methodology for nuclear power industry

    International Nuclear Information System (INIS)

    The scope is therefore data integration and data to support the whole life of a plant. This representation is specified by a generic, conceptual Data Model (DM) that is independent of any particular application, but that is able to record data from the applications used in plant design, fabrication and operation. The data model is designed to be used in conjunction with Reference Data (RD): standard instances of the DM that represent information common to a number of users, plants, or both. This paper gives a high-level description of the structure of ISO 15926 and of how it can be adapted to the nuclear power plant industry in particular, introducing the ISO 15926 methodology and how to extend the existing RDL for the nuclear power industry. As the ISO 15926 representation is independent of applications, interfaces to existing or future applications have to be developed. Such interfaces are provided by Templates that take input from external sources and 'lift' it into an ISO 15926 repository, and/or 'lower' the data into other applications. This is a process similar to that defined by the W3C. Data exchange can be done using e.g. XML messages, but the modelling is independent of the technology used for the exchange

  20. A radioisotope based methodology for plant-fungal interactions in the rhizosphere

    Energy Technology Data Exchange (ETDEWEB)

    Weisenberger, A. G.; Bonito, G.; Lee, S.; McKisson, J. E.; Gryganskyi, A.; Reid, C. D.; Smith, M. F.; Vaidyanathan, G.; Welch, B.

    2013-10-01

    In plant ecophysiology research there is interest in studying the biology of the rhizosphere because of its importance in plant nutrient-interactions. The rhizosphere is the zone of soil surrounding a plant's root system where microbes (such as fungi) are influenced by the root and the roots by the microbes. We are investigating a methodology for imaging the distribution of molecular compounds of interest in the rhizosphere without disturbing the root or soil habitat. Our intention is to develop a single photon emission computed tomography (SPECT) system (PhytoSPECT) to image the bio-distribution of fungi in association with a host plant's roots. The technique we are exploring makes use of radioactive isotopes as tracers to label molecules that bind to fungal-specific compounds of interest and to image the fungi distribution in the plant and/or soil. We report on initial experiments designed to test the ability of fungal-specific compounds labeled with an iodine radioisotope that binds to chitin monomers (N-acetylglucosamine). Chitin is a compound not found in roots but in fungal cell walls. We will test the ability to label the compound with radioactive isotopes of iodine (¹²⁵I and ¹²³I).

  1. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    Directory of Open Access Journals (Sweden)

    Gáll Zs. Sz.

    2015-08-01

    Full Text Available The scientific substrate of the study relies upon the concept of mirror neurons. Unlike other neurons, these are characterized by an imitation feature. They play an important role in learning processes, especially during childhood, enabling the imitation of motions and supporting their initial acquisition. Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool education, resorting to demonstration procedures as the main teaching-learning method. We thus expect mirror neuron reactivity to be engaged more thoroughly, with a view to enhancing the subject's psychomotor development in terms of body scheme, self-image and the performance of basic postures and motions. For the research, an experimental group and a control group were set up, and the children's psychomotor development level was assessed both before the application of the independent variable and after its effects on the experimental group. Once the planned procedure was completed, the experimental group showed a significant improvement in the investigated psychomotor fields as compared to the control group.

  2. Vibrational Study and Force Field of the Citric Acid Dimer Based on the SQM Methodology

    Directory of Open Access Journals (Sweden)

    Laura Cecilia Bichara

    2011-01-01

    Full Text Available We have carried out a structural and vibrational theoretical study for the citric acid dimer. The Density Functional Theory (DFT method with the B3LYP/6-31G∗ and B3LYP/6-311++G∗∗ methods have been used to study its structure and vibrational properties. Then, in order to get a good assignment of the IR and Raman spectra in solid phase of dimer, the best fit possible between the calculated and recorded frequencies was carry out and the force fields were scaled using the Scaled Quantum Mechanic Force Field (SQMFF methodology. An assignment of the observed spectral features is proposed. A band of medium intensity at 1242 cm−1 together with a group of weak bands, previously not assigned to the monomer, was in this case assigned to the dimer. Furthermore, the analysis of the Natural Bond Orbitals (NBOs and the topological properties of electronic charge density by employing Bader's Atoms in Molecules theory (AIM for the dimer were carried out to study the charge transference interactions of the compound.

  3. OPTIMIZATION OF POTASSIUM NITRATE BASED SOLID PROPELLANT GRAINS FORMULATION USING RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Oladipupo Olaosebikan Ogunleye

    2015-08-01

    Full Text Available This study was designed to evaluate the effect of propellant formulation and geometry on the internal ballistic performance of solid propellant grains, using core, BATES, rod and tubular, and end-burn geometries. Response Surface Methodology (RSM) was used to analyze and optimize the effect of sucrose, potassium nitrate and carbon on the chamber pressure, temperature, thrust and specific impulse of the solid propellant grains through a Central Composite Design (CCD) of the experiment. An increase in potassium nitrate increased the specific impulse, while increases in sucrose and carbon decreased the specific impulse. The coefficients of determination (R2) for the models of chamber pressure, temperature, thrust and specific impulse in terms of composition and geometry were 0.9737, 0.9984, 0.9745 and 0.9589, respectively. The optimum specific impulse of 127.89 s, pressure of 462201 Pa, temperature of 1618.3 K and thrust of 834.83 N were obtained using 0.584 kg of sucrose, 1.364 kg of potassium nitrate and 0.052 kg of carbon, as well as the BATES geometry. There was no significant difference between the calculated and experimental ballistic properties at p < 0.05. The BATES grain geometry is more efficient for minimizing the oscillatory pressure in the combustion chamber.

  4. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, industrial use as well as use for public and municipal authorities and street lightning) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We both visualize the results of the analysis and we perform cluster and outlier analysis using the Anselin local Moran's I statistic as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
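    A minimal sketch of the Anselin local Moran's I statistic on a toy contiguity structure; the demand figures and weights matrix are invented, and the permutation-based significance testing used in practice is omitted.

```python
import numpy as np

def local_morans_i(values, weights):
    """Anselin local Moran's I for each observation, given a row-standardised
    spatial weights matrix (n x n, zero diagonal)."""
    z = values - values.mean()
    m2 = (z @ z) / len(values)
    return (z / m2) * (weights @ z)

# Toy example: 5 regions, annual electricity demand (GWh), simple contiguity weights
demand = np.array([120.0, 130.0, 125.0, 40.0, 38.0])
W = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
W = W / W.sum(axis=1, keepdims=True)          # row-standardise

print(np.round(local_morans_i(demand, W), 2))  # large positive values flag local clusters
```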

  5. Towards a complete propagation of uncertainties in depletion calculations

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.S. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering; Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Zwermann, W.; Gallner, L.; Puente-Espel, Federico; Velkov, K.; Hannstein, V. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Cabellos, O. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering

    2013-07-01

    Propagation of nuclear data uncertainties to calculated values is interesting for design purposes and library evaluation. XSUSA, developed at GRS, propagates cross-section uncertainties through nuclear calculations. In depletion simulations, fission yields and decay data are also involved and are a possible source of uncertainty that should be taken into account. We have developed tools to generate varied fission yield and decay libraries and to propagate uncertainties through depletion in order to complete the XSUSA uncertainty assessment capabilities. A generic test to probe the methodology is defined and discussed. (orig.)
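    As a toy illustration of sampling-based propagation, the sketch below perturbs a single absorption cross section and pushes the samples through a one-nuclide depletion formula; it stands in for, and is far simpler than, the XSUSA and fission-yield/decay-data machinery described above. All numbers are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

def deplete(n0, sigma_a, phi, t):
    """Single-nuclide depletion under constant flux: N(t) = N0 * exp(-sigma_a * phi * t)."""
    return n0 * np.exp(-sigma_a * phi * t)

n0, phi, t = 1.0e24, 3.0e14, 3.0e7            # atoms/cm3, n/(cm2 s), s (~1 year)
sigma_nom, rel_unc = 100.0e-24, 0.05          # nominal cross section (cm2), 5% 1-sigma uncertainty

samples = rng.normal(sigma_nom, rel_unc * sigma_nom, size=1000)
end_of_cycle = deplete(n0, samples, phi, t)

print(f"nominal N(t):      {deplete(n0, sigma_nom, phi, t):.3e}")
print(f"sampled mean N(t): {end_of_cycle.mean():.3e}")
print(f"relative std:      {end_of_cycle.std() / end_of_cycle.mean():.2%}")
```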

  6. Towards a cognitive robotics methodology for reward-based decision-making: dynamical systems modelling of the Iowa Gambling Task

    Science.gov (United States)

    Lowe, Robert; Ziemke, Tom

    2010-09-01

    The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.

  7. Predicting Pedestrian Flow: A Methodology and a Proof of Concept Based on Real-Life Data

    OpenAIRE

    Maria Davidich; Gerta Köster

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For in...

  8. Estimating initial contaminant mass based on fitting mass-depletion functions to contaminant mass discharge data: Testing method efficacy with SVE operations data

    Science.gov (United States)

    Mainhagu, J.; Brusseau, M. L.

    2016-09-01

    The mass of contaminant present at a site, particularly in the source zones, is one of the key parameters for assessing the risk posed by contaminated sites, and for setting and evaluating remediation goals and objectives. This quantity is rarely known and is challenging to estimate accurately. This work investigated the efficacy of fitting mass-depletion functions to temporal contaminant mass discharge (CMD) data as a means of estimating initial mass. Two common mass-depletion functions, exponential and power functions, were applied to historic soil vapor extraction (SVE) CMD data collected from 11 contaminated sites for which the SVE operations are considered to be at or close to essentially complete mass removal. The functions were applied to the entire available data set for each site, as well as to the early-time data (the initial 1/3 of the data available). Additionally, a complete differential-time analysis was conducted. The latter two analyses were conducted to investigate the impact of limited data on method performance, given that the primary mode of application would be to use the method during the early stages of a remediation effort. The estimated initial masses were compared to the total masses removed for the SVE operations. The mass estimates obtained from application to the full data sets were reasonably similar to the measured masses removed for both functions (13 and 15% mean error). The use of the early-time data resulted in a minimally higher variation for the exponential function (17%) but a much higher error (51%) for the power function. These results suggest that the method can produce reasonable estimates of initial mass useful for planning and assessing remediation efforts.
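    A minimal sketch of fitting the exponential mass-depletion function to a hypothetical cumulative mass-discharge record with SciPy, including the early-time refit discussed above; the monitoring data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical SVE monitoring record: cumulative extracted contaminant mass (kg)
t_days = np.array([0, 30, 60, 120, 180, 270, 365], dtype=float)
m_cum  = np.array([0, 210, 360, 560, 680, 790, 850], dtype=float)

def exp_depletion(t, m0, k):
    """Exponential depletion: cumulative mass removed M(t) = M0 * (1 - exp(-k t)),
    so the fitted M0 is the estimate of the initial contaminant mass."""
    return m0 * (1.0 - np.exp(-k * t))

(m0_full, k_full), _ = curve_fit(exp_depletion, t_days, m_cum, p0=(1000.0, 0.01))
print(f"M0 from full record:     {m0_full:.0f} kg  (k = {k_full:.4f} 1/day)")

# Early-time estimate: refit using only the first third of the record
n_early = len(t_days) // 3 + 1
(m0_early, _), _ = curve_fit(exp_depletion, t_days[:n_early], m_cum[:n_early], p0=(1000.0, 0.01))
print(f"M0 from early-time data: {m0_early:.0f} kg")
```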

  9. How Depleted is the MORB mantle?

    Science.gov (United States)

    Hofmann, A. W.; Hart, S. R.

    2015-12-01

    Knowledge of the degree of mantle depletion of highly incompatible elements is critically important for assessing Earth's internal heat production and Urey number. Current views of the degree of MORB source depletion are dominated by Salters and Stracke (2004), and Workman and Hart (2005). The first is based on an assessment of average MORB compositions, whereas the second considers trace element data of oceanic peridotites. Both require an independent determination of one absolute concentration, Lu (Salters & Stracke), or Nd (Workman & Hart). Both use parent-daughter ratios Lu/Hf, Sm/Nd, and Rb/Sr calculated from MORB isotopes combined with continental-crust extraction models, as well as "canonical" trace element ratios, to boot-strap the full range of trace element abundances. We show that the single most important factor in determining the ultimate degree of incompatible element depletion in the MORB source lies in the assumptions about the timing of continent extraction, exemplified by continuous extraction versus simple two-stage models. Continued crust extraction generates additional, recent mantle depletion, without affecting the isotopic composition of the residual mantle significantly. Previous emphasis on chemical compositions of MORB and/or peridotites has tended to obscure this. We will explore the effect of different continent extraction models on the degree of U, Th, and K depletion in the MORB source. Given the uncertainties of the two most popular models, the uncertainties of U and Th in DMM are at least ±50%, and this impacts the constraints on the terrestrial Urey ratio. Salters, F.J.M. and Stracke, A., 2004, Geochem. Geophys. Geosyst. 5, Q05004. Workman, R.K. and Hart, S.R., 2005, EPSL 231, 53-72.

  10. Brain-Based Learning and Classroom Practice: A Study Investigating Instructional Methodologies of Urban School Teachers

    Science.gov (United States)

    Morris, Lajuana Trezette

    2010-01-01

    The purpose of this study was to examine the implementation of brain-based instructional strategies by teachers serving at Title I elementary, middle, and high schools within the Memphis City School District. This study was designed to determine: (a) the extent to which Title I teachers applied brain-based strategies, (b) the differences in…

  11. Study on the Laser-Based Weld Surface Flaw Identification System Employing Wavelet Analysis Methodology

    DEFF Research Database (Denmark)

    Qu, Zhigang; Chong, Alvin Yung Boon; Chacon, Juan Luis Ferrando;

    2016-01-01

    The proposed technique employs the integration of a laser-line profile sensor and a processing module based on wavelet analysis. The laser-line sensor acquires the two-dimensional profile of a target weld based on the principle of laser triangulation and yields the height and width information of the weld...

  12. Response surface methodology based optimization of β-glucosidase production from Pichia pastoris.

    Science.gov (United States)

    Batra, Jyoti; Beri, Dhananjay; Mishra, Saroj

    2014-01-01

    The thermotolerant yeast Pichia etchellsii produces multiple cell bound β-glucosidases that can be used for synthesis of important alkyl- and aryl-glucosides. Present work focuses on enhancement of β-glucosidase I (BGLI) production in Pichia pastoris. In the first step, one-factor-at-a-time experimentation was used to investigate the effect of aeration, antifoam addition, casamino acid addition, medium pH, methanol concentration, and mixed feed components on BGLI production. Among these, initial medium pH, methanol concentration, and mixed feed in the induction phase were found to affect BGLI production. A 3.3-fold improvement in β-glucosidase expression was obtained at pH 7.5 as compared to pH 6.0 on induction with 1 % methanol. Addition of sorbitol, a non-repressing substrate, led to further enhancement in β-glucosidase production by 1.4-fold at pH 7.5. These factors were optimized with response surface methodology using Box-Behnken design. Empirical model obtained was used to define the optimum "operating space" for fermentation which was a pH of 7.5, methanol concentration of 1.29 %, and sorbitol concentration of 1.28 %. Interaction of pH and sorbitol had maximum effect leading to the production of 4,400 IU/L. The conditions were validated in a 3-L bioreactor with accumulation of 88 g/L biomass and 2,560 IU/L β-glucosidase activity.

  13. An accelerometry-based methodology for assessment of real-world bilateral upper extremity activity.

    Directory of Open Access Journals (Sweden)

    Ryan R Bailey

    Full Text Available The use of both upper extremities (UEs) is necessary for the completion of many everyday tasks. Few clinical assessments measure the abilities of the UEs to work together; rather, they assess unilateral function and compare it between affected and unaffected UEs. Furthermore, clinical assessments are unable to measure function that occurs in the real world, outside the clinic. This study examines the validity of an innovative approach to assess real-world bilateral UE activity using accelerometry. Seventy-four neurologically intact adults completed ten tasks (donning/doffing shoes, grooming, stacking boxes, cutting playdough, folding towels, writing, unilateral sorting, bilateral sorting, unilateral typing, and bilateral typing) while wearing accelerometers on both wrists. Two variables, the Bilateral Magnitude and Magnitude Ratio, were derived from the accelerometry data to distinguish between high- and low-intensity tasks, and between bilateral and unilateral tasks. Estimated energy expenditure and time spent in simultaneous UE activity for each task were also calculated. The Bilateral Magnitude distinguished between high- and low-intensity tasks, and the Magnitude Ratio distinguished between unilateral and bilateral UE tasks. The Bilateral Magnitude was strongly correlated with estimated energy expenditure (ρ = 0.74, p<0.02), and the Magnitude Ratio was strongly correlated with time spent in simultaneous UE activity (ρ = 0.93, p<0.01) across tasks. These results demonstrate the face validity and construct validity of this methodology for quantifying bilateral UE activity during everyday tasks performed in a laboratory setting; it can now be used to assess bilateral UE activity in real-world environments.

  14. A stale challenge to the philosophy of science: commentary on "Is psychology based on a methodological error?" by Michael Schwarz.

    Science.gov (United States)

    Ruck, Nora; Slunecko, Thomas

    2010-06-01

    In his article "Is psychology based on a methodological error?" and based on a quite convincing empirical basis, Michael Schwarz offers a methodological critique of one of mainstream psychology's key test theoretical axioms, i.e., that of the in principle normal distribution of personality variables. It is characteristic of this paper--and at first seems to be a strength of it--that the author positions his critique within a frame of philosophy of science, particularly positioning himself in the tradition of Karl Popper's critical rationalism. When scrutinizing Schwarz's arguments, however, we find Schwarz's critique profound only as an immanent critique of test theoretical axioms. We raise doubts, however, as to Schwarz's alleged 'challenge' to the philosophy of science because the author not at all seems to be in touch with the state of the art of contemporary philosophy of science. Above all, we question the universalist undercurrent that Schwarz's 'bio-psycho-social model' of human judgment boils down to. In contrast to such position, we close our commentary with a plea for a context- and culture sensitive philosophy of science. PMID:20369392

  15. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
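
    A minimal sketch of the kind of renewal-model calculation described, the conditional probability of rupture in a coming interval given the elapsed time, using along-fault averaged recurrence parameters, is shown below. A lognormal recurrence distribution is used purely as a stand-in (the study applies magnitude-dependent aperiodicity and Monte Carlo simulation), and all parameter values are hypothetical.

```python
# Minimal sketch of an elastic-rebound (renewal model) conditional probability
# calculation of the general kind described in the abstract. A lognormal
# recurrence distribution is used here purely as a stand-in; the actual study
# uses along-fault averaged parameters and Monte Carlo simulation.
import numpy as np
from scipy import stats

def conditional_rupture_prob(mean_ri, aperiodicity, t_elapsed, dt):
    """P(rupture in [t, t+dt] | no rupture before t) for a lognormal renewal model."""
    # Convert mean recurrence interval and aperiodicity (coefficient of variation)
    # to lognormal parameters.
    sigma = np.sqrt(np.log(1.0 + aperiodicity**2))
    mu = np.log(mean_ri) - 0.5 * sigma**2
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    survival_t = dist.sf(t_elapsed)
    if survival_t <= 0.0:
        return 1.0
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / survival_t

# Along-fault averaging of renewal parameters over rupture sections (illustrative).
section_mean_ri = np.array([180.0, 220.0, 250.0])   # years
mean_ri = section_mean_ri.mean()
prob = conditional_rupture_prob(mean_ri, aperiodicity=0.3,
                                t_elapsed=150.0, dt=30.0)
print(f"30-yr conditional probability: {prob:.3f}")
```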

  17. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models - An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens;

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty of the predi...

  18. Educational software on the ozone layer Depletion

    OpenAIRE

    Psomiadis, Ploutarchos; Chalkidis, Anthimos; Saridaki, Anna; Tampakis, Constantine (Konstantinos); Skordoulis, Constantine

    2007-01-01

    This paper describes the design and the formative evaluation of educational software concerning the ‘Depletion of the Ozone Layer’ designed for the students of the Faculty of Primary Education (pre-service teachers) of the National and Kapodistrian University of Athens. The selection of the topic was based on: i) environmental criteria (importance of the phenomenon, complexity of the phenomenon), ii) societal criteria (local interest, human activities effects), iii) pedagogical cr...

  19. Project Management Methodology for the Development of M-Learning Web Based Applications

    Directory of Open Access Journals (Sweden)

    Adrian VISOIU

    2010-01-01

    Full Text Available M-learning web based applications are a particular case of web applications designed to be operated from mobile devices. Also, their purpose is to implement learning aspects. Project management of such applications takes into account the identified peculiarities. M-learning web based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influences over the analysis, design, construction and testing phases. Activities building up a work breakdown structure for development of m-learning web based applications are presented. Project monitoring and control techniques are proposed. Resources required for projects are discussed.

  20. Assessment of RANS Based CFD Methodology using JAEA Experiment with a Wire-wrapped 127-pin Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J. H.; Yoo, J.; Lee, K. L.; Ha, K. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, we assess the RANS-based CFD methodology with JAEA experimental data. The JAEA experimental study of the 127-pin wire-wrapped fuel assembly was carried out with water to validate the pressure drop formulas in the ASFRE code. Complicated and vortical flow phenomena in the wire-wrapped fuel bundles were captured by a vortex structure identification technique based on critical point theory. The SFR system is one of the nuclear reactors in which recycling of transuranics (TRUs) by reusing spent nuclear fuel sustains the fission chain reaction. This situation strongly motivated the Korea Atomic Energy Research Institute (KAERI) to start a prototype Gen-4 Sodium-cooled Fast Reactor (PGSFR) design project under the national nuclear R and D program. Generally, the SFR system has a tightly packed fuel bundle and a high power density. Sodium has a higher thermal conductivity and boiling temperature than water, which allows the core design to be more compact than that of a Light Water Reactor (LWR) through narrower sub-channels. The fuel assembly of the SFR system consists of long and thin wire-wrapped fuel bundles and a hexagonal duct, in which the wire-wrapped fuel bundles in the hexagonal tube form a loose triangular array. The main purpose of a wire spacer is to avoid collisions between adjacent rods. Furthermore, a wire spacer can mitigate vortex-induced vibration and enhance convective heat transfer through the secondary flow induced by the helical wire spacers. Most numerical studies in the nuclear field have been conducted with simplified sub-channel analysis codes such as COBRA (Rowe), SABRE (Macdougall and Lillington), ASFRE (Ninokata), and MATRA-LMR (Kim et al.). The relationship between complex flow phenomena and helically wrapped-wire spacers will be discussed. The RANS-based CFD methodology is evaluated with JAEA experimental data for the 127-pin wire-wrapped fuel assembly. Complicated and vortical flow phenomena in the wire-wrapped fuel

  1. A test of a physically-based strong ground motion prediction methodology with the 26 September 1997, Mw = 6.0 Colfiorito (Umbria-Marche sequence), Italy earthquake.

    OpenAIRE

    Scognamiglio, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia; Hutchings, L.; Lawrence Berkeley National Laboratory

    2009-01-01

    We test the physically-based ground motion hazard prediction methodology of Hutchings et al. [Hutchings, L., Ioannidou, E., Kalogeras, I., Voulgaris, N., Savy, J., Foxall, W., Scognamiglio, L., and Stavrakakis, G., (2007). A physically-based strong ground motion prediction methodology; Application to PSHA and the 1999 M = 6.0 Athens Earthquake. Geophys. J. Int. 168, 569–680.] through an a posteriori prediction of the 26 September 1997, Mw 6.0 Colfiorito (Umbria–Marche, Italy) earthquake at fo...

  2. A physically based strong ground-motion prediction methodology; application to PSHA and the 1999 Mw = 6.0 Athens earthquake

    OpenAIRE

    Hutchings, L.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Ioannidou, E.; Department of Geophysics-Geothermics, University of Athens, Athens 15783, Greece; Foxall, W.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Voulgaris, N.; Department of Geophysics-Geothermics, University of Athens, Athens 15783, Greece; Savy, J.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Kalogeras, I.; Institute of Geodynamics, National Observatory of Athens, Athens, Greece; Scognamiglio, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia; Stavrakakis, G.; Institute of Geodynamics, National Observatory of Athens, Athens, Greece

    2007-01-01

    We present a physically based methodology to predict the range of ground-motion hazard for earthquakes along specific faults or within specific source volumes, and we demonstrate how to incorporate this methodology into probabilistic seismic hazard analyses (PSHA). By ‘physically based,’ we refer to ground-motion syntheses derived from physics and an understanding of the earthquake process. This approach replaces the aleatory uncertainty that current PSHA studies estimate by re...

  3. Associations - Communities - Residents. Building together a citizen-based project of renewable energies - Methodological guide

    International Nuclear Information System (INIS)

    This guide first outlines the challenges and stakes of citizen-based renewable energies: example of a necessary energy transition in Brittany, interest of a local production of renewable energies, examples in other European countries, and emergence of a citizen-based energy movement in France. The second part presents the four main phases of such a project (diagnosis, development, construction, and exploitation), the main issues to be addressed, and the main steps of a citizen-based renewable energy project (technical, legal and financial, and citizen-related aspects during the different phases). The third part describes how to elaborate a citizen-based project: by addressing the project dimensions, by defining a legal specification, by preparing a provisional business model, by choosing an appropriate legal structure, by creating a project company, and by mobilizing local actors. The last part addresses how to finance the project: by building up own funds, by asking banks for support, and by citizen participation in the investment.

  4. Developing a Methodology for Supplier Base Reduction : A Case Study at Dynapac GmbH

    OpenAIRE

    Böris, Elin; Hall, Vendela

    2015-01-01

    Dynapac GmbH is a manufacturer of road construction equipment and has historically been acquired and merged with several companies, resulting in an expansion of their supplier base. Currently, they are experiencing a large supplier base within direct material causing a decrease in the effectiveness and efficiency in the management of the suppliers. Dynapac GmbH therefore wishes to lower the number of suppliers in order to obtain desired effects, such as cost savings, reduction of administrati...

  5. Project Management Methodology for the Development of M-Learning Web Based Applications

    OpenAIRE

    Adrian VISOIU

    2010-01-01

    M-learning web based applications are a particular case of web applications designed to be operated from mobile devices. Also, their purpose is to implement learning aspects. Project management of such applications takes into account the identified peculiarities. M-learning web based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influences over the analysis, d...

  6. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. Scenario development methodologies

    International Nuclear Information System (INIS)

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  8. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as a waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work for structural, procedural, or object-oriented applications, but fail to capture…

  9. A Geographical Information System (GIS) based methodology for determination of potential biomasses and sites for biogas plants in southern Finland

    International Nuclear Information System (INIS)

    Highlights: • The biomethane potential in southern Finland is over 3 TWh. • Agricultural biomass accounts >90% of the biomethane potential in study regions. • The GIS method can be used for detailed biogas plant planning. • The GIS provides tools for e.g. land locations, cost and emission calculations. - Abstract: Objective: The objective of this study was to analyse the spatial distribution and amount of potential biomass feedstock for biomethane production and optimal locations, sizes and number of biogas plants in southern Finland in the area of three regional waste management companies. Methods: A Geographical Information System (GIS) based methodology, which also included biomass transport optimisation considering the existing road network and spatially varied biomass sources, was used. Kernel Density (KD) maps were calculated to pinpoint areas with high biomass concentration. Results: The results show that the total amount of biomass corresponds to 2.8 TWh of energy of which agro materials account for more than 90%. It was found that a total of 49 biogas plants could be built in three case regions with feedstock available within maximum transportation radius of 10 or 40 km. With maximum of 10 km biomass transportation distance, the production capacity of the planned plants ranges from 2.1 to 8.4 MW. If transportation distance was increased to 40 km, the plant capacities could also increase from 2.3 to 16.8 MW. Conclusions: As demonstrated in this study, the studied GIS methodology can be used for identification of the most suitable locations for biogas plants by providing the tools for e.g. transportation routes and distances. Practice implications: The methodology can further be used in environmental impact assessment as well as in cost analysis
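
    As an illustration of the core spatial step (summing feedstock energy within a maximum transport radius of a candidate site and converting it to an average plant capacity), the sketch below uses hypothetical source locations, energy contents and full-load hours; it is not the study's GIS workflow, which also includes road-network routing and Kernel Density mapping.

```python
# Illustrative sketch of one core step in a GIS-based siting analysis: summing the
# biomass energy available within a maximum transport radius of a candidate plant
# site and converting it to an average plant capacity. Coordinates, energy values
# and the full-load-hour assumption are placeholders, not data from the study.
import math

# Biomass sources: (x_km, y_km, annual energy in MWh)
sources = [
    (2.0, 1.0, 4000.0),
    (8.5, 3.2, 12000.0),
    (15.0, 9.0, 7000.0),
    (4.2, 6.8, 9000.0),
]

def plant_capacity(site, sources, max_radius_km, full_load_hours=8000.0):
    """Average capacity (MW) supported by feedstock within the transport radius."""
    sx, sy = site
    energy_mwh = sum(e for x, y, e in sources
                     if math.hypot(x - sx, y - sy) <= max_radius_km)
    return energy_mwh / full_load_hours

site = (5.0, 4.0)
for radius in (10.0, 40.0):
    print(f"radius {radius:4.0f} km -> {plant_capacity(site, sources, radius):.2f} MW")
```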

  10. Ecological modernization of socio-economic development of the region in the context of social transformations: theoretical and methodological bases

    Directory of Open Access Journals (Sweden)

    O.V. Shkarupa

    2015-09-01

    Full Text Available The aim of the article. The aim of the article is to study the theoretical and methodological bases of the ecological modernization of the socio-economic development of a region. The results of the analysis. This paper studies the scientific basis of transformation processes for sustainable development, which is important at this time. Reviewing the history of the sustainable development concept, the author emphasizes that if the basic guidelines for upgrading social and economic systems towards a «green» economy are followed, the expected results of ecological modernization can be achieved. This saves funds by forestalling economic damage from pollution, by avoiding environmental compensation costs and by sparing the «rehabilitation» of resources and territories. Moreover, preventing anthropogenic pressure increases the chances of social systems to improve the quality of life and the health of the nation. From an economic point of view, clean production is more competitive. This study considers the theoretical and methodological bases of the ecological modernization of socio-economic systems in a region, which involves setting special reference points of development. Ecological modernization is a prerequisite for ecological transformation based on high-quality eco-oriented reforms in social and economic systems. An ecologically safe transformation of socio-economic development means certain progressive changes (intra-system and inter-system, synergistic transformations) that are strategic in view of eco-oriented goal-setting. Such an understanding provides for: (1) understanding of transformation as a process that is already identified in the environmental trend of the socio-economic system; (2) spatial certainty of eco-oriented reforms in connection with the qualities of the future development of the system. Arguably, this can and should lead to structural changes in innovation for sustainable development. Conclusions and directions for further research. It is shown that

  11. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology, as well as to investigate the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. Knowledge of the composition of this reference population is based on sex, age, educational level, marital status and different occupations, aiming to establish the relation between health-aggravating factors and these variables. The methodology used refers to non-experimental research based on a theoretical-methodological practice. The work performed has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  12. Novel 3-D Object Recognition Methodology Employing a Curvature-Based Histogram

    Directory of Open Access Journals (Sweden)

    Liang-Chia Chen

    2013-07-01

    Full Text Available In this paper, a new object recognition algorithm employing a curvature-based histogram is presented. Recognition of three-dimensional (3-D) objects using range images remains one of the most challenging problems in 3-D computer vision due to its noisy and cluttered scene characteristics. The key breakthroughs for this problem mainly lie in defining unique features that distinguish the similarity among various 3-D objects. In our approach, an object detection scheme is developed to identify targets via an automated search in the range images, using an initial process of object segmentation to subdivide all possible objects in the scenes and then applying a process of object recognition based on geometric constraints and a curvature-based histogram. The developed method has been verified through experimental tests confirming its feasibility.
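
    The descriptor idea can be illustrated compactly: per-point curvature values (assumed already estimated from the segmented range data) are binned into a normalized histogram and two objects are compared by histogram intersection. The sketch below is a generic stand-in, not the paper's specific recognition pipeline.

```python
# Minimal sketch of the descriptor idea behind a curvature-based histogram:
# per-point curvature values (assumed already estimated from the range image)
# are binned into a normalized histogram, and two objects are compared by
# histogram intersection. This is a generic illustration, not the paper's
# specific recognition pipeline.
import numpy as np

def curvature_histogram(curvatures, bins=32, value_range=(-1.0, 1.0)):
    hist, _ = np.histogram(curvatures, bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)            # normalize to unit mass

def histogram_intersection(h1, h2):
    return float(np.minimum(h1, h2).sum())      # 1.0 = identical shape statistics

rng = np.random.default_rng(1)
model_curv = rng.normal(0.2, 0.15, 5000)        # stand-in for a stored model
scene_curv = rng.normal(0.22, 0.15, 4000)       # stand-in for a segmented object
score = histogram_intersection(curvature_histogram(model_curv),
                               curvature_histogram(scene_curv))
print(f"similarity: {score:.3f}")
```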

  13. An Event-Based Methodology to Generate Class Diagrams and its Empirical Evaluation

    Directory of Open Access Journals (Sweden)

    Sandeep K. Singh

    2010-01-01

    Full Text Available Problem statement: Event-based systems have importance in many application domains ranging from real time monitoring systems in production, logistics, medical devices and networking to complex event processing in finance and security. The increasing popularity of event-based systems has opened new challenging issues for them. One such issue is to carry out requirements analysis of event-based systems and build conceptual models. Currently, Object Oriented Analysis (OOA) using the Unified Modeling Language (UML) is the most popular requirement analysis approach, for which several OOA tools and techniques have been proposed. But none of the techniques and tools, to the best of our knowledge, have focused on event-based requirements analysis; rather, all are behavior-based approaches. Approach: This study described a requirement analysis approach specifically for event-based systems. The proposed approach starts from events occurring in the system and derives an importable class diagram specification in XML Metadata Interchange (XMI) format for the ArgoUML tool. Requirements of the problem domain are captured as events in restricted natural language using the proposed Event Templates in order to reduce ambiguity. Results: Rules were designed to extract a domain model specification (analysis-level class diagram) from Event Templates. A prototype tool 'EV-ClassGEN' is also developed to provide automation support to extract events from requirements, document the extracted events in Event Templates and implement rules to derive a specification for an analysis-level class diagram. The proposed approach is also validated through a controlled experiment by applying it to many cases from different application domains like real time systems, business applications, gaming. Conclusion: Results of the controlled experiment had shown that after studying and applying the Event-based approach, students' perception about ease of use and usefulness of OOA technique has

  14. A Novel Multiscale Physics Based Progressive Failure Methodology for Laminated Composite Structures

    Science.gov (United States)

    Pineda, Evan J.; Waas, Anthony M.; Bednarcyk, Brett A.; Collier, Craig S.; Yarrington, Phillip W.

    2008-01-01

    A variable fidelity, multiscale, physics based finite element procedure for predicting progressive damage and failure of laminated continuous fiber reinforced composites is introduced. At every integration point in a finite element model, progressive damage is accounted for at the lamina-level using thermodynamically based Schapery Theory. Separate failure criteria are applied at either the global-scale or the microscale in two different FEM models. A micromechanics model, the Generalized Method of Cells, is used to evaluate failure criteria at the micro-level. The stress-strain behavior and observed failure mechanisms are compared with experimental results for both models.

  15. DEVELOPMENT OF THE CONTROL METHODOLOGY OF THE GIANT MAGNETOSTRICTIVE ACTUATOR BASED ON MAGNETIC FLUX DENSITY

    Institute of Scientific and Technical Information of China (English)

    Jia Zhenyuan; Yang Xing; Shi Chun; Guo Dongming

    2003-01-01

    According to the principle of the magnetostriction generating mechanism, the control model of giant magnetostriction material based on magnetic field and the control method with magnetic flux density are developed. Furthermore, this control method is used to develop a giant magnetostrictive micro-displacement actuator (GMA) and its driving system. Two control methods whose control variables are current intensity and magnetic flux density are compared with each other by experimental studies. Finally, effective methods on improving the linearity and control precision of micro-displacement actuator and reducing the hysteresis based on the controlling magnetic flux density are obtained.

  16. Methodology for Design of Web-Based Laparoscopy e-Training System

    Science.gov (United States)

    Borissova, Daniela; Mustakerov, Ivan

    2011-01-01

    The Web-based e-learning can benefit from the modern multimedia tools combined with network capabilities to overcome traditional education. The objective of this paper is focused on e-training system development to improve performance of theoretical knowledge and providing ample opportunity for practical attainment of manual skills in virtual…

  17. Comparing econometric and survey-based methodologies in measuring offshoring: The Danish experience

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  18. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    Science.gov (United States)

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that advantages of Web-based surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)…

  19. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted-least-square regression. 3) Initialization of estimation by use of linear algebra providing a first guess. 4) Sequential parameter and simultaneous GC parameter by using of 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of parameter...
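
    Two of the listed steps, weighted least-squares regression and uncertainty analysis via the asymptotic approximation of the parameter covariance, can be sketched as below for a linear group-contribution model; the occurrence matrix, property values and weights are synthetic placeholders, not the study's data.

```python
# Sketch of two steps named in the abstract: weighted least-squares estimation of
# group-contribution parameters and uncertainty analysis via the asymptotic
# approximation of the parameter covariance matrix. The occurrence matrix and
# property values below are synthetic placeholders.
import numpy as np
from scipy import stats

# N: group occurrence matrix (compounds x groups), y: measured property values.
N = np.array([[2, 1, 0],
              [1, 2, 1],
              [3, 0, 2],
              [2, 2, 2],
              [1, 1, 3]], dtype=float)
y = np.array([510.0, 640.0, 905.0, 880.0, 835.0])
w = np.array([1.0, 1.0, 0.5, 1.0, 1.0])        # weights, e.g. 1/variance

W = np.diag(w)
theta = np.linalg.solve(N.T @ W @ N, N.T @ W @ y)    # weighted least squares
resid = y - N @ theta
dof = len(y) - len(theta)
s2 = (resid @ W @ resid) / dof                        # weighted residual variance
cov = s2 * np.linalg.inv(N.T @ W @ N)                 # asymptotic covariance
se = np.sqrt(np.diag(cov))
t_val = stats.t.ppf(0.975, dof)                       # 95% confidence level

for i, (th, s) in enumerate(zip(theta, se)):
    print(f"group {i}: {th:7.2f} +/- {t_val * s:6.2f}")
```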

  20. How to create a methodology of conceptual visualization based on experiential cognitive science and diagrammatology

    DEFF Research Database (Denmark)

    Toft, Birthe

    2013-01-01

    Based on the insights of experiential cognitive science and of diagrammatology as defined by Charles S. Peirce and Frederik Stjernfelt, this article analyses the iconic links connecting visualizations of Stjernfelt diagrams with human perception and action and starts to lay the theoretical...

  1. Comparison of Different Ground-Based NDVI Measurement Methodologies to Evaluate Crop Biophysical Properties

    Directory of Open Access Journals (Sweden)

    Rossana Monica Ferrara

    2010-06-01

    Full Text Available The usage of vegetation indices such as the Normalized Difference Vegetation Index (NDVI), calculated by means of remote sensing data, is widespread for describing vegetation status at large spatial scales. However, a big limitation of these indices is their inadequate time resolution for agricultural purposes. This limitation could be overcome by ground-based vegetation indices, which could provide an interesting tool for integrating satellite-based values. In this work, three techniques to calculate the ground-NDVI have been evaluated for sugar beet cultivated in South Italy in all its phenological phases: the NDVI1 based on hand-made reflectance measurements, the NDVI2 calculated from automatically recorded reflectance measurements, and the broadband NDVIb based on Photosynthetically Active Radiation (PAR) and global radiation measurements. The best performance was obtained by the NDVIb. Moreover, crop-microclimate-NDVI relations were investigated. In particular, the relationship between NDVI and the Leaf Area Index (LAI) was found to be logarithmic, with a saturation of NDVI at LAI around 1.5 m2 m-2. A clear relation was found between NDVI and the crop coefficient Kc, experimentally determined as the ratio between actual and reference measured or modelled evapotranspirations, while the relation between NDVI and crop actual evapotranspiration was very weak and not usable for practical purposes. Lastly, no relationship between the microclimate and the NDVI was found.
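
    The NDVI itself and the reported logarithmic NDVI-LAI relation can be sketched as below; the reflectance and LAI values are synthetic and the fit is a generic least-squares illustration, not the study's regression.

```python
# Sketch of the two ground-based quantities discussed: NDVI from red/NIR
# reflectance and a logarithmic NDVI-LAI regression of the form reported
# (NDVI saturating as LAI grows). Reflectance and LAI values are synthetic.
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red)

# Synthetic paired observations over a growing season.
lai = np.array([0.2, 0.4, 0.7, 1.0, 1.3, 1.6, 2.0, 2.5])          # m2 m-2
ndvi_obs = np.array([0.25, 0.38, 0.52, 0.62, 0.68, 0.72, 0.74, 0.75])

# Fit NDVI = a + b * ln(LAI) by ordinary least squares.
A = np.column_stack([np.ones_like(lai), np.log(lai)])
(a, b), *_ = np.linalg.lstsq(A, ndvi_obs, rcond=None)
print(f"NDVI ~ {a:.2f} + {b:.2f} ln(LAI)")
print("example NDVI from reflectances:", round(ndvi(red=0.05, nir=0.45), 3))
```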

  2. Partition-based Low Power DFT Methodology for System-on-chips

    Institute of Scientific and Technical Information of China (English)

    LI Yu-fei; CHEN Jian; FU Yu-zhuo

    2007-01-01

    This paper presents a partition-based Design-for-Test (DFT) technique to reduce the power consumption during scan-based testing. This method is based on partitioning the chip into several independent scan domains. By enabling the scan domains alternately, only a fraction of the entire chip will be active at the same time, leading to low power consumption during test. Therefore, it will significantly reduce the possibility of electromigration and overheating. In order to prevent a drop in fault coverage, wrappers on the boundaries between scan domains are employed. This paper also presents a detailed design flow based on Electronic Design Automation (EDA) tools from Synopsys to implement the proposed test structure. The proposed DFT method is demonstrated on a state-of-the-art System-on-Chip (SOC). The simulation results show a significant reduction in both average and peak power dissipation without sacrificing fault coverage or test time. This SOC has been taped out at TSMC and has finished final testing at ADVANTEST.

  3. Dielectric Resonator-Based Flow and Stopped-Flow EPR with Rapid Field Scanning: A Methodology for Increasing Kinetic Information

    Science.gov (United States)

    Sienkiewicz, Andrzej; Ferreira, Ana Maria da Costa; Danner, Birgit; Scholes, Charles P.

    1999-02-01

    We report methodology which combines recently developed dielectric resonator-based, rapid-mix, stopped-flow EPR (appropriate for small, aqueous, lossy samples) with rapid scanning of the external (Zeeman) magnetic field where the scanning is preprogrammed to occur at selected times after the start of flow. This methodology gave spectroscopic information complementary to that obtained by stopped-flow EPR at single fields, and with low reactant usage, it yielded more graphic insight into the time evolution of radical and spin-labeled species. We first used the ascorbyl radical as a test system where rapid scans triggered after flow was stopped provided "snapshots" of simultaneously evolving and interacting radical species. We monitored ascorbyl radical populations either as brought on by biologically damaging peroxynitrite oxidant or as chemically and kinetically interacting with a spectroscopically overlapping nitroxide radical. In a different biophysical application, where a spin-label lineshape reflected rapidly changing molecular dynamics of folding spin-labeled protein, rapid scan spectra were taken during flow with different flow rates and correspondingly different times after the mixing-induced inception of protein folding. This flow/rapid scan method is a means for monitoring early immobilization of the spin probe in the course of the folding process.

  4. An optimum design on rollers containing the groove with changeable inner diameter based on response surface methodology

    Directory of Open Access Journals (Sweden)

    Xi Zhao

    2016-05-01

    Full Text Available In order to realize the precision plastic forming of a revolving body component with changeable wall thickness, a kind of roller containing grooves with a changeable inner diameter is put forward as the forming mould of the rolling-extrusion technology. Specifically, first, the arc length of the groove in the roller is designed according to the predicted forward slip value during forming, to accurately control the actual length of the forming segments; then, to obtain better parameters of the roller structure, a second-order response surface model combining finite element numerical simulation and response surface methodology was put forward, taking forming uniformity as the evaluation index. The result of the experiment shows that, for the formed component, not only does the size meet the requirements but each mechanical property index is also greatly improved, which verifies the rationality of the forward slip model and the structural parameters of the optimized model based on the response surface methodology.

  5. Information Technology Service Management (ITSM Implementation Methodology Based on Information Technology Infrastructure Library Ver.3 (ITIL V3

    Directory of Open Access Journals (Sweden)

    Mostafa Mohamed AlShamy

    2012-06-01

    Full Text Available This paper covers the concept of the IT Infrastructure Library (ITIL) v3 and how to implement it in order to increase the efficiency of any Egyptian IT organization, to help the organization's employees do their work easily, and to let its clients feel the quality of the services provided to them. ITIL is now considered the de facto standard for IT Service Management (ITSM) in organizations which operate their business based on IT infrastructure and services. ITIL v3 has been implemented in western organizations but is still a new framework for the Egyptian and Arabian environment. The best proof of the lack of ITSM in the Arab region, and not Egypt alone, is that the percentage of companies holding ISO/IEC 20000 is less than 2% of the total certified companies in the whole world, and in Egypt no company has held it until now, as stated on the APMG ISO/IEC 20000 website [1]. Accordingly, this paper investigates an implementation methodology for ITIL in an Egyptian organization, taking into consideration the cultural factors and how they will affect the success of this implementation. We have already implemented this methodology in three Egyptian companies and it succeeded in increasing the level of process maturity from level one to level four according to the PMF.

  6. Novel integrative methodology for engineering large liver tissue equivalents based on three-dimensional scaffold fabrication and cellular aggregate assembly.

    Science.gov (United States)

    Pang, Y; Horimoto, Y; Sutoko, S; Montagne, K; Shinohara, M; Mathiue, D; Komori, K; Anzai, M; Niino, T; Sakai, Yasuyuki

    2016-01-01

    A novel engineering methodology for organizing a large liver tissue equivalent was established by integrating both 'top down' and 'bottom up' approaches. A three-dimensional (3D) scaffold was engineered comprising 43 culture chambers (volume: 11.63 cm3) assembled in a symmetrical pattern on 3 layers, a design which enables further scaling up of the device to a clinically significant size (volume: 500 cm3). In addition, an inter-connected flow channel network was designed and proved to homogeneously deliver culture medium to each chamber with the same pressure drop. After fabrication using nylon-12 and a selective laser sintering process, co-cultured cellular aggregates of human hepatoma Hep G2 and TMNK-1 cells were loosely packed into the culture chambers with biodegradable poly-L-lactic acid fibre pieces for 9 days of perfusion culture. The device enabled increased hepatic function and well-maintained cell viability, demonstrating the importance of an independent medium flow supply for cell growth and function provided by the current 3D scaffold. This integrative methodology from the macro- to the micro-scale provides an efficient way of arranging engineered liver tissue with improved mass transfer, making it possible to further scale up to a construct with clinically relevant size while maintaining high per-volume-based physiological function in the near future. PMID:27579855

  7. Final Report, Nuclear Energy Research Initiative (NERI) Project: An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model

    International Nuclear Information System (INIS)

    OAK (B204) Final Report, NERI Project: "An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model". The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called "group constants") in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations
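
    For orientation, a minimal one-group, one-dimensional statement of the quasidiffusion (Eddington-factor) low-order system is sketched below in standard notation; it is not quoted from the project report, and it reduces to ordinary diffusion when the Eddington factor equals 1/3.

```latex
% One-group, steady-state, 1-D quasidiffusion (low-order) equations, sketched in
% standard notation; E(x) is the Eddington factor computed from a higher-order
% transport sweep. Not taken verbatim from the project report.
\begin{align}
  \frac{dJ(x)}{dx} + \Sigma_a(x)\,\phi(x) &= Q(x), \\
  \frac{d}{dx}\bigl[E(x)\,\phi(x)\bigr] + \Sigma_{tr}(x)\,J(x) &= 0,
  \qquad E(x) = \frac{\int_{-1}^{1}\mu^2\,\psi(x,\mu)\,d\mu}{\int_{-1}^{1}\psi(x,\mu)\,d\mu}.
\end{align}
% With a nearly isotropic angular flux, E -> 1/3 and the system reduces to
% standard diffusion with J = -(1/(3\Sigma_{tr}))\, d\phi/dx.
```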

  8. K-Targeted Metabolomic Analysis Extends Chemical Subtraction to DESIGNER Extracts: Selective Depletion of Extracts of Hops (Humulus lupulus)⊥

    OpenAIRE

    Ramos Alvarenga, René F.; Friesen, J. Brent; Nikolić, Dejan; Simmler, Charlotte; Napolitano, José G.; van Breemen, Richard; Lankin, David C.; McAlpine, James B.; Pauli, Guido F.; Chen, Shao-Nong

    2014-01-01

    This study introduces a flexible and compound targeted approach to Deplete and Enrich Select Ingredients to Generate Normalized Extract Resources, generating DESIGNER extracts, by means of chemical subtraction or augmentation of metabolites. Targeting metabolites based on their liquid–liquid partition coefficients (K values), K targeting uses countercurrent separation methodology to remove single or multiple compounds from a chemically complex mixture, according to the following equation: DES...

  9. Improved base-collector depletion charge and capacitance model for SiGe HBT on thin-film SOI

    Institute of Scientific and Technical Information of China (English)

    徐小波; 张鹤鸣; 胡辉勇

    2011-01-01

    The SiGe heterojunction bipolar transistor (HBT) on thin-film SOI is successfully integrated with SOI CMOS by a "folded collector". This paper deals with the base-collector depletion charge and capacitance of this structure. Based on our previous work and on the actual operating conditions of the device, the depletion charge and capacitance models are extended and optimized. The results show that the charge model is smoother, and that the capacitance model, which considers the different current flow areas, consists of the vertical and lateral depletion capacitances in series; compared with a bulk HBT, the SOI device in the fully depleted mode of operation therefore exhibits a smaller collector-base depletion capacitance and hence a larger forward Early voltage. At the bias point where operation changes from the vertical to the lateral mode, the trend of the depletion charge and capacitance changes. The charge and capacitance vary with increasing reverse collector-base bias. This collector depletion charge and capacitance model provides a valuable reference for the design and simulation of SOI SiGe HBT electrical parameters, such as Early voltage and transit frequency, in millimeter-wave SOI BiCMOS technology, including the latest 0.13 μm SOI BiCMOS process.
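
    The series structure of the model can be illustrated numerically: a bias-dependent depletion capacitance for each junction face (a one-sided abrupt junction is assumed here) combined in series, with different areas for the vertical and lateral components. All parameter values in the sketch are placeholders, not the paper's extraction.

```python
# Numerical illustration of the structure of the model described above: a bias-
# dependent depletion capacitance per junction face (one-sided abrupt junction
# assumed) and the series combination of the vertical and lateral components
# with their different current-flow areas. All parameter values are
# placeholders, not the paper's extraction.
import math

EPS_SI = 1.04e-10      # F/m, permittivity of silicon
Q = 1.602e-19          # C, elementary charge

def depletion_capacitance(area_m2, doping_m3, vbi, v_cb):
    """Depletion capacitance of a reverse-biased one-sided abrupt junction."""
    w = math.sqrt(2.0 * EPS_SI * (vbi + v_cb) / (Q * doping_m3))
    return EPS_SI * area_m2 / w

def collector_base_capacitance(v_cb):
    c_vertical = depletion_capacitance(area_m2=1.0e-12, doping_m3=1.0e23,
                                       vbi=0.7, v_cb=v_cb)
    c_lateral = depletion_capacitance(area_m2=0.3e-12, doping_m3=1.0e23,
                                      vbi=0.7, v_cb=v_cb)
    # Series combination of the vertical and lateral depletion capacitances.
    return c_vertical * c_lateral / (c_vertical + c_lateral)

for v in (0.0, 1.0, 2.0, 3.0):
    print(f"V_CB = {v:.1f} V -> C_BC = {collector_base_capacitance(v)*1e15:.2f} fF")
```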

  10. A methodology toward manufacturing grid-based virtual enterprise operation platform

    Science.gov (United States)

    Tan, Wenan; Xu, Yicheng; Xu, Wei; Xu, Lida; Zhao, Xianhua; Wang, Li; Fu, Liuliu

    2010-08-01

    Virtual enterprises (VEs) have become one of main types of organisations in the manufacturing sector through which the consortium companies organise their manufacturing activities. To be competitive, a VE relies on the complementary core competences among members through resource sharing and agile manufacturing capacity. Manufacturing grid (M-Grid) is a platform in which the production resources can be shared. In this article, an M-Grid-based VE operation platform (MGVEOP) is presented as it enables the sharing of production resources among geographically distributed enterprises. The performance management system of the MGVEOP is based on the balanced scorecard and has the capacity of self-learning. The study shows that a MGVEOP can make a semi-automated process possible for a VE, and the proposed MGVEOP is efficient and agile.

  11. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable for different engineering applications, and few of those are available in a not yet polymerised form suitable for resin infusion. The proper selection of a new resin system among these thermoplastic polymers is a concern for manufacturers in the current scenario and a special mathematical tool would be beneficial. In this paper, the authors introduce a new decision making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today’s market. An illustrative example—resin selection...
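
    A generic attribute-based scoring step of the kind such a tool might use can be sketched as a normalized weighted sum over candidate resins; the attribute names, weights and values below are illustrative placeholders and do not reproduce the article's actual data or its specific decision-making formulation.

```python
# Generic sketch of an attribute-based selection step: candidate resins scored by
# a normalized, weighted sum over significant attributes. Attribute names, weights
# and values are illustrative placeholders, not the article's actual data or its
# specific decision-making formulation.
candidates = {
    # attribute values: (melt viscosity [Pa*s, lower better], Tg [C, higher better],
    #                    tensile modulus [GPa, higher better], cost [eur/kg, lower better])
    "resin_A": (50.0, 110.0, 3.0, 4.5),
    "resin_B": (20.0, 80.0, 2.4, 3.0),
    "resin_C": (35.0, 150.0, 3.4, 7.0),
}
weights = (0.35, 0.25, 0.25, 0.15)
higher_is_better = (False, True, True, False)

def normalized_score(values_by_candidate, weights, higher_is_better):
    names = list(values_by_candidate)
    cols = list(zip(*values_by_candidate.values()))
    scores = {n: 0.0 for n in names}
    for j, col in enumerate(cols):
        lo, hi = min(col), max(col)
        for n, v in zip(names, col):
            x = (v - lo) / (hi - lo) if hi > lo else 1.0
            if not higher_is_better[j]:
                x = 1.0 - x
            scores[n] += weights[j] * x
    return scores

ranking = sorted(normalized_score(candidates, weights, higher_is_better).items(),
                 key=lambda kv: kv[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```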

  12. A grammar based methodology for structural motif finding in ncRNA database search.

    Science.gov (United States)

    Quest, Daniel; Tapprich, William; Ali, Hesham

    2007-01-01

    In recent years, sequence database searching has been conducted through local alignment heuristics, pattern-matching, and comparison of short statistically significant patterns. While these approaches have unlocked many clues as to sequence relationships, they are limited in that they do not provide context-sensitive searching capabilities (e.g. considering pseudoknots, protein binding positions, and complementary base pairs). Stochastic grammars (hidden Markov models HMMs and stochastic context-free grammars SCFG) do allow for flexibility in terms of local context, but the context comes at the cost of increased computational complexity. In this paper we introduce a new grammar based method for searching for RNA motifs that exist within a conserved RNA structure. Our method constrains computational complexity by using a chain of topology elements. Through the use of a case study we present the algorithmic approach and benchmark our approach against traditional methods.

  13. A Model-Based Methodology for Integrated Design and Operation of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted;

    2015-01-01

    calculation of reactive bubble points. For an energy-efficient design, the driving-force approach (to determine the optimal feed location) for a reactive system has been employed. For both the reactive McCabe-Thiele and driving force method, vapor-liquid equilibrium data are based on elements. The reactive...... and resolved. A new approach is to tackle process intensification and controllability issues in an integrated manner, in the early stages of process design. This integrated and simultaneous synthesis approach provides optimal operation and more efficient control of complex intensified systems that suffice......-based method so that the complexity of the problem is reduced into a set of sub-problems that are solved sequentially. The method consists of four hierarchical stages: (1) pre-analysis, (2) steady state analysis, (3) dynamic analysis, and (4) evaluation stage. To illustrate the application of the proposed...

  14. Achieving process intensification form the application of a phenomena based synthesis, Design and intensification methodology

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Lutze, Philip; Woodley, John;

    Process Intensification/Process Systems Engineering. Process intensification (PI) is a means by which one can achieve a more efficient and sustainable chemical process. Major success in the area of PI has been achieved by Eastman chemicals [1], which in 1984 intensified the process for the manufacture of methyl acetate by replacing with one single reactive distillation column the multi-step process which consisted of one reactor, extractive distillation, liquid-liquid separation and azeotropic distillation. However, except for reactive distillation and dividing wall columns, the implementation of PI still faces challenges [2] because the identification and design of intensified processes is not simple [3]. Lutze et al. [3] have developed a systematic PI synthesis/design method at the unit operations (Unit-Ops) level, where the search space is based on a knowledge-base of existing PI equipment...

  15. A Trajectory-Oriented, Carriageway-Based Road Network Data Model, Part 2: Methodology

    Institute of Scientific and Technical Information of China (English)

    LI Xiang; LIN Hui

    2006-01-01

    This is the second of a three-part series of papers which presents the principle and architecture of the CRNM, a trajectory-oriented, carriageway-based road network data model. The first part of the series introduced the general background of building trajectory-oriented road network data models, including motivation, related works, and basic concepts. Building on it, this paper describes the CRNM in detail. First, the notion of a basic roadway entity is proposed and discussed. Second, the carriageway is selected as the basic roadway entity after being compared with other kinds of roadways, and approaches to representing other roadways with carriageways are introduced. Finally, an overall architecture of the CRNM is proposed.
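
    A minimal data-model sketch of the central idea, the carriageway as the basic roadway entity with trajectories referenced to it by linear position, is given below; the class and field names are hypothetical and are not the CRNM schema.

```python
# Minimal data-model sketch of the idea described above: the carriageway as the
# basic roadway entity, with trajectory points referenced to carriageways by
# linear position. Class and field names are hypothetical, not the CRNM schema.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Carriageway:
    carriageway_id: str
    centerline: List[Tuple[float, float]]       # ordered (x, y) vertices
    direction_of_travel: str                    # e.g. "forward" or "backward"
    lane_count: int

@dataclass
class TrajectoryPoint:
    carriageway_id: str
    offset_m: float                             # linear position along centerline
    timestamp: float

@dataclass
class RoadNetwork:
    carriageways: List[Carriageway] = field(default_factory=list)

    def find(self, carriageway_id: str) -> Carriageway:
        return next(c for c in self.carriageways if c.carriageway_id == carriageway_id)

net = RoadNetwork([Carriageway("CW-1", [(0.0, 0.0), (120.0, 5.0)], "forward", 2)])
point = TrajectoryPoint("CW-1", offset_m=37.5, timestamp=1.0)
print(net.find(point.carriageway_id).lane_count)
```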

  16. Methodology and theory evaluation of overall equipment effectiveness based on market

    OpenAIRE

    Anvari, Farhad; EDWARDS, Rodger; Starr, Andrew G.

    2010-01-01

    Purpose - Continuous manufacturing systems used within the steel industry involve different machines and processes that are arranged in a sequence of operations in order to manufacture the products. The steel industry is generally a capital-intensive industry and, because of high capital investment, the utilisation of equipment as effectively as possible is of high priority. This paper seeks to illustrate a new method, overall equipment effectiveness market-based (OEE-MB) for the precise calc...
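
    The abstract names OEE-MB but does not give its formula, so the sketch below shows only the conventional OEE calculation (availability x performance x quality) that a market-based variant would build on; all numbers are placeholders.

```python
# Baseline sketch of the classic overall equipment effectiveness (OEE) calculation
# (availability x performance x quality). The market-based variant (OEE-MB) named
# in the abstract is not specified there, so only the conventional formulation is
# shown; all numbers are placeholders.
def oee(planned_time_h, downtime_h, ideal_cycle_time_s, total_count, good_count):
    operating_time_h = planned_time_h - downtime_h
    availability = operating_time_h / planned_time_h
    performance = (ideal_cycle_time_s * total_count) / (operating_time_h * 3600.0)
    quality = good_count / total_count
    return availability * performance * quality, (availability, performance, quality)

value, (a, p, q) = oee(planned_time_h=8.0, downtime_h=0.8,
                       ideal_cycle_time_s=30.0, total_count=780, good_count=742)
print(f"availability={a:.2f} performance={p:.2f} quality={q:.2f} OEE={value:.2f}")
```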

  17. CONTRIBUTIONS REFERRING TO A TERRITORIAL MODEL FOR DEVELOPMENT OF THE ECONOMY BASED ON KNOWLEGDE (METHODOLOGICAL FRAMEWORK)

    OpenAIRE

    Mihail Dumitrescu; Lavinia Ţoţan

    2007-01-01

    This paper presents a short evolution of the concepts leading up to the knowledge economy. It outlines the philosophy of the research, which contains: the objectives of the projects, the changes in the preparation levels of human resources, and the evolution of informational processes. The model also contains the determination of the indicators characterizing the knowledge economy at the territorial level, with the computing relations, as well as the evaluation on statistical bases.

  18. A Methodological Review of Piezoelectric Based Acoustic Wave Generation and Detection Techniques for Structural Health Monitoring

    OpenAIRE

    Zhigang Sun; Bruno Rocha; Kuo-Ting Wu; Nezih Mrad

    2013-01-01

    Piezoelectric transducers have a long history of applications in nondestructive evaluation of material and structure integrity owing to their ability of transforming mechanical energy to electrical energy and vice versa. As condition based maintenance has emerged as a valuable approach to enhancing continued aircraft airworthiness while reducing the life cycle cost, its enabling structural health monitoring (SHM) technologies capable of providing on-demand diagnosis of the structure without i...

  19. A systematic literature review of methodology used to measure effectiveness in digital game-based learning

    OpenAIRE

    All, Anissa; Nunez Castellar, Elena Patricia; Van Looy, Jan

    2013-01-01

    In recent years, a growing number of studies is being conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, however, it remains difficult to draw general conclusions due to the disparities in methods and reporting. Guidelines or a standardized procedure for conducting DGBL effectiveness research would allow to compare results across studies and provide well-founded and more generalizable evidence for the impact of DGBL. This study presents a fi...

  20. A GIS-based methodology for safe site selection of a building in a hilly region

    OpenAIRE

    Satish Kumar; Bansal, V. K.

    2016-01-01

    Worker safety during construction is widely accepted, but the selection of safe sites for a building is generally not considered. Safe site selection (SSS) largely depends upon compiling, analyzing, and refining the information of an area where a building is likely to be located. The locational and topographical aspects of an area located in hilly regions play a major role in SSS, but are generally neglected in traditional and CAD-based systems used for site selection. Architects and engineer...

  1. Tracking Environmental Compliance and Remediation Trajectories Using Image-Based Anomaly Detection Methodologies

    Directory of Open Access Journals (Sweden)

    James K. Lein

    2011-11-01

    Full Text Available Recent interest in use of satellite remote sensing for environmental compliance and remediation assessment has been heightened by growing policy requirements and the need to provide more rapid and efficient monitoring and enforcement mechanisms. However, remote sensing solutions are attractive only to the extent that they can deliver environmentally relevant information in a meaningful and time-sensitive manner. Unfortunately, the extent to which satellite-based remote sensing satisfies the demands for compliance and remediation assessment under the conditions of an actual environmental accident or calamity has not been well documented. In this study a remote sensing solution to the problem of site remediation and environmental compliance assessment was introduced based on the use of the RDX anomaly detection algorithm and vegetation indices developed from the Tasseled Cap Transform. Results of this analysis illustrate how the use of standard vegetation transforms, integrated into an anomaly detection strategy, enable the time-sequenced tracking of site remediation progress. Based on these results credible evidence can be produced to support compliance evaluation and remediation assessment following major environmental disasters.
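
    A generic sketch of an image-based anomaly detector of the RX (Reed-Xiaoli) type, scoring each pixel by its Mahalanobis distance from scene-wide band statistics, is given below; the abstract's RDX algorithm and Tasseled Cap inputs are not specified in detail, so the band data are synthetic placeholders.

```python
# Generic sketch of an image-based anomaly detector of the RX (Reed-Xiaoli) type:
# each pixel's Mahalanobis distance from the scene-wide band statistics is used
# as an anomaly score. The abstract's "RDX" algorithm and Tasseled Cap inputs are
# not specified in detail, so band data here are synthetic placeholders.
import numpy as np

def rx_anomaly_scores(cube):
    """cube: (rows, cols, bands) stack, e.g. Tasseled Cap brightness/greenness/wetness."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.pinv(cov)                # pinv guards against singular covariance
    diff = pixels - mean
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return scores.reshape(rows, cols)

rng = np.random.default_rng(0)
scene = rng.normal(0.0, 1.0, (100, 100, 3))
scene[40:45, 60:65, :] += 4.0                    # implant a small disturbed patch
scores = rx_anomaly_scores(scene)
threshold = np.percentile(scores, 99.5)
print("anomalous pixels flagged:", int((scores > threshold).sum()))
```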

  2. Design-Based Research in Science Education: One Step Towards Methodology

    Directory of Open Access Journals (Sweden)

    Kalle Juuti

    2012-10-01

    Full Text Available Recently, there have been critiques of science education research, as the potential of this research has not been actualised in science teaching and learning praxis. The paper describes an analysis of the design-based research approach (DBR) that has been suggested as a solution for the discontinuity between science education research and praxis. We propose that a pragmatic frame helps to clarify the design-based research endeavour. We abstracted three aspects from the analysis that constitute design-based research: (a) a design process is essentially iterative, starting from the recognition of the change of the environment of praxis, (b) it generates a widely usable artefact, and (c) it provides educational knowledge for more intelligible praxis. In the knowledge acquisition process, the pragmatic viewpoint emphasises the role of a teacher’s reflected actions as well as the researchers’ involvement in the authentic teaching and learning settings.

  3. Putting Order into Our Universe: The Concept of Blended Learning—A Methodology within the Concept-based Terminology Framework

    Directory of Open Access Journals (Sweden)

    Joana Fernandes

    2016-06-01

    Full Text Available This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: blended learning. Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting order into our universe (Nuopponen, 2011). Concepts, as elements of the structure of knowledge (Sager, 1990), emerge as a complex research object. Can they be found in language? A concept-based approach to Terminology implies a clear-cut view of the role of language in terminological work: though language is postulated as being a fundamental tool to grasp, describe and organize knowledge, an isomorphic relationship between language and knowledge cannot be taken for granted. In other words, the foundational premise of a concept-based approach is that there is no one-to-one correspondence between atomic elements of knowledge and atomic elements of linguistic expression. This is why a methodological approach to Terminology merely based upon specialized text research is regarded as biased (Costa, 2013). As a consequence, we argue that interactional strategies between terminologist and domain expert deserve particular research attention. To our mind, the key to concept-based terminological work is to carry out a concept analysis of data gathered from specialised text corpora combined with an elicitation process of the tacit knowledge and concept-oriented discursive negotiation. Following such a view, we put forward a methodology to answer the question: how is blended learning defined in the Post-Bologna scenario? Even though there are numerous high-quality models and practical descriptions for its implementation (similarly to other concepts related to distance learning), the need to understand, demarcate and harmonize the concept of blended learning against the current Higher Education background results from the premise that

  4. Health and environmental impact of depleted uranium

    International Nuclear Information System (INIS)

    Depleted Uranium (DU) is 'nuclear waste' produced from the enrichment process; it consists mostly of 238U and is depleted in the fissionable isotope 235U compared to natural uranium (NU). Depleted uranium has about 60% of the radioactivity of natural uranium. Depleted uranium and natural uranium are identical in terms of chemical toxicity. Uranium's high density gives depleted uranium shells increased range and penetrative power. This density, combined with uranium's pyrophoric nature, results in a high-energy kinetic weapon that can punch and burn through armour plating. Striking a hard target, depleted uranium munitions create extremely high temperatures. The uranium immediately burns and vaporizes into an aerosol, which is easily diffused in the environment. People can inhale the micro-particles of uranium oxide in an aerosol and absorb them mainly through the lungs. Depleted uranium has both radiological toxicity and chemical toxicity; a possible synergistic effect of the two kinds of toxicity has also been pointed out. Animal and cellular studies have reported carcinogenic, neurotoxic, immunotoxic and other effects of depleted uranium, including damage to the reproductive system and foetus. In addition, health effects of micro/nano-particles similar in size to the depleted uranium aerosols produced by uranium weapons have been reported. Aerosolized DU dust can easily spread beyond the battlefield into civilian areas, sometimes even crossing international borders. Therefore, not only military personnel but also civilians can be exposed. The contamination continues after the cessation of hostilities. Taking these aspects into account, the DU weapon is illegal under international humanitarian law and is considered one of the inhumane weapons of 'indiscriminate destruction'. The international society is now discussing the prohibition of DU weapons based on the 'precautionary principle'. The 1991 Gulf War is reportedly the first

  5. Strain-Based Design Methodology of Large Diameter Grade X80 Linepipe

    Energy Technology Data Exchange (ETDEWEB)

    Lower, Mark D. [ORNL

    2014-04-01

    Continuous growth in energy demand is driving oil and natural gas production to areas that are often located far from major markets where the terrain is prone to earthquakes, landslides, and other types of ground motion. Transmission pipelines that cross this type of terrain can experience large longitudinal strains and plastic circumferential elongation as the pipeline experiences alignment changes resulting from differential ground movement. Such displacements can potentially impact pipeline safety by adversely affecting structural capacity and leak-tight integrity of the linepipe steel. Planning for new long-distance transmission pipelines usually involves consideration of higher strength linepipe steels because their use allows pipeline operators to reduce the overall cost of pipeline construction and increase pipeline throughput by increasing the operating pressure. The design trend for new pipelines in areas prone to ground movement has evolved over the last 10 years from a stress-based design approach to a strain-based design (SBD) approach to further realize the cost benefits of using higher strength linepipe steels. This report presents an overview of SBD for pipelines subjected to large longitudinal strain and high internal pressure, with emphasis on the tensile strain capacity of high-strength microalloyed linepipe steel. The technical basis for this report involved engineering analysis and examination of the mechanical behavior of Grade X80 linepipe steel in both the longitudinal and circumferential directions. Testing was conducted to assess the effects of material processing, including as-rolled, expanded, and heat-treated conditions intended to simulate coating application. Elastic-plastic and low-cycle fatigue analyses were also performed with varying internal pressures. Proposed SBD models discussed in this report are based on classical plasticity theory and account for material anisotropy, triaxial strain, and microstructural damage effects.

  6. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    OpenAIRE

    Zaretsky V. E.; Hendricks C. R.; Soditus S.

    2003-01-01

    The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The...

  7. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    OpenAIRE

    José R. Casar; Josué Iglesias; Bernardos, Ana M.; Iván Corredor

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diver...

  8. Analysis of current and alternative phenol based RNA extraction methodologies for cyanobacteria

    Directory of Open Access Journals (Sweden)

    Lindblad Peter

    2009-08-01

    Full Text Available Abstract Background The validity and reproducibility of gene expression studies depend on the quality of extracted RNA and the degree of genomic DNA contamination. Cyanobacteria are gram-negative prokaryotes that synthesize chlorophyll a and carry out photosynthetic water oxidation. These organisms possess an extended array of secondary metabolites that impair cell lysis, presenting particular challenges when it comes to nucleic acid isolation. Therefore, we used the NHM5 strain of Nostoc punctiforme ATCC 29133 to compare and improve existing phenol based chemistry and procedures for RNA extraction. Results With this work we identify and explore strategies for improved and lower cost high quality RNA isolation from cyanobacteria. All the methods studied are suitable for RNA isolation and its use for downstream applications. We analyse different Trizol based protocols, introduce procedural changes and describe an alternative RNA extraction solution. Conclusion It was possible to improve purity of isolated RNA by modifying protocol procedures. Further improvements, both in RNA purity and experimental cost, were achieved by using a new extraction solution, PGTX.

  9. Machine learning based methodology to identify cell shape phenotypes associated with microenvironmental cues.

    Science.gov (United States)

    Chen, Desu; Sarkar, Sumona; Candia, Julián; Florczyk, Stephen J; Bodhak, Subhadip; Driscoll, Meghan K; Simon, Carl G; Dunkers, Joy P; Losert, Wolfgang

    2016-10-01

    Cell morphology has been identified as a potential indicator of stem cell response to biomaterials. However, determination of cell shape phenotype in biomaterials is complicated by heterogeneous cell populations, microenvironment heterogeneity, and multi-parametric definitions of cell morphology. To associate cell morphology with cell-material interactions, we developed a shape phenotyping framework based on support vector machines. A feature selection procedure was implemented to select the most significant combination of cell shape metrics to build classifiers with both accuracy and stability to identify and predict microenvironment-driven morphological differences in heterogeneous cell populations. The analysis was conducted at a multi-cell level, where a "supercell" method used average shape measurements of small groups of single cells to account for heterogeneous populations and microenvironment. A subsampling validation algorithm revealed the range of supercell sizes and sample sizes needed for classifier stability and generalization capability. As an example, the responses of human bone marrow stromal cells (hBMSCs) to fibrous vs flat microenvironments were compared on day 1. Our analysis showed that 57 cells (grouped into supercells of size 4) are the minimum needed for phenotyping. The analysis identified that a combination of minor axis length, solidity, and mean negative curvature were the strongest early shape-based indicator of hBMSCs response to fibrous microenvironment. PMID:27449947
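
    A minimal sketch of the "supercell" idea, assuming scikit-learn is available: single-cell shape metrics are averaged over random groups of four cells sharing a microenvironment label before being fed to an SVM classifier. Metric names, group size and kernel settings are illustrative, not the authors' exact configuration.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def make_supercells(features, labels, size=4, rng=None):
        """Average single-cell shape metrics over random groups of `size` cells
        drawn from the same microenvironment (label)."""
        rng = rng or np.random.default_rng(0)
        X, y = [], []
        for lab in np.unique(labels):
            idx = rng.permutation(np.where(labels == lab)[0])
            for start in range(0, len(idx) - size + 1, size):
                X.append(features[idx[start:start + size]].mean(axis=0))
                y.append(lab)
        return np.array(X), np.array(y)

    def supercell_accuracy(features, labels, size=4):
        """Cross-validated accuracy of an SVM on supercell-averaged metrics.
        features: (n_cells, n_metrics), e.g. minor axis length, solidity,
        mean negative curvature; labels: 0 = flat substrate, 1 = fibrous scaffold."""
        X, y = make_supercells(features, labels, size=size)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        return cross_val_score(clf, X, y, cv=5).mean()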

  10. The Soft Systems Methodology Based Analysis Model in the Development of Selfmanaging Information Systems

    Directory of Open Access Journals (Sweden)

    Sa'Adah Hassan

    2012-01-01

    Full Text Available Problem statement: In order to be able to manage its own information within its operating environment with minimal human intervention, a self-managing information system ought to identify and make use of information from the resources available in its environment. The development of requirements for self-managing information systems should start with an appropriate analysis model that can explicitly show the collaborating entities in the environment. The traditional forms of analysis in systems development approaches have not focused on computing systems environments and ways to identify environment resources. Approach: This study discusses the analysis activities in the development of self-managing information systems. Results: We propose an SSM-based analysis model, which is able to examine the requirements for self-managing information systems. We describe the analysis of one particular system, an inventory management system, and illustrate how this system fulfils certain desired self-managing properties. Conclusion: The SSM-based analysis model is able to address the actuation capabilities of the systems and considers internal and external resources in the environment. The analysis model not only takes into account the information from the environment but is also able to provide support in determining the requirements for self-managing properties.

  11. The Narrative-Based Enhancement Methodology for the Cultural Property and the Cultural Institution

    Directory of Open Access Journals (Sweden)

    Kastytis Rudokas

    2013-10-01

    Full Text Available The paper addresses the contemporary conception of place marketing and image, and the impact of multiple identities on the cultural institution of a city. The first part of the paper draws on Clare A. Gunn's well-known theory of two possible perceptions of the postmodern place image. The author points out that the cultural value of an object is conditional and depends on communicational strategies and community needs. As an example of introducing identity to a place, the case of the Berlin Platten is taken, where a creative society is creating a new public image for multi-dwelling buildings. Basketball Club Žalgiris is taken as a Lithuanian example of an image that emerges from deep in the past; in this case, the author shows how the Club constructs its narrative by manipulating historical facts in the present. In the last part of the paper, it is argued that a rapidly changing society leads to the abstract valuation of culture, which is also based on a wider communicational context. In conclusion, the author points out that processes of identity construction will provide the background for cultural and economic rivalry between cities of the world.

  12. A new methodology based on littoral community cartography dominated by macroalgae for the implementation of the European Water Framework Directive.

    Science.gov (United States)

    Ballesteros, Enric; Torras, Xavier; Pinedo, Susana; García, María; Mangialajo, Luisa; de Torres, Mariona

    2007-01-01

    Macroalgae are a key biological element for the assessment of ecological status in coastal waters within the frame of the European Water Framework Directive (WFD, 2000/60/EC). Here we propose a methodology for monitoring water quality based on the cartography of littoral and upper-sublittoral rocky-shore communities (CARLIT, in short). With the use of spatial databases, GIS, and available information about the value of rocky-shore communities as indicators of water quality, it is possible to obtain an environmental quality index representative of the ecological status of rocky coasts. This index, which fully meets the requirements of the WFD, is expressed as a ratio between the observed values in the sector of shore being assessed and the expected value in a reference condition zone with the same substrate and coastal morphology (Ecological Quality Ratio, EQR). The application of this index to the coast of Catalonia (north-western Mediterranean) is presented.

  13. New statistical methodology, mathematical models, and data bases relevant to the assessment of health impacts of energy technologies

    International Nuclear Information System (INIS)

    The present research develops new statistical methodology, mathematical models, and data bases of relevance to the assessment of health impacts of energy technologies, and uses these to identify, quantify, and predict adverse health effects of energy-related pollutants. Efforts are in five related areas: (1) evaluation and development of statistical procedures for the analysis of death rate data, disease incidence data, and large-scale data sets; (2) development of dose-response and demographic models useful in the prediction of the health effects of energy technologies; (3) application of our methods and models to analyses of the health risks of energy production; (4) a reanalysis of the Tri-State leukemia survey data, focusing on the relationship between myelogenous leukemia risk and diagnostic x-ray exposure; and (5) investigation of human birth weights as a possible early warning system for the effects of environmental pollution.

  14. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but it has not been practiced widely. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in the field. This paper briefly demonstrates the methodology, applications, and limitations of PBPK modeling, with special attention to the use of PBPK models in pediatric drug development, and describes some examples in detail. Although PBPK models do have some limitations, the potential benefit of the PBPK modeling technique is substantial. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide (perhaps most importantly) data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, and potentially to help clinical trials become more "confirmatory" rather than "exploratory".

  15. Development of the point-depletion code DEPTH

    International Nuclear Information System (INIS)

    Highlights: ► The DEPTH code has been developed for large-scale depletion systems. ► DEPTH uses data libraries that are convenient to couple with MC codes. ► TTA and matrix exponential methods are implemented and compared. ► DEPTH is able to calculate integral quantities based on the matrix inverse. ► Code-to-code comparisons prove the accuracy and efficiency of DEPTH. -- Abstract: Burnup analysis is an important aspect of reactor physics, and is generally done by coupling transport calculations with point-depletion calculations. DEPTH is a newly developed point-depletion code for handling large burnup depletion systems and detailed depletion chains. For better coupling with Monte Carlo transport codes, DEPTH uses data libraries based on the combination of ORIGEN-2 and ORIGEN-S and allows users to assign problem-dependent libraries for each depletion step. DEPTH implements various algorithms for treating stiff depletion systems, including Transmutation Trajectory Analysis (TTA), the Chebyshev Rational Approximation Method (CRAM), the Quadrature-based Rational Approximation Method (QRAM) and the Laguerre Polynomial Approximation Method (LPAM). Three modes are supported by DEPTH to execute decay, constant-flux and constant-power calculations. In addition to obtaining instantaneous quantities such as radioactivity, decay heat and reaction rates, DEPTH is able to calculate integral quantities with a time-integrated solver. Through calculations compared against ORIGEN-2, the validity of DEPTH in point-depletion calculations is demonstrated. The accuracy and efficiency of the depletion algorithms are also discussed. In addition, an actual pin-cell burnup case is calculated to illustrate DEPTH's performance in coupling with the RMC Monte Carlo code.
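
    For orientation, a minimal sketch of the single-step matrix-exponential depletion solve that point-depletion codes of this kind perform (not DEPTH's own implementation or data): the burnup matrix below uses a toy four-nuclide chain with invented one-group cross sections and flux, and the short-lived U-239 stage is lumped into Np-239 for brevity.

    import numpy as np
    from scipy.linalg import expm

    # Toy depletion chain: U-235 fission loss; U-238 capture -> Np-239 -> Pu-239.
    # All data are invented placeholders.
    phi = 3.0e14                                   # one-group flux, n/cm^2/s (assumed)
    sig_f_u235 = 500.0e-24                         # fission cross section, cm^2 (illustrative)
    sig_c_u238 = 2.7e-24                           # capture cross section, cm^2 (illustrative)
    lam_np239 = np.log(2.0) / (2.356 * 86400.0)    # Np-239 decay constant, 1/s

    # State vector order: [U-235, U-238, Np-239, Pu-239]
    A = np.array([
        [-sig_f_u235 * phi, 0.0,                0.0,        0.0],
        [0.0,               -sig_c_u238 * phi,  0.0,        0.0],
        [0.0,               sig_c_u238 * phi,   -lam_np239, 0.0],
        [0.0,               0.0,                lam_np239,  0.0],
    ])

    def deplete(n0, dt):
        """One constant-flux step of dN/dt = A N, i.e. N(t+dt) = expm(A*dt) @ N(t)."""
        return expm(A * dt) @ n0

    n0 = np.array([1.0e21, 2.0e22, 0.0, 0.0])      # atom densities, 1/cm^3 (illustrative)
    n_end = deplete(n0, 30.0 * 86400.0)            # one 30-day depletion step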

  16. Interparticle force based methodology for prediction of cohesive powder flow properties

    Science.gov (United States)

    Esayanur, Madhavan Sujatha Sarma

    The transport and handling of powders are key areas in the process industry that have a direct impact on the efficiency and/or the quality of the finished product. A lack of fundamental understanding of powder flow properties as a function of operating variables such as relative humidity and particle size, leading to problems such as arching, rat-holing and segregation, is one of the main causes of unscheduled down times in plant operation and loss of billions of dollars in revenues. Most of the current design strategies and characterization techniques for industrial powders are based on a continuum approach similar to the field of soil mechanics. Due to an increase in the complexity of the synthesis process and the reduction in powder size to the nanoscale, the surface properties and interparticle forces play a significant role in determining the flow characteristics. The use of ensemble techniques such as direct shear testing to characterize powders is no longer adequate due to a lack of understanding of the changes in the properties of powders as a function of the major operating variables such as relative humidity, temperature, etc. New instrumentation or techniques need to be developed to reliably characterize powder flow behavior. Simultaneously, the scalability of the current models to predict powder flow needs to be revisited. Specifically, this study focuses on the development of an interparticle force based model for predicting the unconfined yield strength of cohesive powders. To understand the role of interparticle forces in determining the strength of cohesive powders, the particle-scale interactions were characterized using Atomic Force Microscopy (AFM), contact angle, surface tension, and coefficient of friction. The bulk-scale properties such as unconfined yield strength, packing structure, and size of the shear zone were also investigated. It was determined that an interparticle force based model incorporating the effect of particle size and packing structure

  17. How can activity-based costing methodology be performed as a powerful tool to calculate costs and secure appropriate patient care?

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong

    2007-04-01

    Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential values of ABC methodology in health care are derived from the more accurate cost calculation compared to the traditional step-down costing, and the potentials to evaluate quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients with surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify the missing or inappropriate clinical procedures. We found that ABC methodology was able to accurately calculate costs and to identify several missing pre- and post-surgical nursing education activities in the course of treatment. PMID:17489499

  18. Dendritic cell-based vaccines in the setting of peripheral blood stem cell transplantation: CD34+ cell-depleted mobilized peripheral blood can serve as a source of potent dendritic cells.

    Science.gov (United States)

    Choi, D; Perrin, M; Hoffmann, S; Chang, A E; Ratanatharathorn, V; Uberti, J; McDonagh, K T; Mulé, J J

    1998-11-01

    We are investigating the use of tumor-pulsed dendritic cell (DC)-based vaccines in the treatment of patients with advanced cancer. In the current study, we evaluated the feasibility of obtaining both CD34+ hematopoietic stem/ progenitor cells (HSCs) and functional DCs from the same leukapheresis collection in adequate numbers for both peripheral blood stem cell transplantation (PBSCT) and immunization purposes, respectively. Leukapheresis collections of mobilized peripheral blood mononuclear cells (PBMCs) were obtained from normal donors receiving granulocyte colony-stimulating factor (G-CSF) (for allogeneic PBSCT) and from intermediate grade non-Hodgkin's lymphoma or multiple myeloma patients receiving cyclophosphamide plus G-CSF (for autologous PBSCT). High enrichment of CD34+ HSCs was obtained using an immunomagnetic bead cell separation device. After separation, the negative fraction of mobilized PBMCs from normal donors and cancer patients contained undetectable levels of CD34+ HSCs by flow cytometry. This fraction of cells was then subjected to plastic adherence, and the adherent cells were cultured for 7 days in GM-CSF (100 ng/ml) and interleukin 4 (50 ng/ml) followed by an additional 7 days in GM-CSF, interleukin 4, and tumor necrosis factor alpha (10 ng/ml) to generate DCs. Harvested DCs represented yields of 4.1+/-1.4 and 5.8+/-5.4% of the initial cells plated from the CD34+ cell-depleted mobilized PBMCs of normal donors and cancer patients, respectively, and displayed a high level expression of CD80, CD86, HLA-DR, and CD11c but not CD14. This phenotypic profile was similar to that of DCs derived from non-CD34+ cell-depleted mobilized PBMCs. DCs generated from CD34+ cell-depleted mobilized PBMCs elicited potent antitetanus as well as primary allogeneic T-cell proliferative responses in vitro, which were equivalent to DCs derived from non-CD34+ cell-depleted mobilized PBMCs. Collectively, these results demonstrate the feasibility of obtaining both DCs and

  19. Protective goals - concepts and pragmatic aspects based on the terminology and methodology of safety science

    International Nuclear Information System (INIS)

    Protective goals set the orientation of tasks and activities in the field of accident prevention. They have to be based on safety-science methods in order to develop from the conceptual idea to a practically feasible solution, using scientific methods to take into account the facts and capabilities of a situation and, proceeding from them, to find an efficient, rational and optimal pragmatic approach by way of various strategies or tactics. In this process, the activities of defining, informing, thinking and developing need proper terminology. Safety is the absence of danger; protection is the limitation of danger and the prevention of damage. So it is protection that is needed where danger is present, and risks have to be minimized. Riskology is a novel method of safety science, combining risk analysis and risk control into a systematic, practice-oriented concept. Applying this to the field of nuclear engineering, what has been achieved so far should receive new impulses. (orig.)

  1. A Methodological Review of Piezoelectric Based Acoustic Wave Generation and Detection Techniques for Structural Health Monitoring

    Directory of Open Access Journals (Sweden)

    Zhigang Sun

    2013-01-01

    Full Text Available Piezoelectric transducers have a long history of application in the nondestructive evaluation of material and structural integrity owing to their ability to transform mechanical energy to electrical energy and vice versa. As condition-based maintenance has emerged as a valuable approach to enhancing continued aircraft airworthiness while reducing life-cycle cost, its enabling structural health monitoring (SHM) technologies, capable of providing on-demand diagnosis of the structure without interrupting aircraft operation, are attracting increasing R&D efforts. Piezoelectric transducers play an essential role in these endeavors. This paper reviews a variety of ingenious ways in which piezoelectric transducers are used in today's SHM technologies as a means of generating and/or detecting diagnostic acoustic waves.

  2. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    Full Text Available In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, which is especially relevant when implementing the requirements of ISO 9001, ISO 14001 and other standards at an enterprise. Studying a management system on the basis of a systematic approach focuses on disclosing its integrative (i.e. systemic) qualities and on identifying the variety of relationships and mechanisms behind these qualities. This makes it possible to identify the causes of the real state of affairs and to explain successes and failures. An important aspect of a systematic approach to analyzing the effectiveness and efficiency of production control management is the multiplicity of "stakeholder" interests involved in the production process when forming operational goals and the ways to achieve them.

  3. Optimization-based methodology for wastewater treatment plant synthesis – a full scale retrofitting case study

    DEFF Research Database (Denmark)

    Bozkurt, Hande; Gernaey, Krist; Sin, Gürkan

    2015-01-01

    Existing wastewater treatment plants (WWTP) need retrofitting in order to better handle changes in the wastewater flow and composition, reduce operational costs as well as meet newer and stricter regulatory standards on the effluent discharge limits. In this study, we use an optimization based framework to manage the multi-criteria WWTP design/retrofit problem for domestic wastewater treatment. The design space (i.e. alternative treatment technologies) is represented in a superstructure, which is coupled with a database containing data for both performance and economics of the novel alternative technologies. The superstructure optimization problem is formulated as a Mixed Integer (non)Linear Programming problem and solved for different scenarios - represented by different objective functions and constraint definitions. A full-scale domestic wastewater treatment plant (265,000 PE) is used as a case study.
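
    A minimal sketch of the superstructure-selection idea, reduced to choosing one technology per treatment stage so that an illustrative nitrogen-removal target is met at minimum annualised cost. It assumes the PuLP package for the MILP formulation; stage names, technology data and the additive removal constraint are invented placeholders, not the study's actual model.

    import pulp

    # Illustrative superstructure: two stages, candidate technologies with
    # (annualised cost, nitrogen removal fraction). All numbers are invented.
    stages = {
        "secondary": {"CAS": (1.0e6, 0.40), "MBR": (1.8e6, 0.65)},
        "tertiary":  {"none": (0.0, 0.00), "MBBR": (0.6e6, 0.30)},
    }
    required_removal = 0.70  # assumed total nitrogen removal target

    prob = pulp.LpProblem("wwtp_superstructure", pulp.LpMinimize)
    y = {(s, t): pulp.LpVariable(f"y_{s}_{t}", cat="Binary")
         for s, techs in stages.items() for t in techs}

    # Objective: minimise total annualised cost of the selected technologies.
    prob += pulp.lpSum(y[s, t] * stages[s][t][0] for s, t in y)

    # Exactly one technology (possibly "none") per stage.
    for s, techs in stages.items():
        prob += pulp.lpSum(y[s, t] for t in techs) == 1

    # Simplified additive removal constraint across stages.
    prob += pulp.lpSum(y[s, t] * stages[s][t][1] for s, t in y) >= required_removal

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    chosen = [(s, t) for (s, t), var in y.items() if var.value() > 0.5]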

  4. Design Novel Model Reference Artificial Intelligence Based Methodology to Optimized Fuel Ratio in IC Engine

    Directory of Open Access Journals (Sweden)

    FarzinPiltan

    2013-08-01

    Full Text Available In this research, model reference fuzzy-based control is presented as a robust control for IC engines. The objective of the study is to design controllers for IC engines without knowledge of the uncertainty bounds or full dynamic information, by using a fuzzy model reference PD plus mass-of-air controller while improving the robustness of the PD plus mass-of-air control. A PD plus mass-of-air controller eliminates the effect of the mass of air and provides high accuracy in the presence of bounded disturbances/uncertainties, although this method also causes some oscillation. The fuzzy PD plus mass-of-air controller is proposed as a solution to the problems created by this instability. The method performs well in the presence of uncertainty.

  5. The pathogen- and incidence-based DALY approach: an appropriate [corrected] methodology for estimating the burden of infectious diseases.

    Directory of Open Access Journals (Sweden)

    Marie-Josée J Mangen

    Full Text Available In 2009, the European Centre for Disease Prevention and Control initiated the 'Burden of Communicable Diseases in Europe (BCoDE)' project to generate evidence-based and comparable burden-of-disease estimates of infectious diseases in Europe. The burden-of-disease metric used was the Disability-Adjusted Life Year (DALY), composed of years of life lost due to premature death (YLL) and due to disability (YLD). To better represent infectious diseases, a pathogen-based approach was used linking incident cases to sequelae through outcome trees. Health outcomes were included if an evidence-based causal relationship between infection and outcome was established. Life expectancy and disability weights were taken from the Global Burden of Disease Study and alternative studies. Disease progression parameters were based on literature. Country-specific incidence was based on surveillance data corrected for underestimation. Non-typhoidal Salmonella spp. and Campylobacter spp. were used for illustration. Using the incidence- and pathogen-based DALY approach, the total burden for Salmonella spp. and Campylobacter spp. was estimated at 730 DALYs and at 1,780 DALYs per year in the Netherlands (average of 2005-2007). Sequelae accounted for 56% and 82% of the total burden of Salmonella spp. and Campylobacter spp., respectively. The incidence- and pathogen-based DALY methodology allows, in the case of infectious diseases, a more comprehensive calculation of the disease burden, as subsequent sequelae are fully taken into account. Not considering subsequent sequelae would strongly underestimate the burden of infectious diseases. Estimates can be used to support prioritisation and comparison of infectious diseases and other health conditions, both within a country and between countries.
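
    A minimal sketch of the incidence- and pathogen-based DALY aggregation over an outcome tree; all incidences, conditional probabilities, disability weights, durations, case fatalities and residual life expectancies below are invented placeholders rather than BCoDE parameter values.

    # Toy outcome tree for one pathogen: probability given infection, disability
    # weight, duration (years), case fatality, residual life expectancy (years).
    outcomes = [
        {"name": "gastroenteritis",    "p": 1.00, "dw": 0.10, "dur": 0.02, "cfr": 0.0005, "le": 30.0},
        {"name": "reactive arthritis", "p": 0.02, "dw": 0.21, "dur": 0.60, "cfr": 0.0,    "le": 0.0},
    ]

    def daly(incident_cases, outcomes):
        """DALY = YLD + YLL aggregated over the outcome tree for one year of incidence."""
        yld = sum(incident_cases * o["p"] * o["dw"] * o["dur"] for o in outcomes)
        yll = sum(incident_cases * o["p"] * o["cfr"] * o["le"] for o in outcomes)
        return yld + yll

    # incident_cases would be surveillance counts corrected for under-ascertainment.
    total_daly = daly(incident_cases=50_000, outcomes=outcomes)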

  6. Uranium resource assessment by the Geological Survey; methodology and plan to update the national resource base

    Science.gov (United States)

    Finch, Warren Irvin; McCammon, Richard B.

    1987-01-01

    Based on the Memorandum of Understanding (MOU) of September 20, 1984, between the U.S. Geological Survey of the U.S. Department of the Interior and the Energy Information Administration (EIA) of the U.S. Department of Energy (DOE), the U.S. Geological Survey began to make estimates of the undiscovered uranium endowment of selected areas of the United States in 1985. A modified NURE (National Uranium Resource Evaluation) method will be used in place of the standard NURE method of the DOE that was used for the national assessment reported in October 1980. The modified method, here named the 'deposit-size-frequency' (DSF) method, is presented for the first time, and calculations by the two methods are compared using an illustrative example based on preliminary estimates for the first area to be evaluated under the MOU. The results demonstrate that the estimate of the endowment using the DSF method is significantly larger and more uncertain than the estimate obtained by the NURE method. We believe that the DSF method produces a more realistic estimate because the principal factor estimated in the endowment equation is disaggregated into more parts and is more closely tied to specific geologic knowledge than in the NURE method. The DSF method consists of modifying the standard NURE estimation equation, U = A × F × T × G, by replacing the factors F × T by a single factor that represents the tonnage for the total number of deposits in all size classes. Use of the DSF method requires that the size frequency of deposits in a known or control area has been established and that the relation of the size-frequency distribution of deposits to probable controlling geologic factors has been determined. Using these relations, the principal scientist (PS) first estimates the number and range of size classes and then, for each size class, estimates the lower limit, most likely value, and upper limit of the numbers of deposits in the favorable area. Once these probable estimates have been refined
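
    A minimal worked sketch contrasting the standard NURE product with the deposit-size-frequency idea of summing tonnage over estimated deposit counts per size class, as described above; every number and unit below is invented for illustration only.

    # Standard NURE form: U = A x F x T x G (all values illustrative).
    A = 1200.0   # favorable area, km^2
    F = 0.15     # fraction of the favorable area underlain by mineralized rock
    T = 900.0    # tonnes of mineralized rock per km^2 of mineralized area
    G = 0.0008   # average grade, fraction U3O8
    U_nure = A * F * T * G   # tonnes U3O8 endowment

    # DSF form: replace F x T by the summed tonnage over deposit size classes.
    # Each class: (most likely number of deposits, median deposit tonnage of ore).
    size_classes = [(40, 5.0e4), (12, 4.0e5), (3, 2.5e6)]   # invented
    total_tonnage = sum(n * t for n, t in size_classes)      # tonnes of ore in all deposits
    U_dsf = total_tonnage * G                                # tonnes U3O8 endowment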

  7. Agro-designing: sustainability-driven, vision-oriented, problem preventing and knowledge-based methodology for improving farming systems sustainability

    OpenAIRE

    Znaor, Darko; Goewie, Eric

    1999-01-01

    ABSTRACT While classical research focuses on problem solving, design is a problem-prevention methodology, and is suitable for multi- and interdisciplinary research teams with a vision of how to improve agricultural sustainability. Since organic agriculture is based on a holistic approach and is also problem-prevention oriented in that it refrains from certain inputs and practices, design is an interesting methodology that could be applied more often in organic agriculture. ...

  8. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for quantification of epistemic uncertainty was selected on a large-scale problem, a combined-cycle power generation system. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge compared to deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is given. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and improved predicted system performance.

  9. Major methodological constraints to the assessment of environmental status based on the condition of benthic communities

    Science.gov (United States)

    Medeiros, João Paulo; Pinto, Vanessa; Sá, Erica; Silva, Gilda; Azeda, Carla; Pereira, Tadeu; Quintella, Bernardo; Raposo de Almeida, Pedro; Lino Costa, José; José Costa, Maria; Chainho, Paula

    2014-05-01

    The Marine Strategy Framework Directive (MSFD) was published in 2008 and requires Member States to take the necessary measures to achieve or maintain good environmental status in aquatic ecosystems by the year 2020. The MSFD indicates 11 qualitative descriptors for environmental status assessment, including seafloor integrity, using the condition of the benthic community as an assessment indicator. Member States will have to define monitoring programs for each of the MSFD descriptors based on those indicators in order to understand which areas are in a Good Environmental Status and what measures need to be implemented to improve the status of areas that fail to achieve that major objective. Coastal and offshore marine waters are not frequently monitored in Portugal and assessment tools have only been developed very recently with the implementation of the Water Framework Directive (WFD). The lack of historical data and knowledge on the constraints of benthic indicators in coastal areas requires the development of specific studies addressing this issue. The major objective of the current study was to develop and test an experimental design to assess impacts of offshore projects. The experimental design consisted of the seasonal and interannual assessment of benthic invertebrate communities in the area of future implementation of the structures (impact) and two potential control areas 2 km from the impact area. Seasonal benthic samples were collected at nine random locations within the impact and control areas in two consecutive years. Metrics included in the Portuguese benthic assessment tool (P-BAT) were calculated since this multimetric tool was proposed for the assessment of the ecological status in Portuguese coastal areas under the WFD. Results indicated a high taxonomic richness in this coastal area and no significant differences were found between impact and control areas, indicating the feasibility of establishing adequate control areas in marine

  10. Activity-based costing methodology as tool for costing in hematopathology laboratory

    Directory of Open Access Journals (Sweden)

    Gujral Sumeet

    2010-01-01

    Full Text Available Background: Cost analysis in laboratories represents a necessary phase in their scientific progression. Aim: To calculate the indirect cost and thus the total cost per sample of various tests at the Hematopathology laboratory (HPL). Settings and Design: The activity-based costing (ABC) method is used to calculate the cost per test of the hematopathology laboratory. Material and Methods: Information is collected from registers, purchase orders, annual maintenance contracts (AMCs), payrolls, account books, hospital bills and registers, along with informal interviews with hospital staff. Results: Cost per test decreases as the total number of samples increases. The maximum annual expense at the HPL is on reagents and consumables, followed by manpower. Cost per test is higher for specialized tests which interpret morphological or flow data and are done by a pathologist. Conclusions: Despite several limitations and assumptions, this was an attempt to understand how resources are consumed in a large government-run laboratory. The rate structure needs to be revised for most of the tests, mainly for complete blood counts (CBC), bone marrow examination, coagulation tests and immunophenotyping. This costing exercise is laboratory-specific and each laboratory needs to do its own costing. Such an exercise may help a laboratory redesign its costing structure or at least understand the economics involved in laboratory management.
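
    A minimal sketch of how an ABC allocation turns activity cost pools and driver volumes into a cost per test, and why interpreted tests carry a higher unit cost; the activities, cost pools and volumes are invented placeholders, not the HPL's data.

    # Illustrative ABC allocation for one laboratory:
    # activity -> (annual cost pool, annual driver volume, drivers consumed per test).
    activities = {
        "specimen reception":  (120_000.0, 60_000, 1),
        "analyser run":        (450_000.0, 55_000, 1),   # reagents, consumables, maintenance
        "pathologist review":  (300_000.0, 10_000, 1),   # only consumed by interpreted tests
    }

    def cost_per_test(activities, interpreted=True):
        """Sum of (driver rate x drivers consumed) over the activities a test uses."""
        total = 0.0
        for name, (pool, volume, per_test) in activities.items():
            if name == "pathologist review" and not interpreted:
                continue
            total += (pool / volume) * per_test
        return total

    routine_cost = cost_per_test(activities, interpreted=False)       # ~10 per test
    interpreted_cost = cost_per_test(activities, interpreted=True)    # ~40 per test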

  11. Non-invasive ultrasound based temperature measurements at reciprocating screw plastication units: Methodology and applications

    Science.gov (United States)

    Straka, Klaus; Praher, Bernhard; Steinbichler, Georg

    2015-05-01

    Previous attempts to accurately measure the real polymer melt temperature in the screw chamber as well as in the screw channels have failed on account of the challenging metrological boundary conditions (high pressure, high temperature, rotational and axial screw movement). We developed a novel ultrasound system - based on reflection measurements - for the online determination of these important process parameters. Using available pressure-volume-temperature (pvT) data for a polymer, it is possible to estimate the density and adiabatic compressibility of the material and therefore the pressure- and temperature-dependent longitudinal ultrasound velocity. From the measured ultrasonic reflection times from the screw root and barrel wall, together with the pressure, it is possible to calculate the mean temperature in the screw channel or in the chamber in front of the screw (in contrast to flush-mounted infrared or thermocouple probes). By means of the above-described system we are able to measure axial profiles of the mean temperature in the screw chamber. The data gathered by the measurement system can be used to develop control strategies for the plastication process to reduce temperature gradients within the screw chamber, or as input data for injection moulding simulation.
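
    A minimal sketch of the inversion step described above: a two-way reflection time over a known path gives a mean sound velocity, which is mapped to a mean melt temperature through a pvT-derived velocity(T, p) table; the geometry, table values and nearest-isobar lookup are placeholders, not the authors' calibration.

    import numpy as np

    def sound_speed_from_echo(path_length_m, round_trip_time_s):
        """Mean longitudinal velocity over a known acoustic path (two-way travel)."""
        return 2.0 * path_length_m / round_trip_time_s

    def melt_temperature(c_measured, pressure_pa, velocity_table):
        """Invert a pvT-derived velocity(T, p) table to a mean temperature.

        velocity_table maps pressure (Pa) to (temperatures_K, velocities_m_per_s)
        arrays, e.g. precomputed from density and adiabatic compressibility.
        """
        pressures = np.array(sorted(velocity_table))
        p_key = float(pressures[np.argmin(np.abs(pressures - pressure_pa))])  # nearest isobar
        temps, velocities = velocity_table[p_key]
        # Velocity falls with temperature in the melt, so reverse for np.interp.
        return np.interp(c_measured, velocities[::-1], temps[::-1])

    # Invented example: 4 mm channel depth, 7.3 microsecond round trip at 30 MPa.
    table = {30e6: (np.linspace(460.0, 540.0, 9), np.linspace(1180.0, 1020.0, 9))}
    c = sound_speed_from_echo(0.004, 7.3e-6)
    mean_melt_temperature = melt_temperature(c, 30e6, table)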

  12. A Novel Data Assimilation Methodology for Predicting Lithology Based on Sequence Labeling Algorithms

    Science.gov (United States)

    Park, E.; Jeong, J.; Han, W. S.; Kim, K. Y.

    2014-12-01

    A hidden Markov model (HMM) and a conditional random field (CRF) model for lithological prediction based on multiple geophysical well-logging data are derived to deal with directional non-stationarity through bi-directional training and conditioning. The developed models were benchmarked against their conventional counterparts using hypothetical boreholes with corresponding synthetic geophysical data including artificial errors. In the three test scenarios devised, the average fitness and unfitness values of the developed CRF model and HMM are 0.84 and 0.071, and 0.81 and 0.084, respectively, while those of the conventional CRF model and HMM are 0.78 and 0.091, and 0.77 and 0.099, respectively. Comparisons of their predictability show that the models designed for directional non-stationarity clearly perform better than the conventional models for all tested examples. Among them, the developed linear-chain CRF model showed the best or close to the best performance, with high predictability and a low training data requirement. Keywords: one-dimensional lithological characterization, sequence labeling algorithm, conditional random fields, hidden Markov model, borehole, geophysical well-logging data.
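
    A self-contained Viterbi decoder illustrating the sequence-labeling step for a lithology HMM over discretised log responses; the transition and emission probabilities are invented, and the bi-directional training and conditioning that distinguish the developed models are not reproduced here.

    import numpy as np

    def viterbi(obs, start_p, trans_p, emit_p):
        """Most probable lithology sequence for a discretised well-log sequence.

        obs: observation indices per depth sample; probabilities are kept in log
        space to avoid underflow on long boreholes.
        """
        n_states = len(start_p)
        logp = np.full((len(obs), n_states), -np.inf)
        back = np.zeros((len(obs), n_states), dtype=int)
        logp[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
        for t in range(1, len(obs)):
            for s in range(n_states):
                cand = logp[t - 1] + np.log(trans_p[:, s])
                back[t, s] = np.argmax(cand)
                logp[t, s] = cand[back[t, s]] + np.log(emit_p[s, obs[t]])
        path = [int(np.argmax(logp[-1]))]
        for t in range(len(obs) - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # States: 0 = sand, 1 = shale; observations: binned gamma-ray classes (invented).
    start = np.array([0.5, 0.5])
    trans = np.array([[0.9, 0.1], [0.1, 0.9]])           # lithologies persist with depth
    emit = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])  # P(GR bin | lithology)
    labels = viterbi([0, 0, 1, 2, 2, 2, 1, 0], start, trans, emit)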

  13. A Solution Methodology of Bi-Level Linear Programming Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    M. S. Osman

    2009-01-01

    Full Text Available Problem statement: We deal with the bi-level linear programming problem, which is formulated for situations in which two Decision-Makers (DMs) make decisions successively. Approach: In this research we studied and designed a Genetic Algorithm (GA) for Bi-Level Linear Programming Problems (BLPP) by constructing the fitness function of the upper-level programming problem based on the definition of the feasible degree. This GA avoids the use of a penalty function to deal with the constraints by changing the randomly generated initial population into an initial population satisfying the constraints, in order to improve the ability of the GA to handle the constraints. We also designed software to solve this problem and carried out a comparative study between the proposed method and previous methods through numerical results for some examples. Finally, parametric information of the GA was introduced. Results: The results of the study showed that the proposed method is feasible and more efficient for solving BLPP; a software package for solving the BLPP problem also exists. Conclusion: The proposed GA avoids the use of a penalty function to deal with the constraints by transforming the randomly generated initial population into one that satisfies the constraints, improving the ability of the GA to handle the constraints.

  14. THEORETICAL - METHODOLOGICAL BASES FOR THE FORMATION OF NATIONAL IDENTITY IN PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Irina Aleksandrovna Galkina

    2015-01-01

    Full Text Available The article highlights the modern approaches to organization of education of preschool children in multicultural environment, states some basic principles and conditions for the formation of national identity, socialization of children in modern multicultural society, where multicultural education is based on the attitude of children to their homeland, family and immediate environment, to cultures of neighboring nations, from the expression of interest and sympathy towards people of other nationalities to learning their traditions and customs, developing knowledge about them, friendly relationship and respect to them. The article presents scientific positions in the context of philosophical, ethnographic, sociological researches, which reconsider the understanding and interpretation of the concept of "multicultural education" in the context of such categories as "internationality", "nationality", "humaneness". It defines and explains the most effective means of the formation of spiritual and moral character of a growing person, such as literature and theatre. The article presents the project enhancing the formation of national identity among preschool children, reflecting the multicultural traditions of Irkutsk region by fiction writers of the Angara region and Siberia.

  15. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Erwin V. Zaretsky

    2003-01-01

    Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
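
    A minimal sketch of the two-parameter Weibull relations behind such an analysis: rescaling a component life from one survival probability to another for a given Weibull slope, and combining independent components (series system, equal slopes) into a system life. The numbers loosely echo the abstract but are placeholders, not the E3-Engine analysis itself.

    import math

    def life_at_reliability(l_ref, r_ref, r_target, beta):
        """Two-parameter Weibull: R(L) = exp(-(L/eta)**beta), so
        L_target = L_ref * (ln R_target / ln R_ref) ** (1/beta)."""
        return l_ref * (math.log(r_target) / math.log(r_ref)) ** (1.0 / beta)

    def series_system_life(component_lives, beta):
        """Life of a series system of independent components with a common Weibull
        slope, quoted at the same survival probability as component_lives:
        L_sys**(-beta) = sum(L_i**(-beta))."""
        return sum(l ** -beta for l in component_lives) ** (-1.0 / beta)

    # Invented numbers, slope beta = 6:
    blade_l5 = life_at_reliability(l_ref=9000.0, r_ref=0.999, r_target=0.95, beta=6.0)
    rotor_l01 = series_system_life([15_658.0, 36_000.0, 36_000.0], beta=6.0)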

  16. Looking for phase-space structures in star-forming regions: An MST-based methodology

    CERN Document Server

    Alfaro, Emilio J

    2015-01-01

    We present a method for analysing the phase space of star-forming regions. In particular, we search for clumpy structures in the 3D subspace formed by two position coordinates and radial velocity. The aim of the method is the detection of kinematically segregated radial velocity groups, that is, radial velocity intervals whose associated stars are spatially concentrated. To this end we define a kinematic segregation index, $\tilde{\Lambda}$(RV), based on the Minimum Spanning Tree (MST) graph algorithm, which is estimated for a set of radial velocity intervals in the region. When $\tilde{\Lambda}$(RV) is significantly greater than 1 we consider that the bin represents a grouping in the phase space. We split a star-forming region into radial velocity bins and calculate the kinematic segregation index for each bin, and then we obtain the spectrum of kinematic groupings, which enables a quick visualization of the kinematic behaviour of the region under study. We carried out numerical models of different config...
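
    A sketch of an MST-based segregation index in the spirit of $\tilde{\Lambda}$(RV): the mean MST edge length of randomly drawn star samples divided by that of the stars in a given radial-velocity bin, so that values well above 1 flag spatial concentration. It assumes SciPy, and the exact estimator in the paper may differ.

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree
    from scipy.spatial.distance import pdist, squareform

    def mst_mean_edge(xy):
        """Mean edge length of the Euclidean minimum spanning tree of 2D positions."""
        mst = minimum_spanning_tree(squareform(pdist(xy)))
        return mst.data.mean()

    def segregation_index(xy_all, in_bin, n_random=500, rng=None):
        """Lambda-like index for one radial-velocity bin: the mean MST edge length of
        random same-size star samples divided by that of the bin members. Values
        significantly above 1 suggest the bin's stars are spatially concentrated."""
        rng = rng or np.random.default_rng(1)
        members = xy_all[in_bin]
        random_means = [
            mst_mean_edge(xy_all[rng.choice(len(xy_all), size=len(members), replace=False)])
            for _ in range(n_random)
        ]
        return np.mean(random_means) / mst_mean_edge(members)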

  17. ELABORATION OF METHODOLOGICAL TOOLS FOR AGRICULTURAL RISK MANAGEMENT BASED ON INNOVATION

    Directory of Open Access Journals (Sweden)

    Voroshilova I. V.

    2015-03-01

    Full Text Available The article deals with the possibility of expanding the toolkit of agricultural risk management based on commodity financial instruments and weather derivatives. By summarizing the research results of domestic and foreign scholars and creatively interpreting them, the authors supplement and refine the definition of the categories of "risk" and "risk of agricultural production". The article supplements the classification of risks in agricultural production and the circulation of agricultural products, considers proven techniques and methods of agricultural risk management, discusses current trends in the global and domestic derivatives markets, gives a market segmentation by type of derivative instrument and the characteristics of the underlying assets, analyzes the reasons for the low level of development of derivatives markets at the meso level using the example of the Krasnodar Region, describes the potential of derivatives for managing agricultural risks on the basis of foreign sources, notes the insufficient level of financial literacy of potential participants, the lack of regulations and regulatory infrastructure, the problem of accounting and reporting of the results of operations in this segment, and the insufficient training of market operators, and reveals the possibility of expanding the tools of agricultural risk management

  18. Attentional bias modification based on visual probe task: methodological issues, results and clinical relevance

    Directory of Open Access Journals (Sweden)

    Fernanda Machado Lopes

    2015-12-01

    Full Text Available Introduction: Attentional bias, the tendency that a person has to direct or maintain attention to a specific class of stimuli, may play an important role in the etiology and persistence of mental disorders. Attentional bias modification has been studied as a form of additional treatment related to automatic processing. Objectives: This systematic literature review compared and discussed methods, evidence of success and potential clinical applications of studies about attentional bias modification (ABM) using a visual probe task. Methods: The Web of Knowledge, PubMed and PsycInfo were searched using the keywords attentional bias modification, attentional bias manipulation and attentional bias training. We selected empirical studies about ABM training using a visual probe task written in English and published between 2002 and 2014. Results: Fifty-seven studies met inclusion criteria. Most (78%) succeeded in training attention in the predicted direction, and in 71% the results were generalized to other measures correlated with the symptoms. Conclusions: ABM has potential clinical utility, but to standardize methods and maximize applicability, future studies should include clinical samples and be based on findings of studies about its effectiveness.

  19. The methodological features of managing the value of companies introducing "green" innovations

    OpenAIRE

    Kharin A. G.

    2012-01-01

    Although it is a common assumption that innovations are one of the most important factors in economic development, there is a need to review some provisions of innovation methodology so that new fundamental values are taken into account more fully. Most recent business models are based on the depletion of the natural environment, whose potential has been almost exhausted. It is necessary to introduce new ideas that are of use to society and create values for compani...

  20. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States); Wang, Yaqi [North Carolina State Univ., Raleigh, NC (United States)

    2013-12-20

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine- mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
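
    A minimal sketch of the observed-order-of-accuracy computation used in such mesh-refinement verification studies: p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine) between successive grids, compared against the theoretical order of the discretization; the error norms below are invented.

    import math

    def observed_orders(mesh_sizes, error_norms):
        """Observed order of accuracy between successive refinements:
        p = ln(e_coarse / e_fine) / ln(h_coarse / h_fine)."""
        return [
            math.log(error_norms[i] / error_norms[i + 1])
            / math.log(mesh_sizes[i] / mesh_sizes[i + 1])
            for i in range(len(mesh_sizes) - 1)
        ]

    # Invented integral error norms from a refinement study (h halved each step);
    # the observed orders should approach the theoretical order of the scheme.
    h = [0.4, 0.2, 0.1, 0.05]
    e = [3.2e-3, 8.3e-4, 2.1e-4, 5.3e-5]
    orders = observed_orders(h, e)   # roughly 2 for each successive pair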