WorldWideScience

Sample records for based depletion methodology

  1. Validation of a Monte Carlo Based Depletion Methodology Using HFIR Post-Irradiation Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David [ORNL]; Maldonado, G Ivan [ORNL]; Primm, Trent [ORNL]

    2009-11-01

    Post-irradiation uranium isotopic atomic densities within the core of the High Flux Isotope Reactor (HFIR) were calculated and compared to uranium mass spectrographic data measured in the late 1960s and early 70s [1]. This study was performed in order to validate a Monte Carlo based depletion methodology for calculating the burn-up dependent nuclide inventory, specifically the post-irradiation uranium

  2. Development of a practical fuel management system for PSBR based on advanced three-dimensional Monte Carlo coupled depletion methodology

    Science.gov (United States)

    Tippayakul, Chanatip

    The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: model and method development, and analysis and validation of the developed models and methods. The starting point of this research was the utilization of the earlier developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (end of the core loading). When the normalized power results of the Monte Carlo model were compared to those of the current fuel management system (using HELIOS/ADMARC-H), they agreed reasonably well (within 2-3% difference on average). Moreover, the reactivity of some fuel elements was calculated with the Monte Carlo model and compared with measured data; the fuel element reactivity results of the Monte Carlo model were in good agreement with the measurements. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in individual control rod worths between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities in order to validate the Monte Carlo model against the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capabilities, its accuracy could be further improved by adopting more advanced algorithms; an upgrade of TRIGSIM was therefore planned. The first task of the upgrade involved improving the temperature modeling capability. The new TRIGSIM was

  3. Validation of a Monte Carlo based depletion methodology via High Flux Isotope Reactor HEU post-irradiation examination measurements

    International Nuclear Information System (INIS)

    The purpose of this study is to validate a Monte Carlo based depletion methodology by comparing calculated post-irradiation uranium isotopic compositions in the fuel elements of the High Flux Isotope Reactor (HFIR) core to values measured using uranium mass-spectrographic analysis. Three fuel plates were analyzed: two from the outer fuel element (OFE) and one from the inner fuel element (IFE). Fuel plates O-111-8, O-350-I, and I-417-24 from outer fuel elements 5-O and 21-O and inner fuel element 49-I, respectively, were selected for examination. Fuel elements 5-O, 21-O, and 49-I were loaded into HFIR during cycles 4, 16, and 35, respectively (mid to late 1960s). Approximately one year after irradiation, each of these elements was transferred to the High Radiation Level Examination Laboratory (HRLEL), where samples from the fuel plates were sectioned and examined via uranium mass-spectrographic analysis. The isotopic composition of each sample was used to determine the atomic percent of the uranium isotopes. A Monte Carlo based depletion computer program, ALEPH, which couples the MCNP and ORIGEN codes, was utilized to calculate the nuclide inventory at the end-of-cycle (EOC). A current ALEPH/MCNP input for HFIR fuel cycle 400 was modified to replicate cycles 4, 16, and 35. The control element withdrawal curves and flux trap loadings were revised, as well as the radial zone boundaries and nuclide concentrations in the MCNP model. The calculated EOC uranium isotopic compositions for the analyzed plates were found to be in good agreement with measurements, which indicates that ALEPH/MCNP can accurately calculate burn-up dependent uranium isotopic concentrations for the HFIR core. The spatial power distribution in HFIR changes significantly as irradiation time increases due to control element movement. Accurate calculation of the end-of-life uranium isotopic inventory is a good indicator that the power distribution variation as a function of space and

  4. Global oil depletion: forecasts and methodologies

    OpenAIRE

    Boyle, Godfrey; Bentley, Roger

    2008-01-01

    A range of forecasts of global oil production made between 1956 and the present day is listed. For the majority of these, the methodology used to generate the forecast is described. The paper distinguishes between three types of forecast: Group 1: quantitative analyses that predict global oil production will reach a resource-limited peak in the near term, and certainly before the year 2020; Group 2: forecasts that use quantitative methods, but which see no production peak within the forec...

  5. VERA Core Simulator Methodology for PWR Cycle Depletion

    Energy Technology Data Exchange (ETDEWEB)

    Kochunas, Brendan [University of Michigan]; Collins, Benjamin S [ORNL]; Jabaay, Daniel [University of Michigan]; Kim, Kang Seog [ORNL]; Graham, Aaron [University of Michigan]; Stimpson, Shane [University of Michigan]; Wieselquist, William A [ORNL]; Clarno, Kevin T [ORNL]; Palmtag, Scott [Core Physics, Inc.]; Downar, Thomas [University of Michigan]; Gehin, Jess C [ORNL]

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  6. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

    Outsourcing is investigated, within the framework of institutional theory, as a way for a public corporation to obtain stable and unique competitive advantages by attracting carriers of the unique intellectual and social capital of specialized companies. Key researchers and events in the history of outsourcing are identified, and the existing approaches to defining the concept of outsourcing, together with the advantages and risks of applying outsourcing technology, are considered. It is established that the differences between outsourcing, subcontracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is treated as part of the methodology of cooperation among entrepreneurial innovative structures of the emerging knowledge-economy sector.

  7. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full core calculation of the node power distribution with a 1D Wigner-Seitz equivalent cell transport method for the independent depletion calculation of each node. This approach reduces the time required to obtain comparable results with the MOCUP code system by more than an order of magnitude. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  8. Portal implementation methodology based on EMRIS methodology

    OpenAIRE

    Modic, Sašo

    2011-01-01

    The bachelor's thesis explores the possibility of using a unified methodology for developing information systems for the implementation of the SharePoint system into a business environment. The idea for the thesis originates in the compulsory work practice regulated by the Faculty of Computer and Information Science in Ljubljana. During my practice I obtained basic knowledge about the implementation of SharePoint systems and the difficulties arising from this process. After consultation with my advisor, who...

  9. Siemens PWR burnup credit criticality analysis methodology: Depletion code and verification methods

    International Nuclear Information System (INIS)

    Application of burnup credit requires knowledge of the reactivity state of the irradiated fuel for which burnup credit is taken. The isotopic inventory of the irradiated fuel therefore has to be calculated by means of depletion codes. Siemens performs depletion calculations for PWR fuel burnup credit applications with the aid of the code package SAV. This code package is based on a first-principles approach, i.e., it avoids cycle- or reactor-specific fitting or adjustment parameters. This approach requires a general and comprehensive qualification of SAV by comparing experimental with calculated results. In the present paper, attention is focused mainly on the evaluation of chemical assay data obtained from different experimental programmes. (author)

  10. INPRO Methodology for Sustainability Assessment of Nuclear Energy Systems: Environmental Impact from Depletion of Resources

    International Nuclear Information System (INIS)

    INPRO is an international project to help ensure that nuclear energy is available to contribute in a sustainable manner to meeting the energy needs of the 21st century. A basic principle of INPRO in the area of environmental impact from depletion of resources is that a nuclear energy system will be capable of contributing to the energy needs in the 21st century while making efficient use of non-renewable resources needed for construction, operation and decommissioning. Recognizing that a national nuclear energy programme in a given country may be based both on indigenous resources and resources purchased from abroad, this publication provides background materials and summarizes the results of international global resource availability studies that could contribute to the corresponding national assessments

  11. Depletion methodology in the 3-D whole core transport code DeCART

    International Nuclear Information System (INIS)

    The three-dimensional whole-core transport code DeCART has been developed to provide the characteristics of a numerical reactor that can partly replace experiments. The code adopts a deterministic method to simulate neutron behavior with minimal assumptions and approximations. The neutronics code is also coupled with a thermal-hydraulic (CFD) code and a thermo-mechanical code to simulate combined effects. A depletion module has been implemented in DeCART to predict the depleted composition of the fuel. The matrix exponential method of ORIGEN-2 is used for the depletion calculation. The accompanying library, which includes decay constants, the yield matrix and other data, has been greatly simplified for calculational efficiency. This report summarizes the theoretical background and verifies the depletion module in DeCART through benchmark calculations
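
    The ORIGEN-2 matrix exponential method mentioned above solves the coupled depletion (Bateman) equations dN/dt = A N by evaluating N(t) = exp(At) N(0) over each burnup step. The sketch below illustrates that single step in Python; the three-nuclide chain, rate constants and step length are illustrative assumptions, not DeCART or ORIGEN-2 data.

```python
# Minimal sketch of a matrix-exponential depletion step (ORIGEN-style):
# dN/dt = A*N  =>  N(t) = expm(A*t) @ N(0).
# The chain and all rate constants are illustrative placeholders.
import numpy as np
from scipy.linalg import expm

lam = np.array([1.0e-9, 5.0e-8, 2.0e-7])   # total removal rates (1/s) for a 3-nuclide chain
A = np.array([
    [-lam[0],     0.0,     0.0],           # parent is only removed
    [ lam[0], -lam[1],     0.0],           # daughter is fed by parent removal
    [    0.0,  lam[1], -lam[2]],           # granddaughter is fed by daughter removal
])

N0 = np.array([1.0e24, 0.0, 0.0])          # beginning-of-step number densities (1/cm^3)
dt = 30.0 * 24.0 * 3600.0                  # one 30-day depletion step (s)

N_eos = expm(A * dt) @ N0                  # end-of-step number densities
print(N_eos)
```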

  12. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations drawn from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed

  13. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Almeida, João Paolo A.; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  14. Adsorption of dyestuff from aqueous solutions through oxalic acid-modified swede rape straw: adsorption process and disposal methodology of depleted bioadsorbents.

    Science.gov (United States)

    Feng, Yanfang; Dionysiou, Dionysios D; Wu, Yonghong; Zhou, Hui; Xue, Lihong; He, Shiying; Yang, Linzhang

    2013-06-01

    Swede rape straw (Brassica napus L.) was modified by oxalic acid under mild conditions, producing an efficient dye adsorbent (SRSOA). This low-cost and environmentally friendly bioadsorbent was characterized by various techniques and then applied to purify dye-contaminated aqueous solutions. An equilibrium study showed that the Langmuir model gave the best fit to the equilibrium data, and the methylene blue (MB) adsorption capacity calculated by this model was 432 mg g(-1). The adsorption process and mechanism are also discussed. To properly deal with the dye-loaded bioadsorbents, a disposal methodology is discussed and a biochar based on the depleted bioadsorbents was produced and examined for the first time. This approach both solves the disposal problem of contaminant-loaded bioadsorbents and produces a useful adsorbent thereafter. The study indicates that SRSOA is a promising substitute for ACs in purifying dye-contaminated wastewater and that producing biochars from contaminant-loaded bioadsorbents may be a feasible disposal method. PMID:23612179
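
    The Langmuir fit referred to above, q = qmax*K*C/(1 + K*C), can be reproduced with a few lines of curve fitting. The sketch below is a generic illustration; the data points and starting guesses are invented placeholders, not the paper's measurements.

```python
# Hedged sketch: fitting the Langmuir isotherm q = qmax*K*C/(1 + K*C) to
# equilibrium adsorption data. The data points below are invented placeholders.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, qmax, K):
    """Equilibrium uptake q (mg/g) at liquid-phase concentration C (mg/L)."""
    return qmax * K * C / (1.0 + K * C)

C_eq = np.array([5.0, 20.0, 50.0, 100.0, 200.0, 400.0])     # mg/L (placeholder)
q_eq = np.array([90.0, 240.0, 340.0, 390.0, 415.0, 428.0])  # mg/g (placeholder)

(qmax, K), _ = curve_fit(langmuir, C_eq, q_eq, p0=(400.0, 0.05))
print(f"qmax = {qmax:.0f} mg/g, K = {K:.3f} L/mg")
```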

  15. Depleting methyl bromide residues in soil by reaction with bases

    Science.gov (United States)

    Despite generally being considered the most effective soil fumigant, methyl bromide (MeBr) use is being phased out because its emissions from soil can lead to stratospheric ozone depletion. However, a large amount is still currently used due to Critical Use Exemptions. As strategies for reducing the...

  16. Evaluation of acute tryptophan depletion and sham depletion with a gelatin-based collagen peptide protein mixture

    DEFF Research Database (Denmark)

    Stenbæk, D S; Einarsdottir, H S; Goregliad-Fjaellingsdal, T;

    2016-01-01

    Acute Tryptophan Depletion (ATD) is a dietary method used to modulate central 5-HT to study the effects of temporarily reduced 5-HT synthesis. The aim of this study is to evaluate a novel method of ATD using a gelatin-based collagen peptide (CP) mixture. We administered CP-Trp or CP+Trp mixtures ...... effects of CP-Trp compared to CP+Trp were observed. The transient increase in plasma Trp after CP+Trp may impair comparison to the CP-Trp and we therefore recommend in future studies to use a smaller dose of Trp supplement to the CP mixture....

  17. DKBLM—Deep Knowledge Based Learning Methodology

    Institute of Scientific and Technical Information of China (English)

    马志方

    1993-01-01

    To solve the Imperfect Theory Problem (ITP) faced by Explanation Based Generalization (EBG), this paper proposes a methodology named Deep Knowledge Based Learning Methodology (DKBLM) and gives an implementation of DKBLM called the Hierarchically Distributed Learning System (HDLS). As an example of HDLS's application, the paper describes a learning system (MLS) in the meteorology domain and its operation on a simplified example. DKBLM can acquire experiential knowledge with causality in it. It is applicable to domains in which experiments are relatively difficult to carry out and in which many knowledge systems exist at different levels for the same domain (such as weather forecasting).

  18. Satellite-based estimates of groundwater depletion in India

    OpenAIRE

    Rodell, M.; Velicogna, I; Famiglietti, JS

    2009-01-01

    Groundwater is a primary source of fresh water in many parts of the world. Some regions are becoming overly dependent on it, consuming groundwater faster than it is naturally replenished and causing water tables to decline unremittingly. Indirect evidence suggests that this is the case in northwest India, but there has been no regional assessment of the rate of groundwater depletion. Here we use terrestrial water storage-change observations from the NASA Gravity Recovery and Climate Experimen...

  19. Performance-based asphalt mixture design methodology

    Science.gov (United States)

    Ali, Al-Hosain Mansour

    performance based design procedure. Finally, the developed guidelines with easy-to-use flow charts for the integrated mix design methodology are presented.

  20. Project management methodology based on the PMI

    OpenAIRE

    Lukyanchenko, V. P.; Kizeev, Veniamin Mikhailovich; Nikolaenko, Nina Aleksandrovna; Лукьянченко, В. П.; Кизеев, Вениамин Михайлович; Николаенко, Нина Александровна

    2015-01-01

    Project management is a tool for improving business efficiency and achieving maximum impact, which is why every organization should choose the project management methodology that best suits the specifics of its business. This article describes the Project Management Institute and investigates the PMI methodology. The paper defines the main objectives of this kind of methodology and considers its structure, revealing the effectiveness of the methodology in PMI project management. Consider the fi...

  1. Case Based Reasoning: Case Representation Methodologies

    Directory of Open Access Journals (Sweden)

    Shaker H. El-Sappagh

    2015-11-01

    Case Based Reasoning (CBR) is an important technique in artificial intelligence, which has been applied to various kinds of problems in a wide range of domains. Selecting a case representation formalism is critical for the proper operation of the overall CBR system. In this paper, we survey and evaluate the existing case representation methodologies. Moreover, case retrieval and future challenges for effective CBR are explained. Case representation methods are grouped into knowledge-intensive approaches and traditional approaches, with the first group outweighing the second. The knowledge-intensive methods depend on ontologies and enhance all CBR processes, including case representation, retrieval, storage, and adaptation. Using a proposed set of qualitative metrics, the existing ontology-based methods for case representation are studied and evaluated in detail. All these systems have limitations; no approach satisfies more than 53% of the specified metrics. The results of the survey explain the current limitations of CBR systems and show that ontology usage in case representation needs improvement to achieve semantic representation and semantic retrieval in CBR systems.

  2. Methodologies for Crawler Based Web Surveys.

    Science.gov (United States)

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  3. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    Science.gov (United States)

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little is known about the extent to which ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that initial acts of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of the ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggest that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component. PMID:23432682

  4. A spreading resistance based method of mapping the resistivity and potential of a depleted diode

    International Nuclear Information System (INIS)

    The characterization of the depletion states of reverse-operated p-n junctions is an important task within the scope of high energy physics detector development. The configuration of the sensitive volume inside these structures determines the particle detection process. Therefore, a spreading resistance profiling based method has been developed to map the local resistivity and potential along the prepared edge of a depleted diode. This "edge-SRP" method is capable of detecting the border of the space charge region and its transition to the electrically neutral bulk. In order to characterize the depleted space charge region, the surface potential along the edge could be measured by slightly modifying the setup. These surface potential results complement the spreading resistance results. In this paper the functionality of the developed method is verified by performing measurements on a prepared diode, which has been biased with different voltages

  5. A Systematic Methodology for Design of Emulsion Based Chemical Products

    DEFF Research Database (Denmark)

    Mattei, Michele; Kontogeorgis, Georgios; Gani, Rafiqul

    2012-01-01

    A systematic methodology for emulsion based chemical product design is presented. The methodology employs a model-based product synthesis/design stage and a modelexperiment based further refinement and/or validation stage. In this paper only the first stage is presented. The methodology employs a...... employed to obtain a list of formulations that satisfy constraints representing the desired needs (target properties). Through a conceptual case study dealing with the design of a sunscreen lotion, the application of this new methodology is illustrated....

  6. A Systematic Methodology for Design of Emulsion Based Chemical Products

    DEFF Research Database (Denmark)

    Mattei, Michele; Kontogeorgis, Georgios; Gani, Rafiqul

    A systematic methodology for emulsion based chemical product design is presented. The methodology employs a model-based product synthesis/design stage and a modelexperiment based further refinement and/or validation stage. In this paper only the first stage is presented. The methodology employs a...... employed to obtain a list of formulations that satisfy constraints representing the desired needs (target properties). Through a conceptual case study dealing with the design of a sunscreen lotion, the application of this new methodology is illustrated....

  7. EXPERIMENTAL ACIDIFICATION CAUSES SOIL BASE-CATION DEPLETION AT THE BEAR BROOK WATERSHED IN MAINE

    Science.gov (United States)

    There is concern that changes in atmospheric deposition, climate, or land use have altered the biogeochemistry of forests causing soil base-cation depletion, particularly Ca. The Bear Brook Watershed in Maine (BBWM) is a paired watershed experiment with one watershed subjected to...

  8. Measurement-based experimental research methodology

    OpenAIRE

    Papadimitriou D.; Fabrega L.; Vila P.; Careglio D.; Demeester P.

    2013-01-01

    Aiming at creating a dynamic between elaboration, realization, and validation by means of iterative cycles of experimentation, Future Internet Research and Experimentation (FIRE) projects have rapidly been confronted with the lack of a systematic experimental research methodology. Moreover, the “validation by experimentation” objective involves a broad spectrum of experimentation tools ranging from simulation to field trial prototypes together with their associated measurement tools. As experimen...

  9. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  10. A Risk-Based Sensor Placement Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ronald W [ORNL]; Kulesz, James J [ORNL]

    2006-08-01

    A sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs with the percentage of occurrence of the pairs over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm where threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats.
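
    The stagewise logic described above, placing one sensor at a time and removing the threats it covers before scoring the next stage, can be sketched with a simple greedy loop. The actual methodology uses dynamic programming driven by dispersion modelling and historical weather frequencies, so everything below (risk weights, detection matrix, budget) is an invented placeholder.

```python
# Hedged sketch of stagewise sensor placement: at each stage pick the candidate
# location covering the most remaining risk, record its marginal utility, and
# remove the threats it detects from further consideration. Toy data only.
import numpy as np

rng = np.random.default_rng(1)
n_threats, n_sites = 200, 30
risk = rng.random(n_threats)                       # population-at-risk weight per threat
detects = rng.random((n_sites, n_threats)) < 0.15  # detects[s, t]: site s sees threat t

remaining = np.ones(n_threats, dtype=bool)
placed, marginal = [], []
for stage in range(5):                             # budget of 5 sensors
    gains = (detects & remaining) @ risk           # remaining risk each site would cover
    best = int(np.argmax(gains))
    placed.append(best)
    marginal.append(float(gains[best]))            # marginal utility of this sensor
    remaining &= ~detects[best]                    # drop threats now covered

print(placed, [round(m, 2) for m in marginal])
```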

  11. Assessment of the depletion capability in MPACT

    International Nuclear Information System (INIS)

    The objective of this paper is to develop and demonstrate the depletion capability with pin-resolved transport using the MPACT code. The first section of the paper provides a description of the depletion methodology and the algorithm used to solve the depletion equations in MPACT. A separate depletion library for MPACT, based on the ORIGEN-S library, is used to provide the basic decay constants and fission yields, as well as the 3-group cross sections used for isotopes not contained in the MPACT multi-group library. The cross sections for the depletion transmutation matrix were collapsed using the transport flux solution in MPACT, based on either the 47-group HELIOS library (ENDF-VI) or a 56-group ORNL library (ENDF-VII). The second section of this paper then describes the numerical verification of the depletion algorithm using two sets of benchmarks. The first is the JAERI LWR lattice benchmark, which had participants from most of the lattice depletion codes currently used in the international nuclear community, and the second benchmark is based on data from spent fuel of the Takahama-3 reactor. The results show that MPACT is generally in good agreement with the results of the other benchmark participants as well as the experimental data. Finally, a full-core 2D model of the CASL AMA benchmark, based on the central plane of the Watts Bar reactor core, was depleted, demonstrating the whole-core depletion capability of MPACT. (author)
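
    The cross-section collapse mentioned above, building depletion constants from the 47- or 56-group transport flux, is a flux-weighted average over fine groups. Below is a generic sketch of that operation; the group structure and numerical values are placeholders, not MPACT data.

```python
# Hedged sketch of flux-weighted group collapse:
# sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g) over fine groups g in coarse group G.
import numpy as np

def collapse(sigma_fine, flux_fine, group_map, n_coarse):
    """Collapse fine-group cross sections to coarse groups using the flux as weight."""
    sigma_coarse = np.zeros(n_coarse)
    for G in range(n_coarse):
        sel = (group_map == G)
        sigma_coarse[G] = (sigma_fine[sel] * flux_fine[sel]).sum() / flux_fine[sel].sum()
    return sigma_coarse

sigma_47 = np.linspace(2.0, 600.0, 47)            # placeholder fine-group capture XS (barns)
phi_47 = np.linspace(1.0, 0.01, 47)               # placeholder fine-group flux spectrum
group_map = np.array([0]*20 + [1]*15 + [2]*12)    # 47 fine groups -> 3 coarse groups

print(collapse(sigma_47, phi_47, group_map, 3))
```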

  12. School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.

    Science.gov (United States)

    Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others

    1998-01-01

    Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…

  13. Design Intelligent Model base Online Tuning Methodology for Nonlinear System

    Directory of Open Access Journals (Sweden)

    Ali Roshanzamir

    2014-04-01

    In systems with varying dynamic parameters that require on-line training, adaptive control methodology is used. In this paper, a fuzzy model-based adaptive methodology is used to tune a linear Proportional-Integral-Derivative (PID) controller. The main objectives in any such system are stability, robustness, and reliability. Although the PID controller is used in many applications, it faces many challenges in the control of continuum robots. To solve these problems, a nonlinear adaptive methodology based on model-based fuzzy logic is used. This research aims to reduce or eliminate the problems of the PID controller, based on model-reference fuzzy logic theory, in the control of a flexible robot manipulator system, with the quality of process control tested in the MATLAB/SIMULINK simulation environment.

  14. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  15. Regression-Based Artificial Neural Network Methodology in Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    何桢; 肖粤翔

    2004-01-01

    Response surface methodology (RSM) is an important tool for process parameter optimization, robust design and other quality improvement efforts. When the relationship between influential input variables and the output response is very complex, it is hard to find the real response surface using RSM. In recent years artificial neural networks (ANN) have been used in RSM. But the classical ANN does not work well under the constraints of real applications. An algorithm of regression-based ANN (R-ANN) is proposed in this paper, which is a supplement to the classical ANN methodology. It brings the network closer to the response surface, so that training time is reduced and robustness is strengthened. The procedure for improving the ANN by regressions is described and comparisons among R-ANN, RSM and the classical ANN are shown graphically in three examples. Our research shows that the R-ANN methodology is a good supplement to the RSM and classical ANN methodologies, which can yield a lower standard error of prediction under conditions where the scope of the experiment is rigidly restricted.

  16. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  17. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  18. Design Methodology for Self-organized Mobile Networks Based

    Directory of Open Access Journals (Sweden)

    John Petearson Anzola

    2016-06-01

    The methodology proposed in this article enables the systematic design of routing algorithms based on biclustering schemes, which makes it possible to combine timely techniques, clustering heuristics proposed by a researcher, and a routing-focused approach to the choice of clusterhead nodes. The process uses heuristics aimed at improving the different communication costs within surface groups called biclusters. The methodology encompasses a variety of clustering techniques and heuristics that have been addressed in routing algorithms, although not all possible alternatives and their assessments have been explored. Therefore, design research on routing algorithms based on biclustering schemes will allow new concepts of evolutionary routing, along with the ability to adapt to the topological changes that occur in self-organized data networks.

  19. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    This paper presents a knowledge based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection Report No. 37 is employed within a knowledge based framework for the purpose of planning maintenance activities while optimizing radiation protection [1,2]. The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs
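
    The ICRP cost-benefit balance underlying such an optimization amounts to choosing the protection option that minimizes the sum of the protection cost and the monetized residual collective dose, X + alpha*S. A toy sketch follows; the HVAC options, costs, doses and alpha value are invented placeholders, not values from the paper.

```python
# Hedged sketch of an ICRP-style cost-benefit choice: pick the option
# minimizing X + alpha*S, where X is the protection cost and S the residual
# collective dose. All numbers below are invented placeholders.
options = {                      # HVAC filtration options: (cost $, residual person-Sv)
    "no extra filtration": (0.0, 0.80),
    "single HEPA stage":   (40_000.0, 0.25),
    "double HEPA stage":   (90_000.0, 0.15),
}
alpha = 100_000.0                # assumed monetary value of unit collective dose ($/person-Sv)

best = min(options, key=lambda k: options[k][0] + alpha * options[k][1])
print(best)                      # option with the lowest total cost
```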

  20. The Impact of an Ego Depletion Manipulation on Performance-Based and Self-Report Assessment Measures.

    Science.gov (United States)

    Charek, Daniel B; Meyer, Gregory J; Mihura, Joni L

    2016-10-01

    We investigated the impact of ego depletion on selected Rorschach cognitive processing variables and self-reported affect states. Research indicates acts of effortful self-regulation transiently deplete a finite pool of cognitive resources, impairing performance on subsequent tasks requiring self-regulation. We predicted that relative to controls, ego-depleted participants' Rorschach protocols would have more spontaneous reactivity to color, less cognitive sophistication, and more frequent logical lapses in visualization, whereas self-reports would reflect greater fatigue and less attentiveness. The hypotheses were partially supported; despite a surprising absence of self-reported differences, ego-depleted participants had Rorschach protocols with lower scores on two variables indicative of sophisticated combinatory thinking, as well as higher levels of color receptivity; they also had lower scores on a composite variable computed across all hypothesized markers of complexity. In addition, self-reported achievement striving moderated the effect of the experimental manipulation on color receptivity, and in the Depletion condition it was associated with greater attentiveness to the tasks, more color reactivity, and less global synthetic processing. Results are discussed with an emphasis on the response process, methodological limitations and strengths, implications for calculating refined Rorschach scores, and the value of using multiple methods in research and experimental paradigms to validate assessment measures. PMID:26002059

  1. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is...... able to overcome most of the difficulties associated with the solution of mixture design problems. The new methodology has been illustrated with the help of a case study involving the design of solvent-anti solvent binary mixtures for crystallization of Ibuprofen....

  2. Methodological Innovation in Practice-Based Design Doctorates

    Directory of Open Access Journals (Sweden)

    Joyce S. R. Yee

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD students in design. Four characteristics were found in design PhD methodology: innovations in the format and structure of the thesis, a pick-and-mix approach to research design, situating practice in the inquiry, and the validation of visual analysis. The article concludes by offering suggestions on how research training can be improved. By being aware of recent methodological innovations in the field, design educators will be better informed when developing resources for future design doctoral candidates and assisting supervision teams in developing a more informed and flexible approach to practice-based research.

  3. Online monitoring of immunoaffinity-based depletion of high-abundance blood proteins by UV spectrophotometry using enhanced green fluorescence protein and FITC-labeled human serum albumin

    OpenAIRE

    Yu Hyeong; Kim Byungwook; Kim Hyunsoo; Min Hophil; Yu Jiyoung; Kim Kyunggon; Kim Youngsoo

    2010-01-01

    Background: The removal of high-abundance proteins from plasma is an efficient approach to investigating flow-through proteins for biomarker discovery studies. Most depletion methods are based on multiple immunoaffinity methods available commercially, including LC columns and spin columns. Despite its usefulness, high-abundance depletion has an intrinsic problem, the sponge effect, which should be assessed during depletion experiments. Concurrently, the yield of depletion of high-abund...

  4. Measurement-based research: methodology, experiments, and tools

    OpenAIRE

    Papadimitriou, Dimitri; Fàbrega, Lluís; Vila Fumas, Pere; Careglio, Davide; Demeester, Piet

    2012-01-01

    In this paper, we report the results of the workshop organized by the FP7 EULER project on measurement-based research and associated methodology, experiments and tools. This workshop aimed at gathering all Future Internet Research and Experimentation (FIRE) experimental research projects under this thematic. Participants were invited to present the usage of measurement techniques in their experiments, their developments on measurement tools, and their foreseeable needs with respect to new dom...

  5. Valence-dependent influence of serotonin depletion on model-based choice strategy.

    Science.gov (United States)

    Worbe, Y; Palminteri, S; Savulich, G; Daw, N D; Fernandez-Egea, E; Robbins, T W; Voon, V

    2016-05-01

    Human decision-making arises from both reflective and reflexive mechanisms, which underpin goal-directed and habitual behavioural control. Computationally, these two systems of behavioural control have been described by different learning algorithms, model-based and model-free learning, respectively. Here, we investigated the effect of diminished serotonin (5-hydroxytryptamine) neurotransmission using dietary tryptophan depletion (TD) in healthy volunteers on the performance of a two-stage decision-making task, which allows discrimination between model-free and model-based behavioural strategies. A novel version of the task was used, which not only examined choice balance for monetary reward but also for punishment (monetary loss). TD impaired goal-directed (model-based) behaviour in the reward condition, but promoted it under punishment. This effect on appetitive and aversive goal-directed behaviour is likely mediated by alteration of the average reward representation produced by TD, which is consistent with previous studies. Overall, the major implication of this study is that serotonin differentially affects goal-directed learning as a function of affective valence. These findings are relevant for a further understanding of psychiatric disorders associated with breakdown of goal-directed behavioural control such as obsessive-compulsive disorders or addictions. PMID:25869808

  6. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    Directory of Open Access Journals (Sweden)

    Boris Alengordovich Korobitsyn

    2015-09-01

    A key indicator of the System of National Accounts of Russia at a regional scale is Gross Regional Product (GRP), characterizing the value of goods and services produced in all sectors of the economy in a country and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the main weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potential social costs such as poorer health due to exposure to occupational hazards. Several types of alternative approaches to measuring socio-economic progress are considered for six administrative units of the Ural Federal District for the period 2006–2014. The proposed alternatives to GRP as a measure of social progress focus on natural resource depletion, environmental externalities and some human development aspects. The most promising is the use of corrected macroeconomic indicators similar to the “genuine savings” compiled by the World Bank. Genuine savings are defined in this paper as net savings (gross savings minus consumption of fixed capital) minus the consumption of natural non-renewable resources and the monetary valuation of damages resulting from air pollution, water pollution and waste disposal. Two main groups of non-renewable resources are considered: energy resources (uranium ore, oil and natural gas) and mineral resources (iron ore, copper, and aluminum). In spite of various shortcomings, this indicator represents a considerable improvement over GRP information. For example, while GRP demonstrates steady growth between 2006 and 2014 for the main Russian oil- and gas-producing regions — the Hanty-Mansi and Yamalo-Nenets Autonomous Okrugs — genuine savings for these regions decreased over the whole period. This means that their resource-based economy could not be considered as being on a sustainable path even in the framework of
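
    The genuine-savings correction described above is a straightforward arithmetic adjustment to net savings. The sketch below encodes that definition; the regional figures are invented placeholders, not Ural Federal District data.

```python
# Hedged sketch of the "genuine savings" definition used in the paper:
# net savings (gross savings minus consumption of fixed capital), minus
# non-renewable resource depletion, minus monetized pollution damages.
def genuine_savings(gross_savings, fixed_capital_consumption,
                    energy_depletion, mineral_depletion,
                    air_damage, water_damage, waste_damage):
    net_savings = gross_savings - fixed_capital_consumption
    resource_depletion = energy_depletion + mineral_depletion
    pollution_damage = air_damage + water_damage + waste_damage
    return net_savings - resource_depletion - pollution_damage

# Illustrative regional figures (invented placeholders, arbitrary currency units):
print(genuine_savings(950, 350, 420, 60, 35, 20, 15))   # -> 50
```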

  7. ONTOLOGY BASED DATA MINING METHODOLOGY FOR DISCRIMINATION PREVENTION

    Directory of Open Access Journals (Sweden)

    Nandana Nagabhushana

    2014-09-01

    Data mining is being increasingly used in the field of automating decision making processes, which involve the extraction and discovery of information hidden in large volumes of collected data. Nonetheless, negative perceptions such as privacy invasion and potential discrimination act as hindrances to the use of data mining methodologies in software systems employing automated decision making. Loan granting, employment, insurance premium calculation, admissions in educational institutions, etc., can make use of data mining to effectively prevent human biases pertaining to certain attributes such as gender, nationality or race in critical decision making. The proposed methodology prevents discriminatory rules that ensue from the presence of certain information regarding sensitive discriminatory attributes in the data itself. Two novel aspects of the proposal are, first, the rule mining technique based on ontologies and, second, the generalization and transformation of mined rules identified as discriminatory into non-discriminatory ones.

  8. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at the identification of sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it provides a unified model for revealing knowledge-sharing patterns and for comparing results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSD) provide documents and tools at their websites. Rather, this proposal provides a guide for modelling the inferences gathered from data processing, revealing links between sources and recipients of the knowledge being transferred, where the recipient identifies the source as the main origin of new knowledge creation. Some national statistics departments set as the objective of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments, and from this characterization scholars conduct different studies. Measures of the network composed of manufacturing firms and other organizations form the basis for investigating the structure that emerges when firms take ideas from other organizations to initiate innovations. These two sets of actors form a two-mode network: a link connects the organization, or event organized by an organization, that “provides” an idea with the firm that receives it. The resulting design satisfies the objective of being a methodological model for identifying the sources of knowledge that is effectively used in innovation.
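
    A two-mode (firm to knowledge-source) network of the kind described above can be represented and summarized with standard network tooling. The sketch below uses networkx with invented example data; the node names and the ranking criterion are illustrative only, not the paper's dataset or metrics.

```python
# Hedged sketch of a two-mode knowledge-transfer network: build the bipartite
# graph of firms and the sources they name, then rank sources by how many
# recipient firms cite them. All data are invented placeholders.
import networkx as nx

edges = [("FirmA", "University1"), ("FirmB", "University1"),
         ("FirmB", "Supplier2"), ("FirmC", "TradeFair3"),
         ("FirmD", "University1"), ("FirmD", "TradeFair3")]

B = nx.Graph()
firms = {f for f, _ in edges}
sources = {s for _, s in edges}
B.add_nodes_from(firms, bipartite=0)      # mode 1: recipient firms
B.add_nodes_from(sources, bipartite=1)    # mode 2: knowledge sources
B.add_edges_from(edges)

# Degree of each source = number of firms naming it as an idea provider
ranking = sorted(((B.degree(s), s) for s in sources), reverse=True)
print(ranking)
```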

  9. A new evaluation methodology for literature-based discovery systems.

    Science.gov (United States)

    Yetisgen-Yildiz, Meliha; Pratt, Wanda

    2009-08-01

    While medical researchers formulate new hypotheses to test, they need to identify connections to their work from other parts of the medical literature. However, the current volume of information has become a great barrier for this task. Recently, many literature-based discovery (LBD) systems have been developed to help researchers identify new knowledge that bridges gaps across distinct sections of the medical literature. Each LBD system uses different methods for mining the connections from text and ranking the identified connections, but none of the currently available LBD evaluation approaches can be used to compare the effectiveness of these methods. In this paper, we present an evaluation methodology for LBD systems that allows comparisons across different systems. We demonstrate the abilities of our evaluation methodology by using it to compare the performance of different correlation-mining and ranking approaches used by existing LBD systems. This evaluation methodology should help other researchers compare approaches, make informed algorithm choices, and ultimately help to improve the performance of LBD systems overall. PMID:19124086

  10. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
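
    The shift from summed worst-case task times to a timeline distribution can be illustrated with a plain Monte Carlo over uncertain task times. The published methodology uses Bayesian updating with expert judgment and small-sample data, so the distributions and parameters below are invented placeholders, not Sandia results.

```python
# Hedged sketch: sample each barrier's task time from a distribution, sum along
# the adversary path, and summarize the resulting delay-time distribution.
import numpy as np

rng = np.random.default_rng(42)
# (median seconds, lognormal dispersion) for each sequential task on the path
tasks = [(30.0, 0.3), (120.0, 0.5), (45.0, 0.4), (300.0, 0.6)]

samples = sum(rng.lognormal(np.log(med), sigma, size=100_000)
              for med, sigma in tasks)

print(np.percentile(samples, [5, 50, 95]))   # 5th/50th/95th percentile total delay (s)
```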

  11. A PROPOSED METHODOLOGY FOR STRAIN BASED FAILURE CRITERIA

    International Nuclear Information System (INIS)

    This paper proposes an alternative methodology for determining the failure criteria used in dynamic simulations of radioactive material shipping packages under hypothetical accident conditions. The current stress-based failure criteria defined in Nuclear Regulatory Guide 7.6 [1] and the ASME Code, Section III, Appendix F [2] for Level D Service Loads are based on the ultimate strength of a uniaxial tensile test specimen rather than on material fracture under multi-axial stress states. The proposed strain-based failure criteria, on the other hand, are directly related to the material failure mechanisms under multi-axial stresses. In addition, unlike the stress-based criteria, the strain-based failure criteria are applicable to the evaluation of cumulative damage caused by the sequential loads of the hypothetical accident events, as required by Nuclear Regulatory Guide 7.8 [4]

  12. DNA vector-based RNAi approach for stable depletion of poly(ADP-ribose) polymerase-1

    International Nuclear Information System (INIS)

    RNA-mediated interference (RNAi) is a powerful technique that is now being used in mammalian cells to specifically silence a gene. Some recent studies have used this technique to achieve varying extents of depletion of the nuclear enzyme poly(ADP-ribose) polymerase-1 (PARP-1). These studies reported either transient silencing of PARP-1 using double-stranded RNA or stable silencing of PARP-1 with a DNA vector introduced by a viral delivery system. In contrast, here we report that a simple RNAi approach, which utilizes a pBS-U6-based DNA vector containing a strategically selected PARP-1-targeting sequence introduced into cells by a conventional CaPO4 protocol, can be used to achieve stable and specific silencing of PARP-1 in different types of cells. We also provide a detailed strategy for the selection and cloning of PARP-1-targeting sequences for the DNA vector, and demonstrate that this technique does not affect expression of its closest functional homolog, PARP-2

  13. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.
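
    The SAX step that the book's pipeline builds on (z-normalization, piecewise aggregate approximation, then symbol assignment via equiprobable Gaussian breakpoints) can be sketched briefly. The window length, alphabet and toy price series below are illustrative assumptions, and the GA kernel that evolves trading rules over the resulting strings is not shown.

```python
# Hedged sketch of SAX: z-normalise, reduce with piecewise aggregate
# approximation (PAA), then map segment means to symbols using equiprobable
# Gaussian breakpoints. All parameters and data below are illustrative.
import numpy as np
from scipy.stats import norm

def sax(series, n_segments=8, alphabet="abcd"):
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                     # z-normalisation
    paa = x.reshape(n_segments, -1).mean(axis=1)     # PAA (length must divide evenly)
    # breakpoints splitting N(0,1) into len(alphabet) equiprobable regions
    cuts = norm.ppf(np.linspace(0.0, 1.0, len(alphabet) + 1)[1:-1])
    return "".join(alphabet[int(np.searchsorted(cuts, v))] for v in paa)

prices = 100.0 + np.cumsum(np.random.default_rng(0).normal(size=64))
print(sax(prices))                                   # an 8-symbol SAX word
```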

  14. Improvements in EBR-2 core depletion calculations

    International Nuclear Information System (INIS)

    The need for accurate core depletion calculations in Experimental Breeder Reactor No. 2 (EBR-2) is discussed. Because of the unique physics characteristics of EBR-2, it is difficult to obtain accurate and computationally efficient multigroup flux predictions. This paper describes the effect of various conventional and higher order schemes for group constant generation and for flux computations; results indicate that higher-order methods are required, particularly in the outer regions (i.e. the radial blanket). A methodology based on Nodal Equivalence Theory (N.E.T.) is developed which allows retention of the accuracy of a higher order solution with the computational efficiency of a few group nodal diffusion solution. The application of this methodology to three-dimensional EBR-2 flux predictions is demonstrated; this improved methodology allows accurate core depletion calculations at reasonable cost. 13 refs., 4 figs., 3 tabs

  15. Ray-Based Calculations with DEPLETE of Laser Backscatter in ICF Targets

    Energy Technology Data Exchange (ETDEWEB)

    Strozzi, D J; Williams, E; Hinkel, D; Froula, D; London, R; Callahan, D

    2008-05-19

    A steady-state model for Brillouin and Raman backscatter along a laser ray path is presented. The daughter plasma waves are treated in the strong damping limit, and have amplitudes given by the (linear) kinetic response to the ponderomotive drive. Pump depletion, inverse-bremsstrahlung damping, bremsstrahlung emission, Thomson scattering off density fluctuations, and whole-beam focusing are included. The numerical code Deplete, which implements this model, is described. The model is compared with traditional linear gain calculations, as well as 'plane-wave' simulations with the paraxial propagation code pF3D. Comparisons with Brillouin-scattering experiments at the Omega Laser Facility show that laser speckles greatly enhance the reflectivity over the Deplete results. An approximate upper bound on this enhancement is given by doubling the Deplete coupling coefficient. Analysis with Deplete of an ignition design for the National Ignition Facility (NIF), with a peak radiation temperature of 285 eV, shows encouragingly low reflectivity. Doubling the coupling to bracket speckle effects suggests a less optimistic picture. Re-absorption of Raman light is seen to be significant in this design.

  16. Extended Methodology of RS Design and Instances Based on GIP

    Institute of Scientific and Technical Information of China (English)

    Qian-Hong Wu; Bo Qin; Yu-Min Wang

    2005-01-01

    Abe et al. proposed the methodology of ring signature (RS) design in 2002 and showed how to construct RS with a mixture of public keys based on factorization and/or discrete logarithms. Their methodology cannot be applied to knowledge signatures (KS) using the Fiat-Shamir heuristic and cut-and-choose techniques, for instance, the Goldreich KS. This paper presents a more general construction of RS from various public keys if there exists a secure signature using such a public key and an efficient algorithm to forge the relation to be checked if the challenges in such a signature are known in advance. The paper shows how to construct RS based on the graph isomorphism problem (GIP). Although it is unknown whether or not GIP is NP-complete, there are no known arguments that it can be solved even in the quantum computation model. Hence, the scheme has a better security basis and it is plausibly secure against quantum adversaries.

  17. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of KBS in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge and to recommend further research into this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked both at how people think about Knowledge Based Systems (KBS), what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  18. Methodology of Maqamat Hamadani and Hariri Based on Buseman’s statistical methodology

    Directory of Open Access Journals (Sweden)

    Hamed Sedghi

    2016-02-01

    Full Text Available Abstract Stylistics can be defined as the analysis and interpretation of expression and the different forms of speech, based on linguistic elements. The German theorist Buseman devised his thesis in statistical stylistics based on the ratio of verbs to adjectives, generalizing over a variety of literary and non-literary genres in German language and literature. According to Buseman, increasing the number of verbs in verses makes the text closer to the literary style; conversely, increasing the number of adjectives makes the text closer to the scientific or subjective style. The most important achievements of Buseman's statistical methodology can be considered to be: a) the comparison of authors' styles, literary periods and genres; b) the study of the language system and variety of words; and c) the classification and grading of differences or similarities between literary works, periods and genres. The purpose of this study: a stylistic analysis of the Maqamat of Hamadani and al-Hariri based on the statistical model of Buseman. The questions proposed in this study: a) How effective are statistical methods in identifying and analyzing a variety of literature, including Maqamat? b) How effective is the quantitative scope of verbs and adjectives, as two key parameters, in determining the style of literary works? And c) Which element of fiction is most effective in enriching the literary approach? The specific research method is statistical-analytical; we initially identified and classified the number of verbs and adjectives in the fifty-one Maqamehs of Hamadani and the fifty Maqamehs of Hariri; then the scope of verb and adjective usage is shown in the form of tables and graphs. After that, we assess the style of the literary works based on the use of verbs and adjectives. The research findings show that all of the Hamadani and Hariri Maqamat quantitatively benefit from a highly verb-active approach. In 46 Maqamehs of Hamadani and 48 Maqamehs of Hariri the

  19. Comparison among MCNP-based depletion codes applied to burnup calculations of pebble-bed HTR lattices

    International Nuclear Information System (INIS)

    The double-heterogeneity characterising pebble-bed high temperature reactors (HTRs) makes Monte Carlo based calculation tools the most suitable for detailed core analyses. These codes can be successfully used to predict the isotopic evolution during irradiation of the fuel of this kind of core. At the moment, there are many computational systems based on MCNP that are available for performing depletion calculations. All these systems use MCNP to supply problem-dependent fluxes and/or microscopic cross sections to the depletion module. The latter then calculates the isotopic evolution of the fuel by solving the Bateman equations. In this paper, a comparative analysis of three different MCNP-based depletion codes is performed: Monteburns2.0, MCNPX2.6.0 and BGCore. The Monteburns code can be considered as the reference code for HTR calculations, since it has already been verified during the HTR-N and HTR-N1 EU projects. All calculations have been performed on a reference model representing an infinite lattice of thorium-plutonium fuelled pebbles. The evolution of k-inf as a function of burnup has been compared, as well as the inventory of the important actinides. The k-inf comparison among the codes shows good agreement during the entire burnup history, with the maximum difference lower than 1%. The actinide inventory predictions also agree well. However, a significant discrepancy in the Am and Cm concentrations calculated by MCNPX as compared to those of Monteburns and BGCore has been observed. This is mainly due to the different Am-241 (n,γ) branching ratio utilized by the codes. An important advantage of BGCore is its significantly lower execution time for the considered depletion calculations. While providing reasonably accurate results, BGCore runs the depletion problem about two times faster than Monteburns and two to five times faster than MCNPX.
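
    Since all three code systems ultimately solve the Bateman equations for the nuclide vector, the following toy Python sketch shows the underlying operation: propagating atom densities over one depletion step with a matrix exponential. The three-nuclide chain and its effective removal rates are invented for illustration and are not taken from the benchmark.

      # Toy Bateman solver: dN/dt = A N integrated over one step with a matrix exponential
      import numpy as np
      from scipy.linalg import expm

      lam = np.array([1e-9, 5e-9, 2e-10])       # effective removal rates (1/s), illustrative
      A = np.array([
          [-lam[0],     0.0,     0.0],
          [ lam[0], -lam[1],     0.0],
          [    0.0,  lam[1], -lam[2]],
      ])

      N0 = np.array([1.0e24, 0.0, 0.0])         # initial atom densities of the chain
      dt = 30 * 24 * 3600.0                     # one 30-day burnup step
      print(expm(A * dt) @ N0)                  # densities at the end of the step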

  20. Formal model based methodology for developing software for nuclear applications

    International Nuclear Information System (INIS)

    The approach used in model based design is to build a model of the system in a graphical/textual language. In the older model based design approach, the correctness of the model is usually established by simulation. Simulation, which is analogous to testing, cannot guarantee that the design meets the system requirements under all possible scenarios. This is, however, possible if the modeling language is based on formal semantics, so that the developed model can be subjected to formal verification of properties based on the specification. The verified model can then be translated into an implementation through a reliable/verified code generator, thereby reducing the necessity of low level testing. Such a methodology is admissible as per the guidelines of the IEC60880 standard applicable to software used in computer based systems performing category A functions in nuclear power plants, and would also be acceptable for category B functions. In this article, we present the experience gained in the implementation and formal verification of important controllers used in the process control system of a nuclear reactor. We have used the SCADE (Safety Critical System Analysis and Design Environment) environment to model the controllers. The modeling language used in SCADE is based on the synchronous dataflow model of computation. A set of safety properties has been verified using formal verification techniques

  1. A response surface methodology based damage identification technique

    International Nuclear Information System (INIS)

    Response surface methodology (RSM) is a combination of statistical and mathematical techniques to represent the relationship between the inputs and outputs of a physical system by explicit functions. This methodology has been widely employed in many applications such as design optimization, response prediction and model validation. But so far the literature related to its application in structural damage identification (SDI) is scarce. Therefore this study attempts to present a systematic SDI procedure comprising four sequential steps of feature selection, parameter screening, primary response surface (RS) modeling and updating, and reference-state RS modeling with SDI realization using the factorial design (FD) and the central composite design (CCD). The last two steps imply the implementation of inverse problems by model updating in which the RS models substitute the FE models. The proposed method was verified against a numerical beam, a tested reinforced concrete (RC) frame and an experimental full-scale bridge with the modal frequency being the output responses. It was found that the proposed RSM-based method performs well in predicting the damage of both numerical and experimental structures having single and multiple damage scenarios. The screening capacity of the FD can provide quantitative estimation of the significance levels of updating parameters. Meanwhile, the second-order polynomial model established by the CCD provides adequate accuracy in expressing the dynamic behavior of a physical system
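
    As a sketch of the RS modeling step, the Python fragment below fits a full second-order polynomial in two updating parameters by least squares, using hypothetical samples of a modal frequency laid out like a central composite design. The sample values are invented; the fitted polynomial is the kind of surrogate that replaces the FE model in the updating loop.

      # Second-order polynomial response surface fitted by ordinary least squares
      import numpy as np

      def design_matrix(x1, x2):
          # Full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2
          return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

      # Hypothetical CCD-like samples of a modal frequency vs. two stiffness parameters
      x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41])
      x2 = np.array([-1, 1, -1, 1, 0, -1.41, 1.41, 0, 0])
      f  = np.array([9.8, 10.1, 10.6, 11.0, 10.4, 10.0, 10.8, 9.9, 10.9])

      beta, *_ = np.linalg.lstsq(design_matrix(x1, x2), f, rcond=None)
      print("fitted RS coefficients:", beta)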

  2. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    OpenAIRE

    Boris Alengordovich Korobitsyn

    2015-01-01

    A key indicator of the System of National Accounts of Russia at a regional scale is Gross Regional Product, characterizing the value of goods and services produced in all sectors of the economy in a country and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the main weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potent...

  3. Burnup credit methodology validation against WWER experimental data

    International Nuclear Information System (INIS)

    A methodology for criticality safety analyses with burnup credit application has been developed for WWER spent fuel management facilities. This methodology is based on two worldwide used code systems: SCALE 4.4 for depletion and criticality calculations and NESSEL-NUKO for depletion calculations. The methodology is in the process of extensive validation for WWER applications. The depletion code systems NESSEL-NUKO and SCALE4.4 (control module SAS2H) have been validated on the basis of comparison with calculated results obtained by other depletion codes for the CB2 Calculational Burnup Credit Benchmark. The validation of these code systems for WWER-440 and WWER-1000 spent fuel assembly depletion analysis, based on comparisons with appropriate experimental data, commenced last year. In this paper some results from burnup methodology validation against measured nuclide concentrations given in the ISTC project 2670 for WWER-440 and from an ORNL publication for WWER-1000 are presented. (authors)

  4. Engineering Design Optimization Based on Intelligent Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    SONG Guo-hui; WU Yu; LI Cong-xin

    2008-01-01

    An intelligent response surface methodology (IRSM) was proposed to achieve the most competitive metal forming products, in which artificial intelligence technologies are introduced into the optimization process. It is used as a simple and inexpensive replacement for a computationally expensive simulation model. In IRSM, the optimal design space can be reduced greatly without any prior information about the function distribution. Also, by identifying the approximation error region, new design points can be supplemented correspondingly to improve the response surface model effectively. The procedure is iterated until the accuracy reaches the desired threshold value. Thus, the global optimization can be performed based on this substitute model. Finally, we present an optimization design example about roll forming of a "U" channel product.

  5. Nuclear insurance risk assessment using risk-based methodology

    International Nuclear Information System (INIS)

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance

  6. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or integrate the models into the test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers. PMID:27483924
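
    The spreadsheet-to-register-description step can be pictured with the small Python sketch below, which parses CSV-like register rows and emits a generic XML skeleton. The column names and XML tags are illustrative placeholders only; they are not the IP-XACT schema or the format of any particular commercial tool.

      # Sketch: spreadsheet-style register rows -> generic register-description XML
      import csv, io
      import xml.etree.ElementTree as ET

      sheet = io.StringIO(
          "name,offset,width,access\n"
          "CTRL,0x00,32,RW\n"
          "STATUS,0x04,32,RO\n"
      )

      root = ET.Element("registerMap")
      for row in csv.DictReader(sheet):
          reg = ET.SubElement(root, "register")
          for key in ("name", "offset", "width", "access"):
              ET.SubElement(reg, key).text = row[key]

      print(ET.tostring(root, encoding="unicode"))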

  7. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    International Nuclear Information System (INIS)

    Highlights: • A modular mapping methodology for neutronic–thermal hydraulic nuclear reactor multiphysics, SMITHERS, has been developed. • Written in Python, SMITHERS takes a novel object-oriented approach for facilitating data transitions between solvers. This approach enables near-instant compatibility with existing MCNP/MONTEBURNS input decks. • It also allows for coupling with thermal-hydraulic solvers of various levels of fidelity. • Two BWR and PWR test problems are presented for verifying correct functionality of the SMITHERS code routines. - Abstract: A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. Additionally, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers. The mapping methodology was specifically developed to be flexible enough that it could successfully integrate preexisting depletion solver case files with different thermal-hydraulic solvers. This approach allows the user to tailor the selection of a

  8. A methodology for physically based rockfall hazard assessment

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Full Text Available Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at a regional and local scale, both along linear features and within exposed areas. An objective approach based on three-dimensional matrixes providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrixes has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.
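
    The combination of per-cell model outputs into a hazard class can be sketched in Python as below: the frequency-of-passages, velocity and block-height rasters are classified and then looked up in a 3-D matrix to give a cell-wise Rockfall Hazard Index. The class breakpoints and the lookup rule are invented for illustration and are not the calibrated matrices of the paper.

      # Sketch: combine frequency, velocity and height classes through a 3-D lookup matrix
      import numpy as np

      freq = np.array([[0, 3, 12], [1, 7, 25]])             # passages per cell
      vel  = np.array([[0, 4, 15], [2, 9, 22]])             # m/s
      hgt  = np.array([[0.0, 0.5, 2.5], [0.2, 1.2, 4.0]])   # m

      f_cls = np.digitize(freq, [2, 10])                    # 0 = low, 1 = medium, 2 = high
      v_cls = np.digitize(vel,  [5, 12])
      h_cls = np.digitize(hgt,  [0.3, 1.5])

      # Illustrative 3-D matrix: hazard class driven by the worst of the three classes
      i, j, k = np.meshgrid(range(3), range(3), range(3), indexing="ij")
      rhi_matrix = np.maximum(np.maximum(i, j), k) + 1      # values 1 (low) .. 3 (high)

      print(rhi_matrix[f_cls, v_cls, h_cls])                # per-cell Rockfall Hazard Index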

  9. Stratospheric ozone depletion over Antarctica - Role of aerosols based on SAGE II satellite observations

    Science.gov (United States)

    Lin, N.-H.; Saxena, V. K.

    1992-01-01

    The physical characteristics of the Antarctic stratospheric aerosol are investigated via a comprehensive analysis of the SAGE II data during the most severe ozone depletion episode of October 1987. The aerosol size distribution is found to be bimodal in several instances using the randomized minimization search technique, which suggests that the distribution of a single mode may be used to fit the data in the retrieved size range only at the expense of resolution for the larger particles. On average, in the region below 18 km, a wavelike perturbation with the upstream tilting for the parameters of mass loading, total number, and surface area concentration is found to be located just above the region of the most severe ozone depletion.

  10. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    Science.gov (United States)

    Stripling, Hayes Franklin

    Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable, they are the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.
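
    The mechanics of the adjoint approach can be illustrated on a steady, source-driven linear system A(p) x = b with response R = c^T x: one adjoint solve A^T λ = c gives the sensitivity dR/dp = -λ^T (dA/dp) x for any number of parameters. The toy matrices in the Python sketch below stand in for the transport/depletion operators and are not from the PDT code.

      # Adjoint sensitivity for a toy steady-state linear system
      import numpy as np

      A  = np.array([[4.0, -1.0], [-1.0, 3.0]])
      dA = np.array([[1.0,  0.0], [ 0.0, 0.0]])   # dA/dp for one uncertain parameter
      b  = np.array([1.0, 2.0])
      c  = np.array([0.0, 1.0])                   # response weights, R = c^T x

      x   = np.linalg.solve(A, b)                 # forward solve
      lam = np.linalg.solve(A.T, c)               # adjoint solve (one per response)

      dR_dp = -lam @ dA @ x                       # sensitivity, no extra forward solves
      print(dR_dp)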

  11. Experimental method for scanning the surface depletion region in nitride based heterostructures

    International Nuclear Information System (INIS)

    The group-III-nitride semiconductors feature strong spontaneous polarization in the [0001] direction and charges on the respective polar surfaces. Within the resulting surface depletion region the surface field causes band bending and affects the optical transitions in quantum wells. We studied the changes of the emission characteristics of a single GaInN quantum well when its distance to the surface, and hence the influence of the surface field, varies. We observe a strong increase of the quantum well emission energy and a decrease of the line width when the surface field partially compensates the piezoelectric field of the quantum well. A scan of the total surface depletion region with a single quantum well as probe was performed. The obtained emission data allow for the direct determination of the width of the depletion region. The experimental method is promising for studies of the surface field and the surface potential of III-nitride surfaces and interfaces. (copyright 2009 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  12. A strategy for selective detection based on interferent depleting and redox cycling using the plane-recessed microdisk array electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Feng [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Yan Jiawei, E-mail: jwyan@xmu.edu.cn [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Lu Miao [Pen-Tung Sah Micro-Nano Technology Research Center, Xiamen University, Xiamen, Fujian 361005 (China); Zhou Yongliang; Yang Yang; Mao Bingwei [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China)

    2011-10-01

    Highlights: > A novel strategy based on a combination of interferent depleting and redox cycling is proposed for the plane-recessed microdisk array electrodes. > The strategy removes the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction. > The electrodes enhance the current signal by redox cycling. > The electrodes can work regardless of the reversibility of the interfering species. - Abstract: The fabrication, characterization and application of the plane-recessed microdisk array electrodes for selective detection are demonstrated. The electrodes, fabricated by lithographic microfabrication technology, are composed of a planar film electrode and a 32 x 32 recessed microdisk array electrode. Different from the commonly used redox cycling operating mode for array configurations such as interdigitated array electrodes, a novel strategy based on a combination of interferent depleting and redox cycling is proposed for electrodes with an appropriate configuration. The planar film electrode (the plane electrode) is used to deplete the interferent in the diffusion layer. The recessed microdisk array electrode (the microdisk array), located within the diffusion layer of the plane electrode, works for detecting the target analyte in the interferent-depleted diffusion layer. In addition, the microdisk array overcomes the disadvantage of the low current signal of a single microelectrode. Moreover, the current signal of a target analyte that undergoes reversible electron transfer can be enhanced due to the redox cycling between the plane electrode and the microdisk array. Based on the above working principle, the plane-recessed microdisk array electrodes remove the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction, which is a limitation of the single redox cycling operating mode. The advantages of the

  13. EPRI depletion benchmark calculations using PARAGON

    International Nuclear Information System (INIS)

    Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII based data reduces the excess conservatism and brings the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for 10, 20, 30, 40, 50, and 60 GWd/MTU and 3 cooling times 100 h, 5 years, and 15 years. These benchmark cases are analyzed with PARAGON and the SCALE package and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty
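
    The benchmarked quantity, a depletion reactivity decrement, is simply the difference in reactivity between the fresh and depleted lattice states; the short Python snippet below computes it in pcm from two k-inf values. The k-inf values are illustrative, not the EPRI benchmark data.

      # Depletion reactivity decrement in pcm from fresh and depleted k-inf
      def rho(k):
          return (k - 1.0) / k

      k_fresh, k_depleted = 1.32, 1.05
      print(f"decrement: {(rho(k_fresh) - rho(k_depleted)) * 1e5:.0f} pcm")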

  14. Materialized View Selection Approach Using Tree Based Methodology

    Directory of Open Access Journals (Sweden)

    MR. P. P. KARDE

    2010-10-01

    Full Text Available In large databases, particularly in distributed databases, query response time plays an important role, as timely access to information is a basic requirement of successful business applications. A data warehouse uses multiple materialized views to efficiently process a given set of queries. Quick response time and accuracy are important factors in the success of any database. The materialization of all views is not possible because of the space constraint and the maintenance cost constraint. Selection of materialized views is one of the most important decisions in designing a data warehouse for optimal efficiency. Selecting a suitable set of views that minimizes the total cost associated with the materialized views is the key component in data warehousing. Materialized views are found to be very useful for fast query processing. This paper gives the results of the proposed tree based materialized view selection algorithm for query processing. In a distributed environment, where the database is distributed over the nodes on which queries are executed, node selection also plays an important role. This paper therefore also proposes a node selection algorithm for fast materialized view selection in a distributed environment. Finally, it is found that the proposed methodology performs better for query processing as compared to other materialized view selection strategies.
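
    A common baseline for this selection problem is a greedy heuristic that repeatedly picks the candidate view with the best query-cost saving per unit of storage until the space budget is exhausted. The Python sketch below shows that generic baseline, not the tree-based algorithm proposed in the paper, and the cost and size numbers are invented.

      # Greedy materialized-view selection under a space budget (generic baseline)
      def select_views(candidates, space_budget):
          """candidates: dict view -> (query_cost_saving, size)."""
          chosen, used = [], 0
          remaining = dict(candidates)
          while remaining:
              view, (saving, size) = max(remaining.items(),
                                         key=lambda kv: kv[1][0] / kv[1][1])
              del remaining[view]
              if used + size <= space_budget and saving > 0:
                  chosen.append(view)
                  used += size
          return chosen

      views = {"V1": (120.0, 40), "V2": (300.0, 200), "V3": (90.0, 20)}
      print(select_views(views, space_budget=100))    # -> ['V3', 'V1']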

  15. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure and degradation, and the timely correction of deteriorations. Because of the large number of such activities, maintaining emphasis on plant safety and allocating resources becomes difficult. The probabilistic model and methodology can be effectively used to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) has been performed and the study is now being revised taking into account as-built information, it is recommended to implement into the model the features necessary to support further PSA applications, especially those related to test and maintenance optimization. Methods need to be developed in order to apply the PSA model, including risk information together with other needed information, for test and maintenance optimization. Also, in parallel with the CPSE study update, the software interface for the PSA model is under development (Risk Monitor Software class), with methods and models needing to be developed for the purpose of using it for qualified monitoring of the efficiency of the test and maintenance strategy. Similarly, the data collection system needs to be appropriate for the purpose of an ongoing implementation of a risk-based test and maintenance strategy. (author). 4 refs, 1 fig

  16. Development of Proliferation Resistance Assessment Methodology based on International Standard

    International Nuclear Information System (INIS)

    Proliferation resistance (PR) is one of the requirements to be met by next generation nuclear energy systems in GEN IV and INPRO. Internationally, work on PR evaluation methodology was initiated as early as 1980, but systematic development started in the 2000s. In Korea, to support the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a nuclear long term R and D project, and development of the PR evaluation model is being performed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators, evaluation model development, and analysis of technology systems and international technology development trends were performed. In the second and third years, a feasibility study of the indicators, allowable limits for the indicators, a review of the technical requirements of the indicators, technical standards, and the design of the evaluation model were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission to be developed

  17. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    The UNEP Greenhouse Gas Abatement Costing Studies, carried out under the management of the UNEP Collaborating Centre on Energy and Environment at Risoe National Laboratories in Denmark, have placed effort into generating methodological approaches for assessing the cost of abatement activities to reduce CO2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most important aspect of the UNEP study, which involved teams of researchers from ten countries, is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries, particularly those from developing countries, namely Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from the interactions with methodological experiences from the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling tools used to analyze economic trends and the various factors studied in order to determine the unit cost of CO2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  18. Online monitoring of immunoaffinity-based depletion of high-abundance blood proteins by UV spectrophotometry using enhanced green fluorescence protein and FITC-labeled human serum albumin

    Directory of Open Access Journals (Sweden)

    Yu Hyeong

    2010-12-01

    Full Text Available Abstract Background The removal of high-abundance proteins from plasma is an efficient approach to investigating flow-through proteins for biomarker discovery studies. Most depletion methods are based on multiple immunoaffinity methods available commercially including LC columns and spin columns. Despite its usefulness, high-abundance depletion has an intrinsic problem, the sponge effect, which should be assessed during depletion experiments. Concurrently, the yield of depletion of high-abundance proteins must be monitored during the use of the depletion column. To date, there is no reasonable technique for measuring the recovery of flow-through proteins after depletion and assessing the capacity for capture of high-abundance proteins. Results In this study, we developed a method of measuring recovery yields of a multiple affinity removal system column easily and rapidly using enhanced green fluorescence protein as an indicator of flow-through proteins. Also, we monitored the capture efficiency through depletion of a high-abundance protein, albumin labeled with fluorescein isothiocyanate. Conclusion This simple method can be applied easily to common high-abundance protein depletion methods, effectively reducing experimental variations in biomarker discovery studies.

  19. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

    Full Text Available The information system (IS) change management and governance are, according to best practices, defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT, etc.). These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control obtained by integrating the following significant resource management areas – information technology (IT) governance, change management and enterprise architecture (EA) change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on the re-use and fusion of principles used by related methodologies, as well as on empirical observations about typical IS change management mistakes in enterprises.

  20. Base cation depletion, eutrophication and acidification of species-rich grasslands in response to long-term simulated nitrogen deposition

    International Nuclear Information System (INIS)

    Pollutant nitrogen deposition effects on soil and foliar element concentrations were investigated in acidic and limestone grasslands, located in one of the most nitrogen and acid rain polluted regions of the UK, using plots treated for 8-10 years with 35-140 kg N ha-1 y-1 as NH4NO3. Historic data suggest both grasslands have acidified over the past 50 years. Nitrogen deposition treatments caused the grassland soils to lose 23-35% of their total available bases (Ca, Mg, K, and Na) and they became acidified by 0.2-0.4 pH units. Aluminium, iron and manganese were mobilised and taken up by limestone grassland forbs and were translocated down the acid grassland soil. Mineral nitrogen availability increased in both grasslands and many species showed foliar N enrichment. This study provides the first definitive evidence that nitrogen deposition depletes base cations from grassland soils. The resulting acidification, metal mobilisation and eutrophication are implicated in driving floristic changes. - Nitrogen deposition causes base cation depletion, acidification and eutrophication of semi-natural grassland soils

  1. An infrared image based methodology for breast lesions screening

    Science.gov (United States)

    Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.

    2016-05-01

    The objective of this paper is to evaluate the potential of utilizing a structured methodology for breast lesions screening, based on infrared imaging temperature measurements of a healthy control group, to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the infrared camera direct temperature measurement imprecision, by calculating the temperature difference between two points to cancel out the error. The method takes into account the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values, Δθ̄, between two symmetric regions of the patient's breasts; the dimensionless form takes into account the breast region, the surrounding ambient and the individual core temperatures, so that the interpretation of the results for different individuals becomes simple and non-subjective. The range of normal whole-breast average dimensionless temperature differences for 101 healthy individuals was determined, and, admitting that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was considered to be the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements, Δθ̄ ± 2σ, in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective
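
    The screening rule itself reduces to comparing a patient's dimensionless left/right temperature difference against the control-group band (mean plus or minus two standard deviations). The Python sketch below shows that comparison with invented temperatures and control values, purely to illustrate the calculation, not the clinical data or thresholds of the study.

      # Sketch of the normality-band screening rule for one breast region
      import numpy as np

      def dimensionless_diff(t_left, t_right, t_ambient, t_core):
          # Dimensionless temperatures cancel the camera's systematic offset
          theta_l = (t_left - t_ambient) / (t_core - t_ambient)
          theta_r = (t_right - t_ambient) / (t_core - t_ambient)
          return abs(theta_l - theta_r)

      controls = np.array([0.020, 0.010, 0.030, 0.020, 0.015, 0.025, 0.010])  # healthy group
      upper = controls.mean() + 2 * controls.std(ddof=1)                      # normality limit

      patient = dimensionless_diff(t_left=34.8, t_right=33.6, t_ambient=23.0, t_core=36.8)
      print("abnormal" if patient > upper else "within normal range")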

  2. Black Box Chimera Check (B2C2): a Windows-Based Software for Batch Depletion of Chimeras from Bacterial 16S rRNA Gene Datasets.

    Science.gov (United States)

    Gontcharova, Viktoria; Youn, Eunseog; Wolcott, Randall D; Hollister, Emily B; Gentry, Terry J; Dowd, Scot E

    2010-01-01

    The existing chimera detection programs are not specifically designed for "next generation" sequence data. Technologies like Roche 454 FLX and Titanium have been adapted over the past years, especially with the introduction of bacterial tag-encoded FLX/Titanium amplicon pyrosequencing methodologies, to produce over one million 250-600 bp 16S rRNA gene reads that need to be depleted of chimeras prior to downstream analysis. Meeting the needs of basic scientists who are venturing into high-throughput microbial diversity studies such as those based upon pyrosequencing, and specifically providing a solution for Windows users, the B2C2 software is designed to accept files containing large multi-FASTA formatted sequences and screen for possible chimeras in a high-throughput fashion. The graphical user interface (GUI) is also able to batch process multiple files. When compared to popular chimera screening software, B2C2 performed as well or better while dramatically decreasing the amount of time required to generate and screen results. Even average computer users are able to interact with the Windows .Net GUI-based application and define the stringency to which the analysis should be done. B2C2 may be downloaded from http://www.researchandtesting.com/B2C2. PMID:21339894

  3. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has become, therefore, an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste, while production competitiveness is still maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. These become the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and
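
    One of the first-principles process models mentioned above can be sketched as a single well-mixed rinse tank, where drag-out carries plating chemicals in and fresh water dilutes them; the Python fragment below solves that mass balance analytically. The parameter values are invented for illustration and are not plant data.

      # Well-mixed rinse tank: V dc/dt = q_dragout*c_dragout - (q_dragout + q_water)*c
      import numpy as np

      def rinse_concentration(c0, c_dragout, q_dragout, q_water, volume, t):
          q_out = q_dragout + q_water
          c_ss = q_dragout * c_dragout / q_out                # steady-state concentration
          return c_ss + (c0 - c_ss) * np.exp(-q_out * t / volume)

      t = np.linspace(0, 3600, 5)                             # one hour of operation (s)
      print(rinse_concentration(c0=0.0, c_dragout=150.0,      # bath concentration, g/L
                                q_dragout=2e-6, q_water=5e-4, # flows, m^3/s
                                volume=0.5, t=t))             # tank volume, m^3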

  4. Nanoscale Field Effect Optical Modulators Based on Depletion of Epsilon-Near-Zero Films

    CERN Document Server

    Lu, Zhaolin; Shi, Kaifeng

    2015-01-01

    The field effect in metal-oxide-semiconductor (MOS) capacitors plays a key role in field-effect transistors (FETs), which are the fundamental building blocks of modern digital integrated circuits. Recent works show that the field effect can also be used to make optical/plasmonic modulators. In this paper, we report field effect electro-absorption modulators (FEOMs) each made of an ultrathin epsilon-near-zero (ENZ) film, as the active material, sandwiched in a silicon or plasmonic waveguide. Without a bias, the ENZ film maximizes the attenuation of the waveguides and the modulators work at the OFF state; contrariwise, depletion of the carriers in the ENZ film greatly reduces the attenuation and the modulators work at the ON state. The double capacitor gating scheme is used to enhance the modulation by the field effect. According to our simulation, extinction ratio up to 3.44 dB can be achieved in a 500-nm long Si waveguide with insertion loss only 0.71 dB (85.0%); extinction ratio up to 7.86 dB can be achieved...

  5. Depletion Calculations Based on Perturbations. Application to the Study of a Rep-Like Assembly at Beginning of Cycle with TRIPOLI-4®.

    Science.gov (United States)

    Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh

    2014-06-01

    For several years, Monte Carlo burnup/depletion codes have appeared which couple Monte Carlo codes, to simulate the neutron transport, with deterministic methods, which handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in such a way allows one to track fine 3-dimensional effects and to get rid of the multi-group hypotheses made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid the repetitive and time-expensive Monte Carlo simulations and to replace them by perturbation calculations: indeed the different burnup steps may be seen as perturbations of the isotopic concentrations of an initial Monte Carlo simulation. First, we present this method and provide details on the perturbative technique used, namely correlated sampling. Second, the implementation of this method in the TRIPOLI-4® code is discussed, as well as the precise calculation scheme able to bring an important speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a REP-like assembly, studied at the beginning of its cycle. After having validated the method against a reference calculation, we show that it can speed up standard Monte Carlo depletion codes by nearly an order of magnitude.
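
    The essence of correlated sampling is to re-use the histories drawn from a reference simulation and re-weight each score by the ratio of the perturbed to the reference probability density. The toy Python sketch below applies this to an exponential flight-path distribution whose "cross section" is perturbed; it only illustrates the weighting idea and is in no way the TRIPOLI-4 implementation.

      # Toy correlated sampling: one set of samples, reference and perturbed estimates
      import numpy as np

      rng = np.random.default_rng(0)
      sigma0, sigma1 = 1.0, 1.1                       # reference and perturbed "cross sections"
      x = rng.exponential(1.0 / sigma0, size=100_000) # flight paths sampled from the reference

      w = (sigma1 * np.exp(-sigma1 * x)) / (sigma0 * np.exp(-sigma0 * x))  # density ratio
      print(x.mean(), (w * x).mean(), 1.0 / sigma1)   # reference, correlated, analytic values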

  6. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    DR. NACHAMAI. M

    2011-11-01

    Full Text Available Software engineering gives the procedures and practices to be followed in software development and acts as a backbone for computer science engineering techniques. This paper deals with current trends in software engineering methodologies, the agile and the agent oriented software development processes. The agile methodology aims to meet the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. Agent oriented software is a rapidly developing area of research; software agents are an innovative technology designed to support the development of complex, distributed, and heterogeneous information systems. The paper weighs the agile and agent oriented software development processes against factors such as architectural design, applicability, project duration, customer interaction level, team collaboration, documentation, and software models.

  7. Theoretical and Methodological Bases of Accounting Product Formation

    OpenAIRE

    Mykhaylo Prodanchuk

    2015-01-01

    The article is devoted to deepening the theoretical and methodological provisions of accounting by determining the economic essence of the concept of an 'accounting product' and formulating its main characteristics and properties. The place of accounting in the information system of a company is defined. It is proved that the accounting system is a major source of accounting information that turns raw data recorded in documents into a qualitative informative product and adapts all the di...

  8. Cooperative Decision Making : a methodology based on collective preferences aggregation

    OpenAIRE

    Sibertin-Blanc, Christophe; ZARATÉ, Pascale

    2014-01-01

    The benefit of a collective decision process mainly rests upon the possibility for the participants to confront their respective points of view. To this end, they must have cognitive and technical tools that ease the sharing of the reasons that motivate their own preferences, while accounting for information and feelings they should keep to themselves. The paper presents the basis of such a cooperative decision making methodology that allows sharing information by accurately distinguishing...

  9. Some Methodological Considerations in Theory-Based Health Behavior Research

    OpenAIRE

    Collins, Linda M.; MacKinnon, David P.; Reeve, Bryce B.

    2013-01-01

    As this special issue shows, much research in social and personality psychology is directly relevant to health psychology. In this brief commentary, we discuss three topics in research methodology that may be of interest to investigators involved in health-related psychological research. The first topic is statistical analysis of mediated and moderated effects. The second is measurement of latent constructs. The third is the Multiphase Optimization Strategy, a framework for translation of inn...

  10. Auto-reactive T cells revised. Overestimation based on methodology?

    DEFF Research Database (Denmark)

    Thorlacius-Ussing, Gorm; Sørensen, Jesper F; Wandall, Hans H;

    2015-01-01

    loaded with E. coli produced recombinant protein or unmodified synthetic HLA binding peptides. Our concern is that this approach may ignore the presence of natural genetic variation and post-translational modifications such as e.g. the complex nature of N- and O-linked glycosylation of mammalian proteins...... methodology applied to document T cell reactivity against unmodified protein or peptide may lead to overinterpretation of the reported frequencies of autoreactive CD4+ and CD8+ T cells....

  11. Analytical base-collector depletion capacitance in vertical SiGe heterojunction bipolar transistors fabricated on CMOS-compatible silicon on insulator

    Institute of Scientific and Technical Information of China (English)

    Xu Xiao-Bo; Zhang He-Ming; Hu Hui-Yong; Ma Jian-Li; Xu Li-Jun

    2011-01-01

    The base-collector depletion capacitance for vertical SiGe npn heterojunction bipolar transistors (HBTs) on silicon on insulator (SOI) is split into vertical and lateral parts. This paper proposes a novel analytical depletion capacitance model of this structure for the first time. A large discrepancy is predicted when the present model is compared with the conventional depletion model: the capacitance decreases with increasing reverse collector-base bias and shows a kink as the reverse collector-base bias reaches the effective vertical punch-through voltage, a voltage which differs with the collector doping concentration; this is consistent with measurement results. The model can be employed for a fast evaluation of the depletion capacitance of an SOI SiGe HBT and has useful applications in the design and simulation of high performance SiGe circuits and devices.
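
    For orientation, the conventional behaviour referred to above is the textbook one-sided abrupt-junction result, C = sqrt(q*eps_s*N / (2*(Vbi + Vr))) per unit area, which decreases monotonically with reverse bias and shows no punch-through kink; the Python snippet below evaluates it with illustrative numbers. This is not the split vertical/lateral SOI model derived in the paper.

      # Conventional one-sided abrupt-junction depletion capacitance per unit area
      import numpy as np

      q, eps_s = 1.602e-19, 1.04e-10          # C; F/m (silicon permittivity)
      N_c, V_bi = 1e23, 0.7                   # collector doping (1/m^3); built-in potential (V)

      V_r = np.linspace(0.0, 3.0, 7)          # reverse collector-base bias (V)
      C = np.sqrt(q * eps_s * N_c / (2.0 * (V_bi + V_r)))   # F/m^2
      print(C)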

  12. Depleted uranium

    International Nuclear Information System (INIS)

    Full text: The Director General of the International Atomic Energy Agency (IAEA), Mohamed ElBaradei, issued today the following statement: The IAEA has been involved in United Nations efforts relating to the impact of the use of depleted uranium (DU) ammunition in Kosovo. It has supported the United Nations Environment Programme (UNEP) in the assessment which it is making, at the request of the Secretary-General, of that impact. In this connection, in November 2000, Agency experts participated in a UNEP-led fact-finding mission in Kosovo. DU is only slightly radioactive, being about 40% as radioactive as natural uranium. Chemically and physically, DU behaves in the same way as natural uranium. The chemical toxicity is normally the dominant factor for human health. However, it is necessary to carefully assess the impact of DU in the special circumstances in which it was used, e.g. to determine whether it was inhaled or ingested or whether fragments came into close contact with individuals. It is therefore essential, before an authoritative conclusion can be reached, that a detailed survey of the territory in which DU was used and of the people who came in contact with the depleted uranium in any form be carried out. In the meantime it would be prudent, as recommended by the leader of the November UNEP mission, to adopt precautionary measures. Depending on the results of the survey further measures may be necessary. The Agency, within its statutory responsibilities and on the basis of internationally accepted radiation safety standards, will continue to co-operate with other organizations, in particular WHO and UNEP, with a view to carrying out a comprehensive assessment. Co-operation by and additional information from NATO will be prerequisites. The experience gained from such an assessment could be useful for similar studies that may be carried out elsewhere in the Balkans or in the Gulf. (author)

  13. A Nuclear Reactor Transient Methodology Based on Discrete Ordinates Method

    Directory of Open Access Journals (Sweden)

    Shun Zhang

    2014-01-01

    Full Text Available With the rapid development of the nuclear power industry, simulating and analyzing reactor transients are of great significance for nuclear safety. The traditional diffusion theory is not suitable for small volume or strong absorption problems. In this paper, we have studied the application of the discrete ordinates method in the numerical solution of the space-time kinetics equations. Fully implicit time integration was applied and the precursor equations were solved by an analytical method. In order to improve the efficiency of the transport theory, we also adopted some advanced acceleration methods. Numerical results for the TWIGL benchmark problem demonstrate the accuracy and efficiency of this methodology.
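
    The analytical treatment of the precursor equations can be sketched as follows: with the neutron density held constant over a time step, each delayed-neutron precursor group has a closed-form update. The Python fragment below shows this update with illustrative point-kinetics numbers, not the TWIGL data or the actual multigroup implementation.

      # Analytic update of one precursor group: dC/dt = beta_i/Lambda * n - lam_i * C
      import math

      def precursor_step(C, n, beta_i, lam_i, Lambda, dt):
          decay = math.exp(-lam_i * dt)
          return C * decay + (beta_i * n / (Lambda * lam_i)) * (1.0 - decay)

      C = 0.0
      for _ in range(10):                                  # ten 1-ms steps at constant n
          C = precursor_step(C, n=1.0, beta_i=0.0021, lam_i=0.08, Lambda=2.0e-5, dt=1.0e-3)
      print(C)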

  14. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Science.gov (United States)

    Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai

    2011-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
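
    The Kalman filter framework can be pictured with the scalar Python sketch below: a linear degradation model tracks percentage capacitance loss, each aging-cycle measurement updates the state, and the model is extrapolated to a failure threshold to obtain a remaining-useful-life estimate. The model parameters, measurements and threshold are invented, not the accelerated-aging data of the study.

      # Scalar Kalman filter tracking capacitance loss, then RUL by extrapolation
      rate, q_var, r_var = 0.5, 1e-3, 4e-2            # % loss per cycle; process/measurement noise
      x, P = 0.0, 1.0                                 # state estimate (% loss) and its variance
      threshold = 20.0                                # failure criterion: 20 % capacitance loss

      measurements = [0.4, 1.1, 1.6, 2.0, 2.7, 3.1]   # observed loss after each aging cycle
      for z in measurements:
          x, P = x + rate, P + q_var                  # predict with the degradation model
          K = P / (P + r_var)                         # Kalman gain
          x, P = x + K * (z - x), (1 - K) * P         # update with the measurement

      rul_cycles = (threshold - x) / rate             # extrapolate the linear model
      print(f"estimated loss: {x:.2f} %, RUL ~ {rul_cycles:.0f} cycles")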

  15. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    OpenAIRE

    ADA ZHENG; YAN ZHOU

    2011-01-01

    We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: i) Each complex question was decomposed into a set of coherent finer subquestions by following the carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. ii) Facilitat...

  16. A new methodology for multidimensional poverty measurement based on the capability approach

    OpenAIRE

    Zeumo, Vivien Kana; Tsoukiàs, Alexis; Some, Blaise

    2012-01-01

    In this paper, we propose a methodology based on the use of clustering techniques derived from data analysis and multi-attribute decision analysis methods aiming at purposeful mul- tidimensional poverty measurement. The issue of meaningfulness is thus analysed both from a theoretical point of view (measurement theory) and from an operational one (policy effectiveness). Through this new methodology of multidimensional poverty measurement, we aim at providing a contribution to methodological kn...

  17. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios; Gani, Rafiqul

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... together to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  18. Legal and methodological bases of comprehensive forensic enquiry of pornography

    Directory of Open Access Journals (Sweden)

    Berdnikov D.V.

    2016-03-01

    Full Text Available The article gives an analysis of the legal definition of pornography. The author identifies the groups of descriptive and target criteria required for the analysis, examines the content of the descriptive criteria of pornography, and discusses how they should be documented. Fixing attention on the anatomical and physiological characteristics of sexual relations is identified as the necessary target criterion. It is noted that the term "pornography" is a legal term and cannot itself be the subject of expert examination. For this reason the author outlines the methodological basis of a comprehensive psycho-linguistic and psycho-art expert examination. The article presents the general issues on which an expert conclusion depends, examines cases where doctors must be involved in the research, and sets out criteria for the expert's opinion. In addition, the author defines the subject, object and main tasks of psychological studies of pornographic information.

  19. 77 FR 59625 - NIH Evidence-Based Methodology Workshop on Polycystic Ovary Syndrome

    Science.gov (United States)

    2012-09-28

    ... HUMAN SERVICES National Institutes of Health NIH Evidence-Based Methodology Workshop on Polycystic Ovary... Methodology Workshop on Polycystic Ovary Syndrome, to be held December 3-5, 2012. The workshop's opening... definition of the disorder and diagnostic criteria. The outcome of this conference, the NIH Criteria,...

  20. D2.1 - An EA Active, Problem Based Learning Methodology - EAtrain2

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne; Buus, Lillian;

    This deliverable reports on the work undertaken in work package 2 with the key objective to develop a learning methodology for web 2.0 mediated Enterprise Architecture (EA) learning building on a problem based learning (PBL) approach. The deliverable reports not only on the methodology but also o...

  1. Eutrophication of mangroves linked to depletion of foliar and soil base cations

    NARCIS (Netherlands)

    Fauzi, A.; Skidmore, A.K.; Heitkonig, I.M.A.; Gils, van H.; Schlerf, M.

    2014-01-01

    There is growing concern that increasing eutrophication causes degradation of coastal ecosystems. Studies in terrestrial ecosystems have shown that increasing the concentration of nitrogen in soils contributes to the acidification process, which leads to leaching of base cations. To test the effects

  2. Halogen activation and ozone depletion events as measured from space and ground-based DOAS measurements during Spring 2009

    Energy Technology Data Exchange (ETDEWEB)

    Sihler, Holger [Institute of Environmental Physics, University of Heidelberg (Germany); Max-Planck-Institute for Chemistry, Mainz (Germany); Friess, Udo; Platt, Ulrich [Institute of Environmental Physics, University of Heidelberg (Germany); Wagner, Thomas [Max-Planck-Institute for Chemistry, Mainz (Germany)

    2010-07-01

    Bromine monoxide (BrO) radicals are known to play an important role in the chemistry of the springtime polar troposphere. Their release by halogen activation processes leads to the almost complete destruction of near-surface ozone during ozone depletion events (ODEs). In order to improve our understanding of the halogen activation processes in three dimensions, we combine active and passive ground-based and satellite-borne measurements of BrO radicals. While satellites cannot resolve the vertical distribution and have rather coarse horizontal resolution, they may provide information on the large-scale horizontal distribution. Information on the spatial variability within a satellite pixel may be derived from our combined ground-based instrumentation. Simultaneous passive multi-axis differential optical absorption spectroscopy (MAX-DOAS) and active long-path DOAS (LP-DOAS) measurements were conducted during the jointly organised OASIS campaign in Barrow, Alaska during Spring 2009 within the scope of the International Polar Year (IPY). Ground-based measurements are compared to BrO column densities measured by GOME-2 in order to find a conclusive picture of the spatial pattern of bromine activation.

  3. Drag & Drop, Mixed-Methodology-based Lab-on-Chip Design Optimization Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective is to develop a "mixed-methodology", drag and drop, component library (fluidic-lego)-based, system design and optimization tool for complex...

  4. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  5. Rank-based statistical methodologies for quantitative trait locus mapping.

    OpenAIRE

    Zou, Fei; Yandell, Brian S.; Fine, Jason P.

    2003-01-01

    This article addresses the identification of genetic loci (QTL) that influence nonnormal quantitative traits, with a focus on experimental crosses. QTL mapping is typically based on the assumption that the traits follow normal distributions, which may not be true in practice. Model-free tests have been proposed. However, nonparametric estimation of genetic effects has not been studied. We propose an estimation procedure based on the linear rank test statistics. The properties of th...
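
    The abstract proposes linear rank statistics in place of normal-theory QTL mapping, but the truncated text does not show the estimator. As a minimal nonparametric analogue at a single marker, one can rank the trait values and compare them across marker genotype classes, for example with a Kruskal-Wallis test, as sketched below on simulated backcross data (all simulation settings are assumptions for illustration).

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)

# Simulated backcross: 200 individuals, marker genotype 0/1, skewed (non-normal) trait.
genotype = rng.integers(0, 2, size=200)
trait = rng.exponential(scale=1.0, size=200) + 0.6 * genotype   # QTL effect shifts the distribution

# Rank-based single-marker test: compare trait ranks between genotype classes.
stat, pval = kruskal(trait[genotype == 0], trait[genotype == 1])
print(f"Kruskal-Wallis H = {stat:.2f}, p = {pval:.2e}")
```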

  6. Scenario Based Methodologies in Identifying Ubicomp Application Sets

    OpenAIRE

    Mahon, F.; Pfeifer, Tom; Crotty, M.

    2005-01-01

    In ubiquitous systems of the future, a requirement for appropriate methods for the selection, analysis and evaluation of applications is evident. Traditional methods of analysis rely heavily on the inherent predictability of the system, something which is totally lacking in ubiquitous systems. This paper discusses one possible approach to flexible analysis and evaluation, the Scenario-Based Approach, describing its use in the IST FP6 project, Daidalos. The Scenario-Based approach addresses the ...

  7. A methodology for capturing and analyzing data from technology base seminar wargames.

    OpenAIRE

    Miles, Jeffrey T.

    1991-01-01

    Approved for public release; distribution is unlimited. This thesis provides a structured methodology for obtaining, evaluating, and portraying to a decision maker the opinions of players of Technology Base Seminar Wargames (TBSW). The thesis then demonstrates the methodology by applying it to the events of the Fire Support Technology Base Seminar Wargame held in May 1991. Specifically, the evaluation team developed six surveys, each survey capturing opinions using the categorical...

  8. ICT-Based, Cross-Cultural Communication: A Methodological Perspective

    Science.gov (United States)

    Larsen, Niels; Bruselius-Jensen, Maria; Danielsen, Dina; Nyamai, Rachael; Otiende, James; Aagaard-Hansen, Jens

    2014-01-01

    The article discusses how cross-cultural communication based on information and communication technologies (ICT) may be used in participatory health promotion as well as in education in general. The analysis draws on experiences from a health education research project with grade 6 (approx. 12 years) pupils in Nairobi (Kenya) and Copenhagen…

  9. Knowledge-based methodology in pattern recognition and understanding

    OpenAIRE

    Haton, Jean-Paul

    1987-01-01

    The interpretation and understanding of complex patterns (e.g. speech, images or other kinds of mono- or multi-dimensional signals) is related both to pattern recognition and to artificial intelligence, since it necessitates numerical processing as well as symbolic knowledge-based reasoning techniques. This paper presents the state of the art in the field, including basic concepts and practical applications.

  10. Design Based Research Methodology for Teaching with Technology in English

    Science.gov (United States)

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  11. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

    Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain with changing data and user views. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.

  12. Electrocardiogram based methodology for computing of Coronary Sinus Pressure

    Directory of Open Access Journals (Sweden)

    Loay Alzubaidi

    2011-05-01

    Full Text Available In this paper, a method based on pattern recognition and ECG technology is introduced as a means of calculating the optimum occlusion and release points within pressure-controlled intermittent coronary sinus occlusion (PICSO) cycles. There are favorable results that show PICSO can substantially salvage ischemic myocardium during medical surgery. These results are confirmed after studying groups of animals. The new method is a continuation of previous work on two other techniques using estimation and derivative calculations.

  13. Electrocardiogram based methodology for computing of Coronary Sinus Pressure

    OpenAIRE

    Loay Alzubaidi; Ammar El Hassan; Jaafar Al Ghazo

    2011-01-01

    In this paper, a method based on pattern recognition and ECG technology is introduced as a means of calculating the optimum occlusion and release points within Pressure controlled intermittent coronary sinus occlusion (PICSO) cycles. There are favorable results that show PICSO can substantially salvage ischemic myocardium during medical surgery. These results are confirmed after studying groups of animals. The new method is a continuation of previous work on two other techniques using estimat...

  14. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    Energy Technology Data Exchange (ETDEWEB)

    Maga, Daniel

    2015-07-01

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  15. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    International Nuclear Information System (INIS)

    Within this thesis for the first time an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed which is based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability as developed by the Institute for Technology Assessment and Systems Analysis (ITAS) overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods on midpoint level as well as on the area of protection and adopts state-of-the-art assessment procedures e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment for the first time extensive process data was collected of a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water allows to reduce the demand for fresh water and avoids additional fertilisation of microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of waste water as well as by anaerobic digestion of algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carrier. On the other hand approximately ten times more land is needed and twenty times

  16. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings. PMID:27433158

  17. Publishing FAIR Data: an exemplar methodology utilizing PHI-base

    Directory of Open Access Journals (Sweden)

    Alejandro eRodríguez Iglesias

    2016-05-01

    Full Text Available Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species versus the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be FAIR - Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences - the Pathogen-Host Interaction Database (PHI-base) - to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  18. JOB SHOP METHODOLOGY BASED ON AN ANT COLONY

    Directory of Open Access Journals (Sweden)

    OMAR CASTRILLON

    2009-01-01

    Full Text Available The purpose of this study is to reduce the total process time (makespan) and to increase the machines' working time, in a job shop environment, using a heuristic based on ant colony optimization. This work is developed in two phases: the first stage describes the identification and definition of heuristics for the sequential processes in the job shop. The second stage shows the effectiveness of the system in the traditional programming of production. A good solution, with 99% efficiency, is found using this technique.
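
    The abstract does not describe the ant-colony heuristic in detail. The sketch below shows one plausible reading of such an approach on a toy job-shop instance: ants build job-repetition sequences guided by a position-based pheromone matrix, the best schedule reinforces the trail, and the trail evaporates each iteration. The instance data, pheromone update rule and parameters are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny illustrative job-shop instance: jobs[i] = list of (machine, processing_time).
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(1, 2), (0, 1), (2, 4)],
        [(2, 4), (0, 3), (1, 1)]]
n_jobs = len(jobs)
n_ops = sum(len(j) for j in jobs)

def makespan(job_order):
    """Decode a job-repetition sequence into a schedule and return its makespan."""
    next_op = [0] * n_jobs
    job_ready = [0.0] * n_jobs
    mach_ready = {}
    for j in job_order:
        m, p = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(m, 0.0))
        job_ready[j] = mach_ready[m] = start + p
        next_op[j] += 1
    return max(job_ready)

# Position-based pheromone: desirability of choosing job j at construction step s.
tau = np.ones((n_ops, n_jobs))
best_order, best_make = None, float("inf")

for it in range(100):                        # iterations
    for ant in range(10):                    # ants per iteration
        remaining = [len(j) for j in jobs]
        order = []
        for s in range(n_ops):
            weights = np.array([tau[s, j] if remaining[j] > 0 else 0.0 for j in range(n_jobs)])
            j = rng.choice(n_jobs, p=weights / weights.sum())
            order.append(j)
            remaining[j] -= 1
        m = makespan(order)
        if m < best_make:
            best_order, best_make = order, m
    tau *= 0.9                               # evaporation
    for s, j in enumerate(best_order):       # reinforce the best-so-far schedule
        tau[s, j] += 1.0 / best_make

print("best makespan found:", best_make)
```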

  19. Application of risk-based methodologies to prioritize safety resources

    International Nuclear Information System (INIS)

    The Electric Power Research Institute (EPRI) started a program entitled risk-based prioritization in 1992. The purpose of this program is to provide generic technical support to the nuclear power industry relative to its recent initiatives in the area of operations and maintenance (O&M) cost control using state-of-the-art risk methods. The approach uses probabilistic risk assessment (PRA), or similar techniques, to allocate resources commensurate with the risk posed by nuclear plant operations. Specifically, those items or events that have high risk significance would receive the most attention, while those with little risk content would command fewer resources. As quantified in a companion paper, the potential O&M cost reduction inherent in this approach is very large. Furthermore, risk-based methods should also lead to safety improvements. This paper outlines the way that the EPRI technical work complements the technical, policy, and regulatory initiatives taken by others in the industry and provides an example of the approach as used to prioritize motor-operated valve (MOV) testing in response to US Nuclear Regulatory Commission (NRC) Generic Letter 89-10.

  20. A GIS-based methodology for selecting stormwater disconnection opportunities.

    Science.gov (United States)

    Moore, S L; Stovin, V R; Wall, M; Ashley, R M

    2012-01-01

    The purpose of this paper is to introduce a geographic information system (GIS)-based decision support tool that assists the user to select not only areas where (retrofit) sustainable drainage systems (SuDS) could be implemented within a large catchment (>100 ha), but also to allow discrimination between suitable SuDS techniques based on their likely feasibility and effectiveness. The tool is applied to a case study catchment within London, UK, with the aim of increasing receiving water quality by reducing combined sewer overflow (CSO) spill frequency and volume. The key benefit of the tool presented is to allow rapid assessment of the retrofit SuDS potential of large catchments. It is not intended to replace detailed site investigations, but may help to direct attention to sites that have the greatest potential for retrofit SuDS implementation. Preliminary InfoWorks CS modelling of 'global disconnections' within the case study catchment, e.g. the removal of 50% of the total impervious area, showed that CSO spill volume could be reduced by 55 to 78% during a typical year. Using the disconnection hierarchy developed by the authors, the feasibility of retrofit SuDS deployment within the case study catchment is assessed, and the implications discussed. PMID:22699330

  1. Synthesis of Schiff Bases via Environmentally Benign and Energy-Efficient Greener Methodologies

    OpenAIRE

    Arshi Naqvi; Mohd. Shahnawaaz; Arikatla V. Rao; Daya S. Seth; Sharma, Nawal K.

    2009-01-01

    Non-classical methods (water-based reaction, microwave and grindstone chemistry) were used for the preparation of Schiff bases from 3-chloro-4-fluoroaniline and several benzaldehydes. The key raw materials were allowed to react in water, under microwave irradiation and by grinding. These methodologies constitute an energy-efficient and environmentally benign, greener-chemistry version of the classical condensation reactions for Schiff base formation.

  2. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
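
    As a schematic of how a simulation of this kind chains sampled events into a damage probability, consider the toy Monte Carlo sketch below. The event chain, distributions and thresholds are purely illustrative and are not taken from the TORMIS code or its data base.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000                                     # Monte Carlo tornado histories

# Toy event chain, conditional on a tornado strike (all distributions are assumptions):
strike_freq = 1.0e-3                            # tornado strikes at the site per year
n_missiles = rng.poisson(5, N)                  # missiles injected and transported per strike
hits = rng.binomial(n_missiles, 2e-3)           # impacts on the target structure
speed = rng.weibull(2.0, N) * 40.0              # impact speed of the governing missile (m/s)
damage = (hits > 0) & (speed > 50.0)            # damage needs an impact above a speed threshold

p_damage = damage.mean()
print(f"P(damage | tornado strike) = {p_damage:.2e}")
print(f"annual damage frequency   ~ {strike_freq * p_damage:.2e} per year")
```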

  3. A copula-based downscaling methodology of RCM precipitation fields

    Science.gov (United States)

    Lorenz, Manuel

    2016-04-01

    Many hydrological studies require long-term precipitation time series at a fine spatial resolution. While regional climate models are nowadays capable of simulating reasonably high-resolution precipitation fields, the long computing time makes the generation of such long-term time series often infeasible for practical purposes. We introduce a comparatively fast stochastic approach to simulate precipitation fields which resemble the spatial dependencies and density distributions of the dynamic model. Nested RCM simulations at two different spatial resolutions serve as a training set to derive the statistics which will then be used in a random path simulation where fine-scale precipitation values are simulated from a multivariate Gaussian copula. The chosen RCM is the Weather Research and Forecasting Model (WRF). Simulated daily precipitation fields of the RCM are based on ERA-Interim reanalysis data from 1971 to 2000 and are available at a spatial resolution of 42 km (Europe) and 7 km (Germany). In order to evaluate the method, the stochastic algorithm is applied to the nested German domain and the resulting spatial dependencies and density distributions are compared to the original 30-year-long 7 km WRF simulations. Preliminary evaluations based on QQ-plots for one year indicate that the distributions of the downscaled values are very similar to the original values for most cells. In this presentation, a detailed overview of the stochastic downscaling algorithm and the evaluation of the long-term simulations are given. Additionally, an outlook for a 5 km and 1 km downscaling experiment for urban hydrology studies is presented.
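
    The core copula step of such an approach can be sketched compactly: transform the training precipitation at each cell to Gaussian scores through its empirical CDF, estimate the spatial correlation of those scores, draw new multivariate Gaussian fields, and map them back through the empirical marginals. The sketch below shows exactly that on synthetic data; it omits the nested-domain conditioning and the random-path simulation described in the abstract, and all training data are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Training set (assumed): daily precipitation at 9 fine-scale cells over 1000 days.
n_days, n_cells = 1000, 9
train = rng.gamma(shape=0.6, scale=4.0, size=(n_days, n_cells))
train[rng.random(train.shape) < 0.5] = 0.0          # dry days

# 1) Transform each cell's values to Gaussian scores via its empirical CDF (the copula step).
ranks = stats.rankdata(train, axis=0) / (n_days + 1.0)
z = stats.norm.ppf(ranks)

# 2) Estimate the spatial dependence in Gaussian space.
corr = np.corrcoef(z, rowvar=False)

# 3) Simulate new Gaussian fields with that dependence and map them back through the
#    empirical marginals (quantile lookup), which preserves the wet/dry intermittency.
sim_z = rng.multivariate_normal(np.zeros(n_cells), corr, size=365)
sim_u = stats.norm.cdf(sim_z)
sim_precip = np.column_stack(
    [np.quantile(train[:, c], sim_u[:, c]) for c in range(n_cells)]
)
print("simulated wet-day fraction per cell:", (sim_precip > 0).mean(axis=0).round(2))
```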

  4. Unbiased Selective Isolation of Protein N-Terminal Peptides from Complex Proteome Samples Using Phospho Tagging PTAG) and TiO2-based Depletion

    NARCIS (Netherlands)

    Mommen, G.P.M.; Waterbeemd, van de B.; Meiring, H.D.; Kersten, G.; Heck, A.J.R.; Jong, de A.P.J.M.

    2012-01-01

    A positional proteomics strategy for global N-proteome analysis is presented based on phospho tagging (PTAG) of internal peptides followed by depletion by titanium dioxide (TiO2) affinity chromatography. Therefore, N-terminal and lysine amino groups are initially completely dimethylated with formald

  5. Simulation-based reactor control design methodology for CANDU 9

    Energy Technology Data Exchange (ETDEWEB)

    Kattan, M.K.; MacBeth, M.J. [Atomic Energy of Canada Limited, Saskatoon, Saskatchewan (Canada); Chan, W.F.; Lam, K.Y. [Cassiopeia Technologies Inc., Toronto, Ontario (Canada)

    1996-07-01

    The next generation of CANDU nuclear power plant being designed by AECL is the 900 MWe CANDU 9 station. This design is based upon the Darlington CANDU nuclear power plant located in Ontario, which is among the world's leading nuclear power stations, with the highest capacity factor and the lowest operation, maintenance and administration costs in North America. Canadian-designed CANDU pressurized heavy water nuclear reactors have traditionally been world leaders in electrical power generation capacity performance. This paper introduces the CANDU 9 design initiative to use plant simulation during the design stage of the plant distributed control system (DCS), plant display system (PDS) and the control centre panels. This paper also introduces some details of the CANDU 9 DCS reactor regulating system (RRS) control application, a typical DCS partition configuration, and the interfacing of some of the software design processes that are being followed from conceptual design to final integrated design validation. A description is given of the reactor model developed specifically for use in the simulator. The CANDU 9 reactor model is a synthesis of 14 micro point-kinetic reactor models to facilitate 14 liquid zone controllers for bulk power error control, as well as zone flux tilt control. (author)

  6. Measure of Landscape Heterogeneity by Agent-Based Methodology

    Science.gov (United States)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors set out to measure the heterogeneity of a chosen landscape. The EU farming and subsidy policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from simple statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. This is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land-cover parameters and sum up the features of each new type of land encountered. During the simulation the agents accumulate a Monte Carlo integral as a diversity landscape potential, which can be considered the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
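
    A minimal sketch of the idea, assuming a synthetic land-cover raster: each scout agent performs a random walk, records the covers it visits, and returns a Shannon diversity value; averaging over many agents gives a Monte Carlo estimate of the landscape heterogeneity potential. The grid, walk length and agent count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed land-cover raster: integer class codes on a 100 x 100 grid.
grid = rng.integers(0, 5, size=(100, 100))

def scout_diversity(grid, steps=200):
    """One agent takes a random walk and returns the Shannon diversity of the covers it visits."""
    r, c = rng.integers(grid.shape[0]), rng.integers(grid.shape[1])
    visited = []
    for _ in range(steps):
        visited.append(grid[r, c])
        dr, dc = rng.choice([-1, 0, 1]), rng.choice([-1, 0, 1])
        r = np.clip(r + dr, 0, grid.shape[0] - 1)
        c = np.clip(c + dc, 0, grid.shape[1] - 1)
    _, counts = np.unique(visited, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

# Monte Carlo integral over many agents -> landscape heterogeneity potential.
samples = [scout_diversity(grid) for _ in range(500)]
print(f"heterogeneity potential: {np.mean(samples):.3f} "
      f"+/- {np.std(samples, ddof=1) / np.sqrt(len(samples)):.3f}")
```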

  7. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    Science.gov (United States)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³ Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
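
    The underlying system-life calculation can be illustrated with a small sketch: treat the engine as a series system whose survival probability is the product of two-parameter Weibull survival functions, and solve for the time at which system survival falls to the target reliability. The component slopes and characteristic lives below are placeholders, not the E³-engine values used in the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Assumed component Weibull parameters (slope beta, characteristic life eta in hours);
# these are illustrative values only, not the engine data from the report.
components = [(3.0, 60_000.0),    # e.g. HPT blade group
              (6.0, 80_000.0),    # e.g. HPT disk
              (9.0, 120_000.0)]   # e.g. shaft/bearing group

def system_survival(t):
    """Series-system survival: the engine survives only if every component survives to time t."""
    return np.prod([np.exp(-(t / eta) ** beta) for beta, eta in components])

def system_life(reliability):
    """Time at which system survival drops to the given reliability (0.999 -> L0.1 life)."""
    return brentq(lambda t: system_survival(t) - reliability, 1.0, 1.0e6)

for r, label in [(0.999, "L0.1"), (0.95, "L5")]:
    print(f"{label} system life: {system_life(r):,.0f} hr")
```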

  8. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanopa, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data........ Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  9. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. The ....... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  10. Conducting global team-based ethnography: Methodological challenges and practical methods

    OpenAIRE

    Jarzabkowski, P; Bednarek, R; Cabantous, L.

    2014-01-01

    Ethnography has often been seen as the province of the lone researcher; however, increasingly management scholars are examining global phenomena, necessitating a shift to global team-based ethnography. This shift presents some fundamental methodological challenges, as well as practical issues of method, that have not been examined in the literature on organizational research methods. That is the focus of this paper. We first outline the methodological implications of a shift from single resea...

  11. Estimating the revenues of a hydrogen-based high-capacity storage device: methodology and results

    OpenAIRE

    François-Lavet, Vincent; Fonteneau, Raphaël; Ernst, Damien

    2014-01-01

    This paper proposes a methodology to estimate the maximum revenue that can be generated by a company that operates a high-capacity storage device to buy or sell electricity on the day-ahead electricity market. The methodology exploits the Dynamic Programming (DP) principle and is specified for hydrogen-based storage devices that use electrolysis to produce hydrogen and fuel cells to generate electricity from hydrogen. Experimental results are generated using historical data of energy prices o...
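
    A minimal dynamic-programming sketch of the revenue estimation is given below: the storage level is discretized, each hour the device may buy electricity for electrolysis, idle, or generate and sell through the fuel cell, and the value function is computed backwards over a day-ahead price horizon. The prices, efficiencies and device sizes are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Illustrative day-ahead prices and device data (assumptions, not the paper's figures).
prices = [30, 25, 20, 22, 35, 60, 80, 70, 55, 40, 45, 50] * 2   # EUR/MWh over 24 hours
P_ELEC, P_FC = 2.0, 2.0        # electrolyser / fuel-cell power limits (MW)
EFF_ELEC, EFF_FC = 0.7, 0.5    # conversion efficiencies
CAP, LEVELS = 20.0, 41         # H2 storage capacity (MWh) and grid resolution
grid = np.linspace(0.0, CAP, LEVELS)

# value[t, s] = maximum revenue from hour t onward when the storage level is grid[s]
value = np.zeros((len(prices) + 1, LEVELS))

for t in reversed(range(len(prices))):
    for s, level in enumerate(grid):
        best = -np.inf
        for power in (-P_FC, 0.0, P_ELEC):        # sell, idle, or buy for one hour
            # change in stored energy: buying feeds the electrolyser, selling drains the fuel cell
            d_h2 = power * EFF_ELEC if power > 0 else power / EFF_FC
            new_level = level + d_h2
            if not 0.0 <= new_level <= CAP:
                continue
            cash = -power * prices[t]             # cost when buying, revenue when selling
            best = max(best, cash + np.interp(new_level, grid, value[t + 1]))
        value[t, s] = best

print(f"maximum 24-hour revenue starting empty: {value[0, 0]:.1f} EUR")
```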

  12. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. The ....... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  13. Petri net based engineering and software methodology for service-oriented industrial automation

    OpenAIRE

    Mendes, João M.; Restivo, Francisco; Leitão, Paulo; Colombo, Armando W.

    2010-01-01

    Collaborative industrial systems are becoming an emergent paradigm towards flexibility. One promising solution is service-oriented industrial automation systems, but integrated software methodologies and major frameworks for the engineering are still missing. This paper presents an overview of the current results on a unified and integrated methodology based on intrinsic and novel features of Petri nets. These nets are applied to the modeling, analysis, service management, em...

  14. Efficient Finite Element Methodology Based on Cartesian Grids: Application to Structural Shape Optimization

    OpenAIRE

    Nadal, E.; Ródenas, J. J.; Albelda, J.; Tur, M.; Tarancón, J. E.; Fuenmayor, F.J.

    2013-01-01

    This work presents an analysis methodology based on the use of the Finite Element Method (FEM) nowadays considered one of the main numerical tools for solving Boundary Value Problems (BVPs). The proposed methodology, so-called cg-FEM (Cartesian grid FEM), has been implemented for fast and accurate numerical analysis of 2D linear elasticity problems. The traditional FEM uses geometry-conforming meshes; however, in cg-FEM the analysis mesh is not conformal to the geometry. This allows for defin...

  15. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  16. Ozone depletion by hydrofluorocarbons

    Science.gov (United States)

    Hurwitz, Margaret M.; Fleming, Eric L.; Newman, Paul A.; Li, Feng; Mlawer, Eli; Cady-Pereira, Karen; Bailey, Roshelle

    2015-10-01

    Atmospheric concentrations of hydrofluorocarbons (HFCs) are projected to increase considerably in the coming decades. Chemistry climate model simulations forced by current projections show that HFCs will impact the global atmosphere increasingly through 2050. As strong radiative forcers, HFCs increase tropospheric and stratospheric temperatures, thereby enhancing ozone-destroying catalytic cycles and modifying the atmospheric circulation. These changes lead to a weak depletion of stratospheric ozone. Simulations with the NASA Goddard Space Flight Center 2-D model show that HFC-125 is the most important contributor to HFC-related atmospheric change in 2050; its effects are comparable to the combined impacts of HFC-23, HFC-32, HFC-134a, and HFC-143a. Incorporating the interactions between chemistry, radiation, and dynamics, ozone depletion potentials (ODPs) for HFCs range from 0.39 × 10⁻³ to 30.0 × 10⁻³, approximately 100 times larger than previous ODP estimates, which were based solely on chemical effects.

  17. Towards a self-adaptive service-oriented methodology based on extended SOMA

    Institute of Scientific and Technical Information of China (English)

    Alireza PARVIZI-MOSAED‡; Shahrouz MOAVEN; Jafar HABIBI; Ghazaleh BEIGI; Mahdieh NASER-SHARIAT

    2015-01-01

    We propose a self-adaptive process (SAP) that maintains the software architecture quality using the MAPE-K standard model. The proposed process can be plugged into various software development processes and service-oriented methodologies due to its explicitly defined inputs and outputs. To this aim, the proposed SAP is integrated with the service-oriented modeling and application (SOMA) methodology in a two-layered structure to create a novel methodology, named self-adaptive service-oriented architecture methodology (SASOAM), which provides a semi-automatic self-aware method by the composition of architectural tactics. Moreover, the maintenance activity of SOMA is improved using architectural and adaptive patterns, which results in controlling the software architecture quality. The improvement in the maintainability of SOMA is demonstrated by an analytic hierarchy process (AHP) based evaluation method. Furthermore, the proposed method is applied to a case study to represent the feasibility and practicality of SASOAM.

  18. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  19. Radiative characteristics of depleted uranium bombs and protection against them

    International Nuclear Information System (INIS)

    Based on the development of depleted uranium bombs described in the first part, the radiative characteristics and mechanisms of depleted uranium bombs are analyzed in detail. A more in-depth discussion of protection against depleted uranium bombs is then presented

  20. Depleted uranium management alternatives

    International Nuclear Information System (INIS)

    This report evaluates two management alternatives for Department of Energy depleted uranium: continued storage as uranium hexafluoride, and conversion to uranium metal and fabrication into shielding for spent nuclear fuel containers. The results will be used to compare the costs with other alternatives, such as disposal. Cost estimates for the continued storage alternative are based on a life cycle of 27 years through the year 2020. Cost estimates for the recycle alternative are based on existing conversion process costs and capital costs for fabricating the containers. Additionally, the recycle alternative accounts for costs associated with intermediate product resale and secondary waste disposal for materials generated during the conversion process

  1. Pragmatic principles--methodological pragmatism in the principle-based approach to bioethics.

    Science.gov (United States)

    Schmidt-Felzmann, Heike

    2003-01-01

    In this paper it will be argued that Beauchamp and Childress' principle-based approach to bioethics has strongly pragmatic features. Drawing on the writings of William James, I first develop an understanding of methodological pragmatism as a method of justification. On the basis of Beauchamp's and Childress' most recent proposals concerning moral justification in the fifth edition of their Principles of Biomedical Ethics (2001), I then discuss different aspects that the principle-based approach and methodological pragmatism have in common. PMID:14972762

  2. Methodology for Web Services Adoption Based on Technology Adoption Theory and Business Process Analyses

    Institute of Scientific and Technical Information of China (English)

    AN Liping; YAN Jianyuan; TONG Lingyun

    2008-01-01

    Web services use an emerging service-oriented architecture for distributed computing. Many organizations are either in the process of adopting web services technology or evaluating this option for incorporation into their enterprise information architectures. Implementation of this new technology requires careful assessment of the needs and capabilities of an organization to formulate adoption strategies. This paper presents a methodology for web services adoption based on technology adoption theory and business process analyses. The methodology suggests that strategies, business areas, and functions within an organization should be considered based on the existing organizational information technology status during the process of adopting web services to support the business needs and requirements.

  3. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhen; LIU JingFang; ZENG DaXing

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.

  4. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.

  5. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios;

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-by-one the different classes of chemicals, until a formulation is obtained, the stability of which as an emulsion is finally checked with appropriate models. Structured databases, appropriate pure component as well as mixture property models, rule-based selection criteria and CAMD techniques are employed...

  6. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  7. Project-based learning in organizations : towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; van der Krogt, F.J.

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article emphasi

  8. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  9. An integrated science-based methodology to assess potential risks and implications of engineered nanomaterials

    Science.gov (United States)

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology ...

  10. EA Training 2.0 Newsletter #3 - EA Active, Problem Based Learning Methodology

    DEFF Research Database (Denmark)

    Buus, Lillian; Ryberg, Thomas; Sroga, Magdalena

    2010-01-01

    The main products of the project are innovative, active problem-based learning methodology for EA education and training, EA courses for university students and private and public sector employees, and an Enterprise Architecture competence ontology including a complete specification of skills and...

  11. The Change towards a Teaching Methodology Based on Competences: A Case Study in a Spanish University

    Science.gov (United States)

    Gonzalez, Jose Maria G.; Arquero Montaño, Jose Luis; Hassall, Trevor

    2014-01-01

    The European Higher Education Area (EHEA) has promoted the implementation of a teaching methodology based on competences. Drawing on New Institutional Sociology, the present work aims to identify and improve knowledge concerning the factors which are hindering that change in the Spanish university system. This is investigated using a case study…

  12. REGULATORY FRAMEWORK AND EVALUATION OF HUMAN-MACHINE INTERFACES IMS NPP SAFETY CASE BASED METHODOLOGY

    OpenAIRE

    Харченко, В'ячеслав Сергійович; "Національний аерокосмічний університет ім.М.Є.Жуковського "ХАІ""; Орехова, Анастасія Олександрівна; "Національний аерокосмічний університет ім.М.Є.Жуковського "ХАІ""

    2012-01-01

    The problems associated with the safety of human-machine interfaces and information and control systems in NPPs are analyzed. An approach to assessing the safety of HMI in NPP I&C systems, based on the Safety Case methodology, is proposed. A profile of standards for HMI quality requirements is presented. An example of HMI quality assessment is described.

  13. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  14. Brain-Based MRI Lie Detection Experiment Methodology

    Institute of Scientific and Technical Information of China (English)

    李文石; 张好; 胡清泉; 苏香; 郭亮

    2006-01-01

    The brain-based MRI lie detection experiment methodology is reviewed for the first time, including the magnetic resonance imaging paradigm, the double-block design, the equidistant hit-ball and the test mechanics. This paper illustrates the research results of 3D MRI lie detection and the contrastive experiment of otopoint-mapping brain signature lie detection, and restates the Lie-Truth Law (PT/PL ≤ 0.618) obtained from statistics over published MRI reports worldwide. The conclusion points out the essence of this technology, its advantages and disadvantages, and the evolution of this methodology.

  15. Achieving process intensification from the application of a phenomena-based synthesis, design and intensification methodology

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Lutze, Philip; Woodley, John;

    for a more systematic, efficient and flexible PI methodology covering a wider range of applications which is able to find truly innovative and predictive solutions, not only using knowledge of the existing methods at the Unit-Ops level but also operating at a lower level of aggregation (that is, the...... phenomena level). This enables the use of apriori knowledge of the Unit-Ops as well as the possibility to design new Unit-Ops. A first version for a phenomena-based synthesis/design (PhenPI) methodology has been developed [5] in which a process flowsheet is generated through the use of involved phenomena...... such as mixing, phase transition and phase separation [5]. In principle, generating processes from phenomena leads to a large number of process options and therefore, an efficient solution procedure for the evaluation of these process options is needed. To manage this complexity, the PhenPI methodology...

  16. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
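
    The estimation algorithms themselves are not reproduced in the record, but the core step (fitting a discrete-time model to input/output records excited by a pseudo random binary signal and recovering physical parameters) can be sketched. The first-order exciter model, the PRBS generator and all numerical values below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

# Hypothetical first-order exciter model  G(s) = K / (1 + T*s),
# discretised with sampling period Ts:  y[k] = a*y[k-1] + b*u[k-1]
K_true, T_true, Ts = 10.0, 0.5, 0.01
a_true = np.exp(-Ts / T_true)
b_true = K_true * (1.0 - a_true)

# Pseudo random binary excitation (PRBS-like: random +/-1 levels)
rng = np.random.default_rng(0)
u = np.where(rng.random(2000) > 0.5, 1.0, -1.0)

# Simulate the "measured" response with a little noise
y = np.zeros_like(u)
for k in range(1, len(u)):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1]
y += 0.01 * rng.standard_normal(len(y))

# Least-squares ARX(1,1) estimation:  y[k] ~ a*y[k-1] + b*u[k-1]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta

# Recover the continuous-time parameters
T_hat = -Ts / np.log(a_hat)
K_hat = b_hat / (1.0 - a_hat)
print(f"estimated K = {K_hat:.2f}, T = {T_hat:.3f} s")

# Normalised sum of squared error, the fit-quality measure quoted in the abstract
y_hat = Phi @ theta
nsse = np.sum((y[1:] - y_hat) ** 2) / np.sum(y[1:] ** 2)
print(f"normalised SSE = {100 * nsse:.2f} %")
```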

  17. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS) based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of a proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance through all installation and analysis stages, faster installation of the ERPS, and control and cost reduction for the installation in terms of time, manpower, technological equipment and other resources.

  18. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  19. Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.

    Science.gov (United States)

    Garg, Harish

    2013-03-01

    The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters for improving the system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters, which affect the system performance, are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis on the system MTBF has also been addressed. The methodology has been illustrated through a case study of a washing unit, a main part of the paper industry. PMID:23098922
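
    As a rough illustration of the fuzzy Lambda-Tau idea, the sketch below propagates a triangular fuzzy failure rate and repair time through alpha-cut interval arithmetic to obtain a fuzzy steady-state availability for a single component. The numbers and the single-component availability formula are assumptions for illustration; the paper's CIBFLT refinement, its confidence-interval membership functions and the washing-unit data are not reproduced.

```python
import numpy as np

def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (l, m, u) at a given alpha."""
    l, m, u = tfn
    return l + alpha * (m - l), u - alpha * (u - m)

# Illustrative (made-up) triangular fuzzy failure rate lambda [1/h] and repair time tau [h]
lam_tfn = (0.8e-3, 1.0e-3, 1.2e-3)
tau_tfn = (4.0, 5.0, 6.0)

print(" alpha   availability interval")
for alpha in np.linspace(0.0, 1.0, 5):
    lam_lo, lam_hi = alpha_cut(lam_tfn, alpha)
    tau_lo, tau_hi = alpha_cut(tau_tfn, alpha)
    # Steady-state availability A = 1 / (1 + lambda*tau); A decreases in both
    # lambda and tau, so the (hi, hi) endpoints give the lower bound.
    a_lo = 1.0 / (1.0 + lam_hi * tau_hi)
    a_hi = 1.0 / (1.0 + lam_lo * tau_lo)
    print(f"  {alpha:.2f}   [{a_lo:.6f}, {a_hi:.6f}]")
```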

  20. DEPLETED URANIUM TECHNICAL WORK

    Science.gov (United States)

    The Depleted Uranium Technical Work is designed to convey available information and knowledge about depleted uranium to EPA Remedial Project Managers, On-Scene Coordinators, contractors, and other Agency managers involved with the remediation of sites contaminated with this mater...

  1. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    Science.gov (United States)

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  2. Hierarchy-based methodology for producing educational contents with maximal reutilization

    OpenAIRE

    Pedraza, Rafael; Valverde Albacete, Francisco; Cid Sueiro, Jesús; Molina Bulla, Harold; Navia Vázquez, Ángel

    2002-01-01

    Computer based training or distance education are facing dramatic changes with the advent of standardization efforts, some of them concentrating in maximal reuse. This is of paramount importance for a sustainable -cost affordable- production of educational materials. Reuse in itself should not be a goal, though, since many methodological aspects might be lost. In this paper we propose two content production approaches for the InterMediActor platform under a competence-based ...

  3. A fault diagnosis methodology for rolling element bearings based on advanced signal pretreatment and autoregressive modelling

    Science.gov (United States)

    Al-Bugharbee, Hussein; Trendafilova, Irina

    2016-05-01

    This study proposes a methodology for rolling element bearings fault diagnosis which gives a complete and highly accurate identification of the faults present. It has two main stages: signals pretreatment, which is based on several signal analysis procedures, and diagnosis, which uses a pattern-recognition process. The first stage is principally based on linear time invariant autoregressive modelling. One of the main contributions of this investigation is the development of a pretreatment signal analysis procedure which subjects the signal to noise cleaning by singular spectrum analysis and then stationarisation by differencing. So the signal is transformed to bring it close to a stationary one, rather than complicating the model to bring it closer to the signal. This type of pretreatment allows the use of a linear time invariant autoregressive model and improves its performance when the original signals are non-stationary. This contribution is at the heart of the proposed method, and the high accuracy of the diagnosis is a result of this procedure. The methodology emphasises the importance of preliminary noise cleaning and stationarisation. And it demonstrates that the information needed for fault identification is contained in the stationary part of the measured signal. The methodology is further validated using three different experimental setups, demonstrating very high accuracy for all of the applications. It is able to correctly classify nearly 100 percent of the faults with regard to their type and size. This high accuracy is the other important contribution of this methodology. Thus, this research suggests a highly accurate methodology for rolling element bearing fault diagnosis which is based on relatively simple procedures. This is also an advantage, as the simplicity of the individual processes ensures easy application and the possibility for automation of the entire process.
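
    The pretreatment chain described above (singular spectrum analysis for noise cleaning, differencing for stationarisation, then linear time-invariant AR modelling) can be sketched as follows. The synthetic signal, window length, SSA rank and AR order are all illustrative assumptions; the experimental data and the pattern-recognition stage of the paper are not reproduced.

```python
import numpy as np

def ssa_denoise(x, window=30, rank=5):
    """Singular spectrum analysis: keep the leading 'rank' components of the signal."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    # Diagonal averaging back to a 1-D series
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1.0
    return rec / cnt

def ar_features(x, order=8):
    """Least-squares AR(order) coefficients used as the feature vector."""
    Phi = np.column_stack([x[order - i - 1:len(x) - i - 1] for i in range(order)])
    a, *_ = np.linalg.lstsq(Phi, x[order:], rcond=None)
    return a

# Synthetic "bearing" signal: periodic impacts buried in noise (stand-in for a measurement)
rng = np.random.default_rng(1)
t = np.arange(4096) / 12e3
raw = np.sin(2 * np.pi * 157 * t) * (1 + 0.5 * np.sign(np.sin(2 * np.pi * 30 * t)))
raw += 0.8 * rng.standard_normal(t.size)

clean = ssa_denoise(raw)             # step 1: noise cleaning
stationary = np.diff(clean)          # step 2: stationarisation by differencing
features = ar_features(stationary)   # step 3: AR model -> feature vector
print(features)
```

    In the full methodology, AR coefficient vectors extracted from healthy and faulty records would then feed the pattern-recognition stage for classification.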

  4. Validating the history-based diffusion methodology for core tracking using in-core detectors

    International Nuclear Information System (INIS)

    A refinement of the three-dimensional diffusion reactor fuelling simulation program for CANDU reactors (RFSP) was developed. This refinement, called the history-based local-parameter methodology, allows core tracking with lattice properties which take into account the history of local conditions at each individual fuel bundle, including (in addition to fuel irradiation) local values of fuel temperature, coolant density, power level, and concentration of saturated fission products. This paper presents a validation of the history-based methodology, performed by comparing flux shapes calculated for the Point Lepreau CANDU 6 reactor over a tracking period of about 1.5 years (1991 September - 1993 April) to the core fluxes measured by means of in-core vanadium detectors. The results were compared with those obtained from a conventional, i.e., non-history-based, calculation. The standard deviation of differences between calculated and measured fluxes is significantly improved (reduced) when using the history-based methodology. 2 refs., 7 figs

  5. Environmental restoration risk-based prioritization work package planning and risk ranking methodology. Revision 2

    International Nuclear Information System (INIS)

    This document presents the risk-based prioritization methodology developed to evaluate and rank Environmental Restoration (ER) work packages at the five US Department of Energy, Oak Ridge Field Office (DOE-ORO) sites [i.e., Oak Ridge K-25 Site (K-25), Portsmouth Gaseous Diffusion Plant (PORTS), Paducah Gaseous Diffusion Plant (PGDP), Oak Ridge National Laboratory (ORNL), and the Oak Ridge Y-12 Plant (Y-12)], the ER Off-site Program, and Central ER. This prioritization methodology was developed to support the increased rigor and formality of work planning in the overall conduct of operations within the DOE-ORO ER Program. Prioritization is conducted as an integral component of the fiscal ER funding cycle to establish program budget priorities. The purpose of the ER risk-based prioritization methodology is to provide ER management with the tools and processes needed to evaluate, compare, prioritize, and justify fiscal budget decisions for a diverse set of remedial action, decontamination and decommissioning, and waste management activities. The methodology provides the ER Program with a framework for (1) organizing information about identified DOE-ORO environmental problems, (2) generating qualitative assessments of the long- and short-term risks posed by DOE-ORO environmental problems, and (3) evaluating the benefits associated with candidate work packages designed to reduce those risks. Prioritization is conducted to rank ER work packages on the basis of the overall value (e.g., risk reduction, stakeholder confidence) each package provides to the ER Program. Application of the methodology yields individual work package ''scores'' and rankings that are used to develop fiscal budget requests. This document presents the technical basis for the decision support tools and process

  6. Snow cover reconstruction methodology based on historic in situ observations and recent remote sensing data

    Directory of Open Access Journals (Sweden)

    A. Gafurov

    2014-09-01

    Spatially distributed snow cover extent can be derived from remote sensing data with good accuracy. However, such data are available for recent decades only, after satellite missions with proper snow detection capabilities were launched. Yet, longer time series of snow cover area (SCA) are usually required e.g. for hydrological model calibration or water availability assessment in the past. We present a methodology to reconstruct historical snow coverage using recently available remote sensing data and long-term point observations of snow depth from existing meteorological stations. The methodology is mainly based on correlations between station records and spatial snow cover patterns. Additionally, topography and temporal persistence of snow patterns are taken into account. The methodology was applied to the Zerafshan River basin in Central Asia – a very data-sparse region. Reconstructed snow cover was cross-validated against independent remote sensing data and shows an accuracy of about 85%. The methodology can be used to overcome the data gap for earlier decades when the availability of remote sensing snow cover data was strongly limited.
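
    A minimal sketch of the core idea, calibrating a per-pixel relation between station snow-depth records and satellite-derived snow presence during the overlap period and then applying it to the pre-satellite record, is given below with synthetic data. The threshold form of the relation is an assumption for illustration; the paper additionally uses topography and the temporal persistence of snow patterns, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data standing in for the real inputs: a station snow-depth record (cm) and
# binary satellite snow maps on a tiny grid, available only for the recent
# "overlap" period.
days_overlap, days_hist, n_pix = 800, 1500, 25
depth_overlap = np.clip(rng.normal(20, 15, days_overlap), 0, None)
depth_hist = np.clip(rng.normal(20, 15, days_hist), 0, None)
pix_thresholds_true = rng.uniform(5, 40, n_pix)        # unknown in reality
sca_overlap = (depth_overlap[:, None] > pix_thresholds_true[None, :]).astype(int)
flip = rng.random(sca_overlap.shape) < 0.05             # 5 % classification noise
sca_overlap = np.where(flip, 1 - sca_overlap, sca_overlap)

# Calibration: per pixel, pick the station-depth threshold that best reproduces
# the observed snow/no-snow sequence in the overlap period.
candidates = np.linspace(0, 60, 121)
best_thr = np.empty(n_pix)
for p in range(n_pix):
    acc = [np.mean((depth_overlap > thr).astype(int) == sca_overlap[:, p])
           for thr in candidates]
    best_thr[p] = candidates[int(np.argmax(acc))]

# Reconstruction: apply the calibrated thresholds to the historical station record.
sca_hist = (depth_hist[:, None] > best_thr[None, :]).astype(int)
print("reconstructed mean snow-covered fraction:", sca_hist.mean().round(3))
```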

  7. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays.

    Science.gov (United States)

    Naeni, Leila M; Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
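
    A compact sketch of the pipeline (Jensen-Shannon distances between word-frequency vectors, a k-nearest-neighbour proximity graph, then community detection by modularity maximisation) is shown below on synthetic data. NetworkX's greedy modularity routine is used as a stand-in for the paper's memetic algorithm (iMA-Net), and the toy corpus, the value of k and the edge weighting are assumptions.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def js_distance(p, q, eps=1e-12):
    """Jensen-Shannon distance between two word-frequency distributions."""
    p = p / p.sum(); q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2((a + eps) / (b + eps)))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Toy corpus: word-count vectors for a handful of "plays" (rows), two styles
rng = np.random.default_rng(3)
style_a = rng.dirichlet(np.ones(200) * 0.5)
style_b = rng.dirichlet(np.ones(200) * 0.5)
counts = np.vstack([rng.multinomial(5000, style_a) for _ in range(6)] +
                   [rng.multinomial(5000, style_b) for _ in range(6)]).astype(float)

# Proximity graph: connect each play to its k nearest neighbours in JS distance
n, k = counts.shape[0], 3
D = np.array([[js_distance(counts[i], counts[j]) for j in range(n)] for i in range(n)])
G = nx.Graph()
for i in range(n):
    for j in np.argsort(D[i])[1:k + 1]:
        G.add_edge(i, int(j), weight=1.0 - D[i, j])

# Community detection by modularity maximisation (greedy stand-in for iMA-Net)
communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```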

  9. Extension of direct displacement-based design methodology for bridges to account for higher mode effects

    OpenAIRE

    Kappos, A. J.; Gkatzogias, K.I.; Gidaris, I.G.

    2013-01-01

    An improvement is suggested to the direct displacement-based design (DDBD) procedure for bridges to account for higher mode effects, the key idea being not only the proper prediction of a target-displacement profile through the effective mode shape (EMS) method (wherein all significant modes are considered), but also the proper definition of the corresponding peak structural response. The proposed methodology is then applied to an actual concrete bridge wherein the different pier heights and ...

  10. Eco-efficiency methodology of Hybrid Electric Vehicle based on multidisciplinary multi-objective optimization

    OpenAIRE

    Nzisabira, Jonathan; Louvigny, Yannick; Christiaens, Sébastien; Duysinx, Pierre

    2009-01-01

    The eco-efficiency concept for clean propulsion vehicles aims at simultaneously reducing the fuel consumption and the environmental impact of pollutants (Eco-score) without decreasing the vehicle performances and other user satisfaction criteria. Based on a simulation model in ADVISOR, one can evaluate the performances, the emissions and then the Eco-score and the User Satisfaction for different driving scenarios. To establish a rational methodology for conducting the eco-efficiency design of electr...

  11. A design methodology for delta-sigma converters based on solid-state passive filters

    OpenAIRE

    Benabes, Philippe

    2013-01-01

    In the context of the ENIAC ARTEMOS project for the design of agile radio front ends, this paper shows a methodology for the design of agile bandpass continuous-time delta-sigma converters based on acoustic tunable resonators. These resonators use BST materials, which have the property of being tunable by an external voltage, allowing the resonance frequency of the filters to be changed by a few percent. Using such filters, the Oversampling ratio of delta s...

  12. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  13. Mergers in Greece: evaluation of the merger related performance of greek companies, accounting based methodology

    OpenAIRE

    Ζούτσου, Ρούλα

    2001-01-01

    This paper evaluates the financial results of 23 Greek merger transactions that were completed between 1993 and 1998 using the accounting based methodology. A set of 20 performance ratios is examined for a period of 5 years to get an indication of the mean weighted industry- adjusted performance difference between the pre- to post-merger period. Additionally, a cross-sectional analysis is performed to conclude on whether special characteristics of the merger participants are associated with i...

  14. Zombie Division : a methodological case study for the evaluation of game-based learning

    OpenAIRE

    Habgood, M. P. Jacob

    2015-01-01

    This paper discusses the methodological designs and technologies used to evaluate an educational videogame in order to support researchers in the design of their own evaluative research in the field of game-based learning. The Zombie Division videogame has been used to empirically evaluate the effectiveness of a more intrinsically integrated approach to creating educational games. It was specifically designed to deliver interventions as part of research studies examining differences in learni...

  15. WEBDATANET : a Network on Web-based Data Collection, Methodological Challenges, Solutions, and Implementation

    OpenAIRE

    Steinmetz, Stephanie; Lars, Kaczmirek; de Pedraza, Pablo; Reips, Ulf-Dietrich; Tijdens, Kea; Lozar Manfreda, Katja; Bernardo, Winer

    2012-01-01

    Do you collect data via the Internet in your research? If you do, this European network is important for you. The network collects and combines experiences and research on the methodology of online data collection. It provides access to expertise that may be important in your research.WEBDATANET is a unique multidisciplinary European network bringing together more than 76 leading web-based data collection experts, (web) survey methodologists, psychologists, sociologists, linguists, economists...

  16. THE METHODOLOGY OF STUDENTS’ SYNERGETIC WORLD OUTLOOK DEVELOPMENT BASED ON THE TRANS-DISCIPLINARY APPROACH

    OpenAIRE

    Y. A. Solodova

    2015-01-01

    The paper discusses the present stage of the world educational system development influenced by the fast increasing flow of information and knowledge. The situation requires the adequate pedagogical technologies for compressing the learning information; one of them is the transdisciplinary technology based on the synergetic methodology identifying the order parameters and general conformities of organizing the academic content. The trans-disciplinary technologies incorporate the general laws ...

  17. How Is the Evaluation Process in a Course Following the PBL (Problem-Based Learning) Methodology?

    Directory of Open Access Journals (Sweden)

    Patricia Morales Bueno

    2013-09-01

    This article focuses on the different ways in which the concepts of teaching and learning are conceived. It also defines the conception on which the educational view of PBL is based, and how it determines its learning goals. Likewise, it shows how evaluation strategies are linked to each of the stages in the PBL process, noting their features and their relation to the learning goals of the methodology.

  18. Comments on "A model-based design methodology for the development of mechatronic systems"

    OpenAIRE

    Thramboulidis, Kleanthis

    2014-01-01

    In the paper by G. Barbieri et al. (Mechatronics (2014), http://dx.doi.org/10.1016/j.mechatronics.2013.12.004), a design methodology, based on the W life cycle process model, is presented and SysML is proposed as a tool to support the whole development process. In this letter, we discuss the presented approach, we point out technical errors and raise additional issues that might help in making the proposed approach applicable.

  19. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

    We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions by following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation has been significantly enhanced.

  20. Depleted Uranium Management

    International Nuclear Information System (INIS)

    The paper considers the radiological and toxic impact of depleted uranium on human health. The radiological impact of depleted uranium is about 60% lower than that of natural uranium, owing to the reduced content of the short-lived isotopes uranium-234 and uranium-235 after enrichment. The formation of radioactive aerosols and their impact on humans are discussed. Use of depleted uranium weapons also has a chemical effect on intake, owing to a possible carcinogenic influence on the kidney. Uranium-236 is detected in depleted uranium material. Beta radiation arising from the decay of uranium-238 is also considered; this effect is practically the same for depleted and natural uranium. The toxicity of depleted uranium as a heavy metal makes a considerable contribution to the impact on population health. The paper analyzes risks regarding the use of depleted uranium weapons, against which there is international opposition. A resolution on the effects of the use of armaments and ammunition containing depleted uranium was supported five times by the United Nations (the USA, United Kingdom, France and Israel did not support it), and a decision to ban depleted uranium weapons was supported by the European Parliament.

  1. Lithium-ion batteries: Evaluation study of different charging methodologies based on aging process

    International Nuclear Information System (INIS)

    Highlights: • Different charging methodologies have been tested and analyzed. • Battery impedance representation using the Randle’s equivalent circuit. • Investigate the impact of the charging methodology on the battery’s lifetime. • An extended analysis to select the proper charging method that can be used to design an enhanced charging system. - Abstract: In this paper, high power 7 A h LiFePO4-based cells (LFP) have been used to investigate the impact of the charging methodology on the battery’s lifetime. Three charging techniques have been used: Constant Current (CC), Constant Current–Constant Voltage (CC–CV) and Constant Current–Constant Voltage with Negative Pulse (CC–CVNP). A comparative study between these techniques is presented in this research. For this purpose, a characterization of the batteries has been performed using capacity test and electrochemical impedance spectroscopy (EIS). As expected the obtained results showed that the battery’s aging rate depends on the charging methodology. Indeed, it has been shown that a combination of low amplitude and fewest number of negative pulses has a positive effect on battery’s capacity fading. From the impedance measurements, the results have demonstrated that the CC–CVNP technique with low amplitude and fewest number of negative pulses is more effective than the other techniques in reducing the concentration polarization resistance and the diffusion time constant. This research has provided an extended analysis to select the proper charging methodology that can be used to design an enhanced charging system
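
    To make the charging profiles concrete, the sketch below simulates a plain CC-CV charge on a crude internal-resistance cell model: constant current until the terminal voltage reaches the cut-off, then constant voltage while the current tapers. All model values are illustrative assumptions rather than the 7 A h LFP cells of the study, and the negative-pulse (CC-CVNP) variant is not modelled.

```python
import numpy as np

# Very simple cell model (illustrative values, not the cells of the study):
capacity_ah = 7.0
r_int = 0.01                                    # ohm, internal resistance
ocv = lambda soc: 3.2 + 0.3 * soc + 0.4 * soc**10  # crude open-circuit-voltage curve

def charge_cc_cv(i_cc=7.0, v_max=3.65, i_cut=capacity_ah / 20, dt=1.0):
    """Constant current then constant voltage; returns time series of (t, I, V, SOC)."""
    soc, t, log = 0.0, 0.0, []
    mode = "CC"
    while soc < 1.0:
        if mode == "CC":
            i = i_cc
            v = ocv(soc) + r_int * i
            if v >= v_max:
                mode = "CV"
                continue
        else:  # CV: hold v_max, current tapers as the open-circuit voltage rises
            i = (v_max - ocv(soc)) / r_int
            v = v_max
            if i <= i_cut:
                break
        soc = min(1.0, soc + i * dt / (capacity_ah * 3600.0))
        t += dt
        log.append((t, i, v, soc))
    return np.array(log)

profile = charge_cc_cv()
print(f"charge time: {profile[-1, 0] / 3600:.2f} h, final SOC: {profile[-1, 3]:.3f}")
```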

  2. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has demonstrated to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  3. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has demonstrated to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  4. Temporal Analysis of Snow Cover Depletion in the Eastern Part of Turkey Based on MODIS-Terra and Temperature Data

    Science.gov (United States)

    Akyurek, Z.; Sürer, S.; Bolat, K.

    2012-12-01

    Snow cover is an important feature of mountainous regions. Depending on latitude, the higher altitudes are completely covered by snow for several months in a year. Snow cover is also an important factor for optimum use of water in energy production, flood control, irrigation and reservoir operation optimization, as well as ski tourism. Snow cover depletion curve (SDC) is one of the important variables in snow hydrological applications, and these curves are very much required for snowmelt runoff modeling in a snow-fed catchment. In this study it is aimed to monitor the temporal changes in the snow cover depletion in Upper-Euphrates basin for the period of 2000-2011. Snow mapping was performed by reclassifying the fractional snow cover areas obtained from MODIS-Terra (MOD09GA) data by the algorithm derived for the region. An automatic approach was developed in deriving the snow cover depletion curves. Maximum snow cover occurs in winter months in Upper-Euphrates basin and the amount of maximum snow cover is between 80-90 % of the total area. Approximately 45% of the area is covered with snow in the autumn, the melting occurs in spring and 15% of the area is covered with snow during spring months. At the beginning of April there exists snow generally above 1900 m in the basin, at the lower elevations snow does not stay after the end of February. The previous studies indicate warming trends for the basin's temperatures. Statistically insignificant decreasing trends in precipitation in the basin except autumn season for the period of 1975-2008 were obtained. The major melting period in this basin starts in early April, but in the last three years a shift in snow melting time was detected. When sufficient satellite data are not available due to cloud cover or due to some other reasons, then SDC can be generated using temperature data. Mean cloud coverage for the melting period was obtained as 82% from MODIS-Terra images in the basin. Under changed climate conditions also

  5. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    International Nuclear Information System (INIS)

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In the face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the Automatic Dependent Surveillance-Broadcast (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties; the FSPN formalism provides important modeling capabilities, and discrete event simulation allows estimation of the desired safety metrics.

  6. Rule-based Expert Systems for Selecting Information Systems Development Methodologies

    Directory of Open Access Journals (Sweden)

    Abdel Nasser H. Zaied

    2013-08-01

    Information Systems (IS) are increasingly becoming regarded as crucial to an organization's success. Information Systems Development Methodologies (ISDMs) are used by organizations to structure the information system development process. ISDMs are essential for structuring project participants' thinking and actions; therefore ISDMs play an important role in achieving successful projects. There are different ISDMs, and no methodology can claim that it can be applied to any organization. The problem facing decision makers is how to select an appropriate development methodology that may increase the probability of system success. This paper takes this issue into account when studying ISDMs and provides a rule-based expert system as a tool for selecting appropriate ISDMs. The proposed expert system consists of three main phases to automate the process of selecting ISDMs. Three approaches were used to test the proposed expert system: face validation through six professors and six IS professionals, predictive validation through twenty-four experts, and blind validation through nine employees working in the IT field. The results show that the proposed system ran without any errors, offered a friendly user interface, and its suggestions matched user expectations in 95.8% of cases. It can also help project managers, systems engineers, systems developers, consultants, and planners in the process of selecting a suitable ISDM. Finally, the results show that the proposed rule-based expert system can facilitate the selection process, especially for new users and non-specialists in the Information Systems field.
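
    The knowledge base of the paper's expert system is not reproduced here, but the general shape of a rule-based selector, matching project characteristics against rule conditions and ranking candidate methodologies by degree of match, can be sketched with a few invented rules.

```python
# Illustrative rules only: the actual knowledge base of the paper's expert
# system is not reproduced here.
RULES = [
    ({"requirements": "stable", "size": "large"},       "Waterfall/SSADM"),
    ({"requirements": "volatile", "size": "small"},     "Agile (Scrum/XP)"),
    ({"requirements": "volatile", "size": "large"},     "Spiral / incremental"),
    ({"user_involvement": "high", "schedule": "tight"}, "RAD / prototyping"),
]

def select_methodology(project):
    """Score every rule by how many of its conditions the project satisfies."""
    scored = []
    for conditions, methodology in RULES:
        hits = sum(project.get(k) == v for k, v in conditions.items())
        scored.append((hits / len(conditions), methodology))
    scored.sort(reverse=True)
    return scored  # best match first, with its degree of match

project = {"requirements": "volatile", "size": "small",
           "user_involvement": "high", "schedule": "tight"}
for score, name in select_methodology(project):
    print(f"{score:.2f}  {name}")
```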

  7. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crew during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies' and (3) identify the root causes of the error. The data collection methodology is summarized in post event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics, which extract generic characteristics of error prone behaviors and error prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures with associated training to minimize or preclude certain human errors. It also helps in design of control rooms, and in assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  8. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  9. The IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors

    International Nuclear Information System (INIS)

    The Reactor Institute Delft was inaugurated in May 2009 as a new IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors. The collaboration involves education, research and development in (i) Production of reactor-produced, no-carrier added radioisotopes of high specific activity via neutron activation; (ii) Neutron activation analysis with emphasis on automation as well as analysis of large samples, and radiotracer techniques; and, as a cross-cutting activity, (iii) Quality assurance and management in research and application of research reactor based techniques and in research reactor operations. (author)

  10. Advanced TEM Characterization for the Development of 28-14nm nodes based on fully-depleted Silicon-on-Insulator Technology

    International Nuclear Information System (INIS)

    The growing demand for wireless multimedia applications (smartphones, tablets, digital cameras) requires the development of devices combining both high speed performances and low power consumption. A recent technological breakthrough making a good compromise between these two antagonist conditions has been proposed: the 28-14nm CMOS transistor generations based on a fully-depleted Silicon-on-Insulator (FD-SOI) performed on a thin Si film of 5-6nm. In this paper, we propose to review the TEM characterization challenges that are essential for the development of extremely power-efficient System on Chip (SoC)

  11. Nonlinear lower hybrid wave depletion

    International Nuclear Information System (INIS)

    Two numerical ray tracing codes with focusing are used to compute lower hybrid daughter wave amplification by quasi-mode parametric decay. The first code, LHPUMP, provides a numerical pump model on a grid. This model is used by a second code, LHFQM, which computes daughter wave amplification inside the pump extent and follows the rays until their energy is absorbed by the plasma. An analytic model is then used to estimate pump depletion based on the numerical results. Results for PLT indicate strong pump depletion at the plasma edge in high density operation for the 800 MHz wave frequency, but weak depletion for the 2.45 GHz experiment. This is proposed to be the mechanism responsible for the high density limit for current drive as well as for the difficulty in heating ions.

  12. 5.0. Depletion, activation, and spent fuel source terms

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation and relies on fundamental cross section and decay data in ENDF/B VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN—as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.
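
    The governing equation behind such depletion/decay solvers is the coupled first-order system dN/dt = A N, where the matrix A collects decay constants and flux-weighted reaction rates. The sketch below solves a deliberately tiny, made-up three-nuclide chain with a matrix exponential; ORIGEN's actual solver, its nuclide set of thousands of species and the ENDF/B data are far beyond this illustration.

```python
import numpy as np
from scipy.linalg import expm

# Tiny illustrative chain (not ORIGEN data): constant one-group flux phi,
# U-235 is removed by fission+capture, capture feeds U-236, and fission feeds a
# lumped fission product that decays with an assumed half-life.
phi = 3.0e14                                # n/cm^2/s
barn = 1.0e-24
sig_f_235, sig_c_235 = 500.0 * barn, 90.0 * barn
sig_a_236 = 10.0 * barn
lam_fp = np.log(2.0) / (8.0 * 24 * 3600)    # ~8 d half-life
yield_fp = 1.0                              # one lumped FP per fission

# State vector N = [U-235, U-236, FP]; transition matrix A such that dN/dt = A @ N
A = np.array([
    [-(sig_f_235 + sig_c_235) * phi, 0.0,               0.0    ],
    [  sig_c_235 * phi,             -sig_a_236 * phi,   0.0    ],
    [  yield_fp * sig_f_235 * phi,   0.0,              -lam_fp ],
])

N0 = np.array([1.0e21, 0.0, 0.0])           # initial atom densities (atoms/cm^3)
for days in (0, 30, 90, 180):
    N = expm(A * days * 86400.0) @ N0       # formal solution N(t) = exp(A t) N0
    print(f"{days:4d} d  U-235 {N[0]:.3e}  U-236 {N[1]:.3e}  FP {N[2]:.3e}")
```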

  13. Visualization of stratospheric ozone depletion and the polar vortex

    Science.gov (United States)

    Treinish, Lloyd A.

    1995-01-01

    Direct analysis of spacecraft observations of stratospheric ozone yields information about the morphology of annual austral depletion. Visual correlation of ozone with other atmospheric data illustrates the diurnal dynamics of the polar vortex and contributions from the upper troposphere, including the formation and breakup of the depletion region each spring. These data require care in their presentation to minimize the introduction of visualization artifacts that are erroneously interpreted as data features. Non geographically registered data of differing mesh structures can be visually correlated via cartographic warping of base geometries without interpolation. Because this approach is independent of the realization technique, it provides a framework for experimenting with many visualization strategies. This methodology preserves the fidelity of the original data sets in a coordinate system suitable for three-dimensional, dynamic examination of atmospheric phenomena.

  14. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.

  15. Efficient Finite Element Methodology Based on Cartesian Grids: Application to Structural Shape Optimization

    Directory of Open Access Journals (Sweden)

    E. Nadal

    2013-01-01

    This work presents an analysis methodology based on the use of the Finite Element Method (FEM), nowadays considered one of the main numerical tools for solving Boundary Value Problems (BVPs). The proposed methodology, so-called cg-FEM (Cartesian grid FEM), has been implemented for fast and accurate numerical analysis of 2D linear elasticity problems. The traditional FEM uses geometry-conforming meshes; however, in cg-FEM the analysis mesh is not conformal to the geometry. This allows for defining very efficient mesh generation techniques and using a robust integration procedure to accurately integrate the domain's geometry. The hierarchical data structure used in cg-FEM together with the Cartesian meshes allows for trivial data sharing between similar entities. The cg-FEM methodology uses advanced recovery techniques to obtain an improved solution of the displacement and stress fields (for which a discretization error estimator in the energy norm is available), which will be the output of the analysis. All this results in a substantial increase in accuracy and computational efficiency with respect to the standard FEM. cg-FEM has been applied in structural shape optimization showing robustness and computational efficiency in comparison with FEM solutions obtained with a commercial code, despite the fact that cg-FEM has been fully implemented in MATLAB.

  16. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Digital signal processing algorithms are of big importance in many embedded systems. Due to complexity reasons and due to the restrictions imposed on the implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We will use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform will demonstrate the effectiveness of our approach.

  17. Modal macro-strain vector based damage detection methodology with long-gauge FBG sensors

    Science.gov (United States)

    Xu, Bin; Liu, Chongwu W.; Masri, Sami F.

    2009-07-01

    Advances in optic fiber sensing technology provide an easy and reliable way for vibration-based strain measurement of engineering structures. As a typical optic fiber sensing technique with high accuracy and resolution, long-gauge Fiber Bragg Grating (FBG) sensors have been widely employed in health monitoring of civil engineering structures. Therefore, the development of macro-strain-based identification methods is crucial for damage detection and structural condition evaluation. In a previous study by the authors, a damage detection algorithm for a beam structure making direct use of vibration-based macro-strain measurement time histories with neural networks had been proposed and validated with experimental measurements. In this paper, a damage locating and quantifying method is proposed using modal macro-strain vectors (MMSVs), which can be extracted from vibration-induced macro-strain response measurements from long-gauge FBG sensors. The performance of the proposed methodology for damage detection of a beam with different damage scenarios was first studied with numerical simulation. Then, dynamic tests on a simply supported steel beam with different damage scenarios were carried out and macro-strain measurements were employed to detect the damage severity. Results show that the proposed MMSV-based structural identification and damage detection methodology can locate and identify structural damage severity with acceptable accuracy.

  18. Systematic Analysis of an IEED Unit Based in a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Jesus Carlos Pedraza Ortega

    2011-01-01

    In the field of modeling and simulation of highly complex mechatronic designs, a systematic analysis of an IEDD (improvised explosive device disposal) unit [1] is presented, based on a new modeling and simulation methodology divided into six stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator (MU-NH-WMM), formed by a differential-traction base and a manipulator arm with 4 degrees of freedom mounted on the wheeled mobile base. The contribution of this work is a novel methodology based on a practical proposal of the mechatronics design philosophy, which establishes the kinematics suitable for the coupled wheeled mobile manipulator; the motion equations and kinematic transformations are the basis of the specific stages used to obtain the dynamics of the coupled system, validating its behavior and trajectory tracking. The complex tasks are approaching the work area and appropriately handling the explosive device; this work focuses on the first of them, such that errors in the model can be detected and later confined by the proposed control.

  19. 77 FR 70451 - Report of the Evidence-Based Methodology Workshop on Polycystic Ovary Syndrome-Request for Comments

    Science.gov (United States)

    2012-11-26

    ... HUMAN SERVICES National Institutes of Health Report of the Evidence-Based Methodology Workshop on... Methodology Workshop on Polycystic Ovary Syndrome, to be held December 3-5, 2012. The purpose of the report... held a conference on PCOS to create both a working definition of the disorder and diagnostic...

  20. Depleted uranium in Japan

    International Nuclear Information System (INIS)

    In Japan, depleted uranium ammunition is regarded as nuclear weapons and meets with fierce opposition. The fact that US Marines mistakenly fired bullets containing depleted uranium on an island off Okinawa during training exercises in December 1995 and January 1996, also contributes. The overall situation in this area in Japan is outlined. (P.A.)

  1. Towards A Model-Based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Science.gov (United States)

    Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
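
    A minimal sketch of this kind of Kalman-filter prognostic loop is shown below: a linear degradation state (capacitance loss and its rate) is tracked from noisy measurements and then extrapolated to a failure threshold to obtain a remaining-useful-life estimate. The linear loss model, the 20% failure threshold and all noise settings are illustrative assumptions, not the paper's empirical degradation model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Degradation stand-in: percentage capacitance loss grows roughly linearly
# under electrical overstress; failure is declared at 20 % loss.
dt, threshold = 1.0, 20.0                 # time step (aging hours), % loss
true_rate = 0.05                          # % loss per hour (illustrative)
t = np.arange(0, 250, dt)
true_loss = true_rate * t
meas = true_loss + 0.5 * rng.standard_normal(t.size)

# Linear Kalman filter, state x = [loss, rate]
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-rate degradation model
H = np.array([[1.0, 0.0]])                # only the loss is measured
Q = np.diag([1e-4, 1e-6])                 # process noise
R = np.array([[0.25]])                    # measurement noise variance
x = np.array([0.0, 0.0])
P = np.eye(2)

for z in meas:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (np.array([z]) - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

# Remaining useful life: extrapolate the filtered state to the failure threshold
loss, rate = x
rul = (threshold - loss) / rate if rate > 0 else np.inf
print(f"filtered loss {loss:.2f} %, rate {rate:.4f} %/h, predicted RUL about {rul:.0f} h")
```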

  2. Optical fibre-based methodology for screening the effect of probiotic bacteria on conjugated linoleic acid (CLA) in curdled milk

    OpenAIRE

    Silva, Lurdes I. B.; Rodrigues, Dina M.; Freitas, Ana C.; Gomes, Ana M; Teresa A P Rocha-Santos; Pereira, M. E.; Duarte, A.C.

    2011-01-01

    A methodology based on optical fibre (OF) detection was developed for screening the potential of CLA production by Lactobacillus casei-01, Lactobacillus acidophilus La-5 and Bifidobacterium lactis B94 in probiotic curdled milk. The OF based methodology was validated by comparison with an analytical method based on gas chromatography–mass spectrometry (GC–MS) and it showed comparable linearity (between 5 and 130 µg), accuracy and detection limits, which ranged from 1.92 to 2.56 µg ...

  3. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that particular RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing the risks associated with operating pressure equipment. This paper shows how API RBI assesses the potential consequences of a failure of the pressure boundary, as well as the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner so that the owner-user can plan an inspection program that focuses more resources on the higher-risk equipment, while possibly saving inspection resources where they are not doing an effective job of reducing risk. At the same time, if consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API Risk-Based Inspection is all about, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety risk. (Author)
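
    The core of any RBI ranking is the combination of likelihood and consequence into a risk measure used to prioritize inspection. The sketch below shows that idea with a simple qualitative scoring scheme; the category labels, scores and thresholds are illustrative assumptions, not the API risk matrix.

```python
# Illustrative sketch of the basic RBI ranking idea (risk = likelihood x consequence).
POF_SCORES = {"low": 1, "medium": 2, "high": 3}          # probability of failure
COF_SCORES = {"minor": 1, "moderate": 2, "severe": 3}    # consequence of failure

def risk_rank(equipment):
    """Rank equipment items by a simple qualitative risk score."""
    ranked = []
    for name, pof, cof in equipment:
        score = POF_SCORES[pof] * COF_SCORES[cof]
        level = "HIGH" if score >= 6 else "MEDIUM" if score >= 3 else "LOW"
        ranked.append((score, level, name))
    return sorted(ranked, reverse=True)

items = [("column C-101", "high", "severe"),
         ("drum D-205", "low", "moderate"),
         ("exchanger E-310", "medium", "severe")]
for score, level, name in risk_rank(items):
    print(f"{name:20s} risk={score} ({level})")
```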

  4. A segmental hidden semi-Markov model (HSMM)-based diagnostics and prognostics framework and methodology

    Science.gov (United States)

    Dong, Ming; He, David

    2007-07-01

    Diagnostics and prognostics are two important aspects in a condition-based maintenance (CBM) program. However, these two tasks are often performed separately. For example, data might be collected and analysed separately for diagnosis and prognosis. This practice increases the cost and reduces the efficiency of CBM and may affect the accuracy of the diagnostic and prognostic results. In this paper, a statistical modelling methodology for performing both diagnosis and prognosis in a unified framework is presented. The methodology is developed based on segmental hidden semi-Markov models (HSMMs). An HSMM is a hidden Markov model (HMM) with temporal structures. Unlike an HMM, an HSMM does not follow the unrealistic Markov chain assumption and therefore provides more powerful modelling and analysis capability for real problems. In addition, an HSMM allows modelling of the time duration of the hidden states and is therefore capable of prognosis. To facilitate the computation in the proposed HSMM-based diagnostics and prognostics, new forward-backward variables are defined and a modified forward-backward algorithm is developed. The existing state duration estimation methods are inefficient because they require large storage and computational loads. Therefore, a new approach is proposed for training HSMMs in which state duration probabilities are estimated on the lattice (or trellis) of observations and states. The model parameters are estimated through the modified forward-backward training algorithm. The estimated state duration probability distributions combined with state-changing point detection can be used to predict the useful remaining life of a system. The evaluation of the proposed methodology was carried out through a real world application: health monitoring of hydraulic pumps. In the tests, the recognition rates for all states are greater than 96%. For each individual pump, the recognition rate is increased by 29.3% in comparison with HMMs. Because of the temporal
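
    A minimal sketch of the duration-based prognosis idea follows: once the health states and their estimated duration distributions are known, the expected remaining life is the expected residual time in the current state plus the expected durations of the states still to come. The state names, mean durations and elapsed time below are illustrative assumptions; a real HSMM would learn these quantities with the modified forward-backward training described above.

```python
# Duration-based remaining-life estimation in an HSMM-style health model (illustrative).
state_order = ["normal", "degraded", "faulty"]                        # states visited in order
mean_duration = {"normal": 200.0, "degraded": 80.0, "faulty": 30.0}   # hours (assumed)

def expected_rul(current_state, time_in_state):
    """Expected remaining life = expected residual time in the current state
    plus the expected durations of all states still to come."""
    idx = state_order.index(current_state)
    residual = max(mean_duration[current_state] - time_in_state, 0.0)
    future = sum(mean_duration[s] for s in state_order[idx + 1:])
    return residual + future

print(f"RUL: {expected_rul('degraded', 25.0):.0f} h")   # 55 h left in state + 30 h = 85 h
```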

  5. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    Science.gov (United States)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between the energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. That tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology proposed is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI
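
    The variable-ranking step can be sketched as below, assuming a CBECS-like table of building attributes and energy use intensities: a random forest is fitted and its feature importances are listed. The variable names, the synthetic data and the stand-in EUI relationship are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Rank building variables by influence on EUI with a random forest (illustrative data).
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "floor_area":   rng.uniform(1_000, 100_000, n),
    "num_workers":  rng.integers(5, 500, n),
    "num_pcs":      rng.integers(5, 600, n),
    "cooling_type": rng.integers(0, 3, n),      # encoded categorical
})
# Synthetic EUI (kBtu/ft2) loosely driven by occupancy and equipment density
df["eui"] = (40 + 0.08 * df["num_workers"] + 0.05 * df["num_pcs"]
             + 5 * df["cooling_type"] + rng.normal(0, 5, n))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(df.drop(columns="eui"), df["eui"])
for name, imp in sorted(zip(df.columns[:-1], model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:12s} importance={imp:.2f}")
```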

  6. Verification of the COCAGNE core code using cluster depletion calculations

    International Nuclear Information System (INIS)

    EDF R&D is developing a new calculation scheme based on the transport-Simplified Pn (SPn) approach. The lattice code used is the deterministic code APOLLO2, developed at CEA with the support of EDF and AREVA-NP. The core code is COCAGNE, developed at EDF R&D. The latter can take advantage of a microscopic depletion solver which improves the treatment of spectral history effects. This solver can resort to a specific correction based on the use of the Pu239 concentration as a spectral indicator. In order to evaluate the improvements brought by this Pu239 correction model, (3x3 assemblies) cluster depletion calculations are used as test cases. UOX and UOX/MOX clusters are both considered. As a reference, APOLLO2 depletion calculations of these clusters, using a critical boron (CB) search scheme at each calculation step, are performed. This choice of methodology (using CB search instead of a fixed average CB) makes it possible to highlight historical spectral effects related to the boron concentration. This methodology is also more consistent with the depletion calculation of real cores. Pin-by-pin COCAGNE calculations are performed and compared with the APOLLO2 results. The analysis of the results obtained shows that the boron concentration computed by COCAGNE becomes more consistent with APOLLO2 when the Pu239 corrector is used, especially for UOX/MOX clusters. As for the pin power distribution, the use of the Pu239 model also slightly reduces the gap between APOLLO2 and COCAGNE. This work will be extended to clusters with gadolinium-poisoned fuel assemblies and reflector regions. (author)
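
    Although the cluster calculations above rely on full lattice and core codes, the underlying depletion step can be illustrated with the simplest possible chain: U-238 capture feeding Pu-239 (the spectral indicator mentioned above), solved analytically with the Bateman solution under a constant one-group flux. This is only a didactic sketch, not the APOLLO2/COCAGNE scheme; cross sections, flux and initial densities are round illustrative numbers, and the intermediate nuclides (U-239, Np-239) are ignored.

```python
import numpy as np

# Two-nuclide depletion step (U-238 capture feeding Pu-239), analytic Bateman solution.
barn = 1e-24                      # cm^2
phi = 3e14                        # n/cm^2/s, assumed constant over the step
sig_a_u8, sig_c_u8 = 2.7 * barn, 2.7 * barn   # absorption taken equal to capture for U-238
sig_a_pu9 = 1000.0 * barn

def pu239_buildup(n_u8_0, n_pu9_0, t):
    """Atom densities after an irradiation time t (s) at constant one-group flux."""
    lam1 = sig_a_u8 * phi          # U-238 removal rate
    lam_c = sig_c_u8 * phi         # U-238 capture (Pu-239 production) rate
    lam2 = sig_a_pu9 * phi         # Pu-239 removal rate
    n_u8 = n_u8_0 * np.exp(-lam1 * t)
    n_pu9 = (lam_c * n_u8_0 / (lam2 - lam1)
             * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
             + n_pu9_0 * np.exp(-lam2 * t))
    return n_u8, n_pu9

n_u8, n_pu9 = pu239_buildup(2.2e22, 0.0, 3600 * 24 * 365)   # one year of irradiation
print(f"U-238: {n_u8:.3e}  Pu-239: {n_pu9:.3e} atoms/cm^3")
```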

  7. Management of depleted uranium

    International Nuclear Information System (INIS)

    Large stocks of depleted uranium have arisen as a result of enrichment operations, especially in the United States and the Russian Federation. Countries with depleted uranium stocks are interested in assessing strategies for the use and management of depleted uranium. The choice of strategy depends on several factors, including government and business policy, alternative uses available, the economic value of the material, regulatory aspects and disposal options, and international market developments in the nuclear fuel cycle. This report presents the results of a depleted uranium study conducted by an expert group organised jointly by the OECD Nuclear Energy Agency and the International Atomic Energy Agency. It contains information on current inventories of depleted uranium, potential future arisings, long term management alternatives, peaceful use options and country programmes. In addition, it explores ideas for international collaboration and identifies key issues for governments and policy makers to consider. (authors)

  8. New methodology to determine air quality in urban areas based on runs rules for functional data

    Science.gov (United States)

    Sancho, J.; Martínez, J.; Pastor, J. J.; Taboada, J.; Piñeiro, J. I.; García-Nieto, P. J.

    2014-02-01

    Functional data appear in a multitude of industrial applications and processes. However, in many cases at present, such data continue to be studied from the conventional standpoint based on Statistical Process Control (SPC), losing the capacity to analyse different aspects over time. In this study, the well-known runs rules for Shewhart-type control charts are adapted to the case of functional data. In the application of this functional approach, a number of advantages over the classical one are described. Furthermore, the results of applying this new methodology are analysed to determine the air quality of urban areas from the gas emissions at different weather stations.
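
    For scalar observations, a classical runs rule flags a sequence of consecutive points falling on the same side of the centre line; the sketch below implements that check. In the functional setting of the paper the same logic would be applied to curve summaries (for example, depths or projections) rather than to raw scalars. The run length, centre line and synthetic series are illustrative assumptions.

```python
import numpy as np

# One classical Shewhart runs rule: flag `run_length` consecutive points on one side
# of the centre line (illustrative scalar version).
def runs_rule_violations(x, center, run_length=8):
    """Return indices where run_length consecutive points fall on one side of center."""
    side = np.sign(np.asarray(x) - center)
    hits, run = [], 1
    for i in range(1, len(side)):
        run = run + 1 if side[i] == side[i - 1] and side[i] != 0 else 1
        if run >= run_length:
            hits.append(i)
    return hits

rng = np.random.default_rng(2)
series = rng.normal(0.0, 1.0, 40).tolist() + rng.normal(3.0, 1.0, 12).tolist()
print("Violations at indices:", runs_rule_violations(series, center=0.0))
```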

  9. Technology-based risk calculation methodology in coastal container liner shipping

    OpenAIRE

    Bukša, Juraj; Frančić, Vlado; Bukša, Tomislav

    2015-01-01

    The methodology of business and technology risk evaluation and management in shipping is based on three key factors: the voyage duration, the detected spots of technological differences and the spots of consequence costs. The lowest costs of a vessel on a voyage or on a segment of a voyage are considered to be the optimal costs of a certain vessel on the voyage or on the segment of the voyage. Each cost that arises on a voyage or on a segment of a voyage which is higher than the lowest rec...

  10. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.

  11. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on the meta-programming-based generative learning objects (the latter with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLO and smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist a

  12. Location of Bioelectricity Plants in the Madrid Community Based on Triticale Crop: A Multicriteria Methodology

    Directory of Open Access Journals (Sweden)

    L. Romero

    2015-01-01

    Full Text Available This paper presents a work whose objective is, first, to quantify the potential of the triticale biomass existing in each of the agricultural regions in the Madrid Community through a crop simulation model based on regression techniques and multiple correlation. Second, a methodology for defining which area has the best conditions for the installation of electricity plants from biomass has been described and applied. The study used a methodology based on compromise programming in a discrete multicriteria decision method (MDM) context. To make a ranking, the following criteria were taken into account: biomass potential, electric power infrastructure, road networks, protected spaces, and urban nuclei surfaces. The results indicate that, in the case of the Madrid Community, the Campiña region is the most suitable for setting up plants powered by biomass. A minimum of 17,339.9 tons of triticale will be needed to satisfy the requirements of a 2.2 MW power plant. The minimum range of action for obtaining the necessary biomass in the Campiña region would be 6.6 km around the municipality of Algete, based on Geographic Information Systems. The total biomass which could be made available within this range in this region would be 18,430.68 t.
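
    A compromise-programming ranking essentially measures each alternative's weighted Lp distance from the ideal point over normalised criteria, and picks the closest. The sketch below illustrates that calculation for a few hypothetical areas; the criterion order, weights and site values are assumptions for illustration, not the study's data.

```python
import numpy as np

# Compromise programming: rank candidate areas by weighted Lp distance to the ideal point.
# Criteria (assumed order): biomass potential (t), grid distance (km), road density, protected area (%)
maximize = [True, False, True, False]              # whether larger values are better
weights = np.array([0.4, 0.2, 0.2, 0.2])

sites = {
    "Region A": [18000.0,  4.0, 0.8,  2.0],
    "Region B": [12000.0,  9.0, 0.6,  5.0],
    "Region C": [ 8000.0, 15.0, 0.4, 20.0],
}

def rank_sites(sites, p=2):
    mat = np.array(list(sites.values()), dtype=float)
    norm = np.empty_like(mat)
    for j, better_high in enumerate(maximize):
        lo, hi = mat[:, j].min(), mat[:, j].max()
        norm[:, j] = (mat[:, j] - lo) / (hi - lo)       # 0..1 after normalisation
        if not better_high:
            norm[:, j] = 1.0 - norm[:, j]               # flip so 1 is always best
    dist = (weights * (1.0 - norm) ** p).sum(axis=1) ** (1.0 / p)
    return sorted(zip(dist, sites))                     # smallest distance ranks first

for d, name in rank_sites(sites):
    print(f"{name}: distance-to-ideal = {d:.3f}")
```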

  13. Water Depletion Threatens Agriculture

    Science.gov (United States)

    Brauman, K. A.; Richter, B. D.; Postel, S.; Floerke, M.; Malsy, M.

    2014-12-01

    Irrigated agriculture is the human activity that has by far the largest impact on water, constituting 85% of global water consumption and 67% of global water withdrawals. Much of this water use occurs in places where water depletion, the ratio of water consumption to water availability, exceeds 75% for at least one month of the year. Although only 17% of global watershed area experiences depletion at this level or more, nearly 30% of total cropland and 60% of irrigated cropland are found in these depleted watersheds. Staple crops are particularly at risk, with 75% of global irrigated wheat production and 65% of irrigated maize production found in watersheds that are at least seasonally depleted. Of importance to textile production, 75% of cotton production occurs in the same watersheds. For crop production in depleted watersheds, we find that one half to two-thirds of production occurs in watersheds that have not just seasonal but annual water shortages, suggesting that re-distributing water supply over the course of the year cannot be an effective solution to shortage. We explore the degree to which irrigated production in depleted watersheds reflects limitations in supply, a byproduct of the need for irrigation in perennially or seasonally dry landscapes, and identify heavy irrigation consumption that leads to watershed depletion in more humid climates. For watersheds that are not depleted, we evaluate the potential impact of an increase in irrigated production. Finally, we evaluate the benefits of irrigated agriculture in depleted and non-depleted watersheds, quantifying the fraction of irrigated production going to food production, animal feed, and biofuels.

  14. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    based on a combination of criteria such as stability, toxicity, and their environmental impacts. All best ILs were used as entrainers, and an extractive distillation column (EDC) and ionic liquid recovery column were designed and simulated with a process simulator to determine the overall energy...... consumption of the ILs-based separation processes. Among all candidates, the best IL was selected based on the minimum energy requirement obtained from the simulation. Finally, the modification of the separation process to obtain design flexibility for other azeotropic series with respect to the change in...... size of the target solute was investigated using the same separation process and IL entrainer to obtain the same product purity. The proposed methodology has been evaluated through a case study of binary alcoholic aqueous azeotropic separation: water+ethanol and water+isopropanol....

  15. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    International Nuclear Information System (INIS)

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.

  16. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
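
    For the thinning mechanisms mentioned above, a common simplification is to derive the next inspection date from the remaining corrosion allowance and a risk-informed fraction of the remaining life. The sketch below shows that calculation only as an illustration of the idea; the thicknesses, corrosion rate, interval factor and cap are assumptions, not the DNV/API model parameters.

```python
# Thinning-based sketch: remaining life from the corrosion rate, with a risk-informed
# cap on the interval to the next inspection (illustrative values throughout).
def next_inspection_interval(thickness_mm, min_required_mm, corrosion_rate_mm_yr,
                             risk_factor=0.5, max_interval_yr=10.0):
    """Return years to next inspection: a fraction of remaining life, capped."""
    if corrosion_rate_mm_yr <= 0:
        return max_interval_yr
    remaining_life = (thickness_mm - min_required_mm) / corrosion_rate_mm_yr
    return max(min(risk_factor * remaining_life, max_interval_yr), 0.0)

print(next_inspection_interval(thickness_mm=7.2, min_required_mm=4.0,
                               corrosion_rate_mm_yr=0.25))   # -> 6.4 years
```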

  17. A proposal on teaching methodology: cooperative learning by peer tutoring based on the case method

    Science.gov (United States)

    Pozo, Antonio M.; Durbán, Juan J.; Salas, Carlos; del Mar Lázaro, M.

    2014-07-01

    The European Higher Education Area (EHEA) proposes substantial changes in the teaching-learning model, moving from a model based mainly on the activity of teachers to a model in which the true protagonist is the student. This new framework requires that students develop new abilities and acquire specific skills. This also implies that the teacher should incorporate new methodologies in class. In this work, we present a proposal on teaching methodology based on cooperative learning and peer tutoring by case study. A noteworthy aspect of the case-study method is that it presents situations that can occur in real life. Therefore, students can acquire certain skills that will be useful in their future professional practice. An innovative aspect in the teaching methodology that we propose is to form work groups consisting of students from different levels in the same major. In our case, the teaching of four subjects would be involved: one subject of the 4th year, one subject of the 3rd year, and two subjects of the 2nd year of the Degree in Optics and Optometry of the University of Granada, Spain. Each work group would consist of a professor and a student of the 4th year, a professor and a student of the 3rd year, and two professors and two students of the 2nd year. Each work group would have a tutoring process from each professor for the corresponding student, and a 4th-year student providing peer tutoring for the students of the 2nd and 3rd year.

  18. Optimization of cocoa nib roasting based on sensory properties and colour using response surface methodology

    Directory of Open Access Journals (Sweden)

    D.M.H. A.H. Farah

    2012-05-01

    Full Text Available Roasting of cocoa beans is a critical stage for the development of their desirable flavour, aroma and colour. Prior to roasting, cocoa beans may taste astringent, bitter, acidy, musty, unclean, nutty or even chocolate-like, depending on the bean sources and their preparation. After roasting, the beans possess a typical intense cocoa flavour. The Maillard, or non-enzymatic browning, reaction is a very important process for the development of cocoa flavour, which occurs primarily during the roasting process, and it has generally been agreed that the formation of the main flavour components, pyrazines, is associated with this reaction involving amino acids and reducing sugars. The effect of cocoa nib roasting conditions on the sensory properties and colour of cocoa beans was investigated in this study. Roasting conditions in terms of temperature (ranging from 110 to 160 °C) and time (ranging from 15 to 40 min) were optimized using Response Surface Methodology based on the cocoa sensory characteristics, including chocolate aroma, acidity, astringency, burnt taste and overall acceptability. The analyses used a 9-point hedonic scale with twelve trained panelists. The changes in colour due to the roasting conditions were also monitored using a chromameter. The results of this study showed that the sensory quality of cocoa liquor increased with increasing roasting time and temperature, up to 160 °C and 40 min respectively. Based on the Response Surface Methodology, the optimised operating condition for the roaster was a temperature of 127 °C and a time of 25 min. The proposed roasting conditions were able to produce superior quality cocoa beans that will be very useful for cocoa manufacturers. Keywords: cocoa, cocoa liquor, flavour, aroma, colour, sensory characteristic, response surface methodology.
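
    The response-surface step can be sketched as follows: fit a full second-order model of a sensory score against roasting temperature and time, then locate the predicted optimum over the experimental region. The panel scores below are synthetic stand-ins generated to peak near the reported optimum; the coefficients and noise level are assumptions, not the study's measurements.

```python
import numpy as np

# Fit a quadratic response surface for sensory score vs. roasting temperature and time.
rng = np.random.default_rng(3)
T = rng.uniform(110, 160, 30)          # roasting temperature, deg C
t = rng.uniform(15, 40, 30)            # roasting time, min
# Synthetic "overall acceptability" peaking near 127 degC / 25 min
score = 8 - 0.002 * (T - 127) ** 2 - 0.01 * (t - 25) ** 2 + rng.normal(0, 0.2, 30)

# Design matrix for the full second-order model
X = np.column_stack([np.ones_like(T), T, t, T * t, T ** 2, t ** 2])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)

# Locate the predicted optimum on a fine grid over the experimental region
Tg, tg = np.meshgrid(np.linspace(110, 160, 101), np.linspace(15, 40, 101))
pred = (coef[0] + coef[1] * Tg + coef[2] * tg + coef[3] * Tg * tg
        + coef[4] * Tg ** 2 + coef[5] * tg ** 2)
i, j = np.unravel_index(np.argmax(pred), pred.shape)
print(f"Predicted optimum: {Tg[i, j]:.0f} degC, {tg[i, j]:.0f} min")
```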

  19. Flow methodology for methanol determination in biodiesel exploiting membrane-based extraction

    International Nuclear Information System (INIS)

    A methodology based on flow analysis and membrane-based extraction has been applied to the determination of methanol in biodiesel samples. A hydrophilic membrane was used to perform the liquid-liquid extraction in the system, with the organic sample fed to the donor side of the membrane and the methanol transferred to an aqueous acceptor buffer solution. The quantification of the methanol was then achieved in aqueous solution by the combined use of immobilised alcohol oxidase (AOD), soluble peroxidase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS). The optimization of parameters such as the type of membrane, the groove volume and configuration of the membrane unit, the appropriate organic solvent, the sample injection volume, as well as the immobilised packed AOD reactor, was performed. Two dynamic analytical working ranges were achieved, up to 0.015% and up to 0.200% (m/m) methanol concentrations, just by changing the volume of acceptor aqueous solution. Detection limits of 0.0002% (m/m) and 0.007% (m/m) methanol were estimated, respectively. The decision limit (CCα) and the detection capacity (CCβ) were 0.206 and 0.211% (m/m), respectively. The developed methodology showed good precision, with a relative standard deviation (R.S.D.) <5.0% (n = 10). Biodiesel samples from different sources were then directly analyzed without any sample pre-treatment. Statistical evaluation showed good compliance, at the 95% confidence level, between the results obtained with the flow system and those furnished by the gas chromatography reference method. The proposed methodology turns out to be more environmentally friendly and cost-effective than the reference method

  20. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym. PMID:23012544

  1. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  2. Recursion-based depletion of human immunodeficiency virus-specific naive CD4(+) T cells may facilitate persistent viral replication and chronic viraemia leading to acquired immunodeficiency syndrome.

    Science.gov (United States)

    Tsukamoto, Tetsuo; Yamamoto, Hiroyuki; Okada, Seiji; Matano, Tetsuro

    2016-09-01

    Although antiretroviral therapy has made human immunodeficiency virus (HIV) infection a controllable disease, it is still unclear how viral replication persists in untreated patients and causes CD4(+) T-cell depletion leading to acquired immunodeficiency syndrome (AIDS) within several years. Theorists have tried to explain it with the diversity threshold theory, in which accumulated mutations in the HIV genome make the virus so diverse that the immune system is no longer able to recognize all the variants and fails to control the viraemia. Although the theory could apply to a number of cases, macaque AIDS models using simian immunodeficiency virus (SIV) have shown that failed viral control at the set point is not always associated with T-cell escape mutations. Moreover, even monkeys without a protective major histocompatibility complex (MHC) allele can contain replication of a superinfected SIV following immunization with a live-attenuated SIV vaccine, while those animals are not capable of fighting primary SIV infection. Here we propose a recursion-based virus-specific naive CD4(+) T-cell depletion hypothesis by considering what may happen in individuals experiencing primary immunodeficiency virus infection. This could explain the mechanism for impairment of the virus-specific immune response in the course of HIV infection. PMID:27515208

  3. GIS-based decision support methodology for the assessment of the impacts of mining

    Energy Technology Data Exchange (ETDEWEB)

    Daniel Palamara; Ernest Baafi; Phil Flentje [University of Wollongong, NSW (Australia)

    2007-05-15

    The objectives of this project were to develop and demonstrate a practical decision support methodology for the assessment of the impacts of mining subsidence on natural features. The decision support tools were developed within the flexibility of the Geographic Information System (GIS) environment, and relevant case studies are used to demonstrate the usefulness of GIS tools. The use of GIS was prompted by the fact that the process of understanding and managing coalmine subsidence impacts is, in large part, a spatial one and that many of the factors that are critical to the assessment of subsidence impacts have a strong spatial component. Features identified using spatial data and GIS (or traditional field-based methods) can be evaluated for potential impact susceptibility based on either knowledge-based or data-driven methodologies. Case studies are presented based on both methods, focussing on the cliffs of the Nepean River near the proposed BHPB Illawarra Coal Douglas mine. The knowledge-based case study employs a number of spatial data layers and the factors identified in the 'Management Information Handbook - The Undermining of Cliffs, Gorges, and River Systems' to produce an assessment of expected cliff impacts. The results of this case study demonstrate the advantages of a digital, spatial approach to impact assessment. The main recommendation to come from this report is that the coal mining industry as a whole should encourage and facilitate the development of subsidence impact databases, consisting of mapped and annotated impacts associated with past and current mining activities. The value of collecting accurate spatial records of subsidence impacts, such as surface and underground fracturing, upsidence, cliff falls and so on, is demonstrated in numerous case studies throughout this report.

  4. Attitudes toward simulation-based learning in nursing students: an application of Q methodology.

    Science.gov (United States)

    Yeun, Eun Ja; Bang, Ho Yoon; Ryoo, Eon Na; Ha, Eun-Ho

    2014-07-01

    Simulation-based learning (SBL) is a highly advanced educational method that promotes technical and non-technical skills, increases team competency, and increases health care team interaction in a safe health care environment with no potential for harm to the patient. Even though students may experience the same simulation, their reactions are not necessarily uniform. This study aims at identifying the diversely perceived attitudes of undergraduate nursing students toward simulation-based learning. The study used a Q-methodology design, which analyzes the subjectivity of each type of attitude. Data were collected from 22 undergraduate nursing students who had experience of simulation-based learning before going to the clinical setting. The 45 selected Q-statements from each of the 22 participants were classified into the shape of a normal distribution using a 9-point scale. The collected data were analyzed using the pc-QUANL program. The results revealed two discrete groups of student attitudes toward simulation-based learning: 'adventurous immersion' and 'constructive criticism'. The findings revealed that teaching and learning strategies based on the two attitude factors could beneficially contribute to the customization of simulation-based learning. In nursing education and clinical practice, teaching and learning strategies based on types I and II can be used to refine an alternative learning approach that supports and complements clinical practice. Recommendations have been provided based on the findings. PMID:24629271

  5. Analysis of Distortional Effects of Taxation on Financial and Investment Decision Based on the Methodology of Effective Tax Rates Calculation

    OpenAIRE

    Jaroslava Holečková

    2012-01-01

    The objective of this paper is to examine the use of effective tax rates on different types of capital assets and sources of financing, and to assess, on the basis of calculated tax wedges, the degree to which taxation affects the incentive to undertake investment. The methodology used to calculate effective tax rates on investments is based on the approach developed by King and Fullerton (1984), which has become the most widely accepted method for calculating effect...

  6. Technology-Enhanced Problem-Based Learning Methodology in Geographically Dispersed Learners of Tshwane University of Technology

    OpenAIRE

    Sibitse M. Tlhapane; Sibongile Simelane

    2010-01-01

    Improving teaching and learning methodologies is not just a wish but a goal actively pursued by most educational institutions globally. To attain this, the Adelaide Tambo School of Nursing Science implemented a Technology-enhanced Problem-Based Learning methodology in the programme B Tech Occupational Nursing in 2006. This is a two-year post-basic nursing programme. The students are geographically dispersed and the curriculum design is typical student-centred outcomes-based education. The researc...

  7. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    Science.gov (United States)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  8. Vision-based methodology for collaborative management of qualitative criteria in design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    2006-01-01

    A Vision-based methodology is proposed as part of the management of qualitative criteria for design in early phases of the product development process for team-based organisations. Focusing on abstract values and qualities for the product establishes a shared vision for the product amongst team...... members. Two anchor points are used for representing these values and qualities, the Value Mission and the Interaction Vision. Qualifying the meaning of these words through triangulation of methods develops a shared mental model within the team. The composition of keywords within the Vision and Mission...... establishes a field of tension that summarises the abstract criteria and pinpoints the desired uniqueness of the product. The Interaction Vision allows the team members to design the behaviour of the product without deciding on physical features, thus focusing on the cognitive aspects of the product as a...

  9. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    Science.gov (United States)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, environmental impacts etc. This requirement can be met by a decision support framework based on near-optimal "satisficing" as opposed to purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as
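
    The satisficing idea behind the framework can be sketched as follows: Monte Carlo sample the design variables, evaluate each design with a performance model, and keep every design that meets all criteria thresholds rather than searching for a single optimum. The design variables, the stand-in surrogate models and the limits below are assumptions for illustration; in the methodology itself the evaluations would come from whole-building energy simulations.

```python
import numpy as np
import pandas as pd

# Monte Carlo sampling of design variables plus a "satisficing" filter (illustrative).
rng = np.random.default_rng(4)
n = 5000
designs = pd.DataFrame({
    "window_to_wall": rng.uniform(0.2, 0.8, n),
    "insulation_R":   rng.uniform(10, 40, n),
    "cop_cooling":    rng.uniform(2.5, 5.0, n),
})
# Stand-in performance models (site energy in kWh/m2/yr, first cost in $/m2)
designs["energy"] = (120 + 60 * designs["window_to_wall"]
                     - 1.5 * designs["insulation_R"] - 8 * designs["cop_cooling"])
designs["cost"] = 800 + 5 * designs["insulation_R"] + 30 * designs["cop_cooling"]

# Keep the whole band of designs meeting preset limits, rather than one optimum
feasible = designs[(designs["energy"] <= 90) & (designs["cost"] <= 1040)]
print(f"{len(feasible)} of {n} sampled designs satisfy both criteria")
print(feasible[["window_to_wall", "insulation_R", "cop_cooling"]].describe()
      .loc[["min", "max"]])
```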

  10. Uses of depleted uranium

    International Nuclear Information System (INIS)

    Depleted uranium is uranium in which the percentage of fissile uranium-235 is less than 0.2% or 0.3%. It usually results from reprocessing of burned nuclear fuel, and it may also be mixed with some other radioactive elements such as uranium-236, uranium-238 and plutonium-239. The attractive features of depleted uranium are its high density, low price and easy availability. These specifications make depleted uranium one of the best materials when objects that are small in size but quite heavy for their size are needed. Uses of depleted uranium have increased in domestic industry as well as in the nuclear industry in the last few years. It is therefore used in many areas, both military and peaceful, such as balancing giant aircraft, ships and missiles, and in the manufacture of some types of extremely hard concrete. (author)

  11. Evaluation methodology based on physical security assessment results: a utility theory approach

    International Nuclear Information System (INIS)

    This report describes an evaluation methodology which aggregates physical security assessment results for nuclear facilities into an overall measure of adequacy. This methodology utilizes utility theory and conforms to a hierarchical structure developed by the NRC. Implementation of the methodology is illustrated by several examples. Recommendations for improvements in the evaluation process are given
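
    The aggregation step can be illustrated with a weighted additive utility over a small hierarchy of security functions: leaf scores are averaged within each function and combined with function weights into an overall adequacy measure. The hierarchy, weights and scores below are illustrative assumptions, not the NRC structure referred to in the report.

```python
# Weighted additive utility over a small, assumed hierarchy of security functions.
hierarchy = {
    "detection": {"weight": 0.4, "scores": {"sensors": 0.8, "assessment": 0.6}},
    "delay":     {"weight": 0.3, "scores": {"barriers": 0.7, "locks": 0.9}},
    "response":  {"weight": 0.3, "scores": {"guards": 0.5, "procedures": 0.7}},
}

def overall_utility(h):
    """Average leaf scores within each function, then combine with function weights."""
    total = 0.0
    for func in h.values():
        sub = sum(func["scores"].values()) / len(func["scores"])   # equal sub-weights
        total += func["weight"] * sub
    return total

print(f"Overall adequacy: {overall_utility(hierarchy):.2f}")   # 0.4*0.7 + 0.3*0.8 + 0.3*0.6
```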

  12. Computationally based methodology for reengineering the high-level waste planning process at SRS

    International Nuclear Information System (INIS)

    The Savannah River Site (SRS) has started processing its legacy of 34 million gallons of high-level radioactive waste into its final disposable form. The SRS high-level waste (HLW) complex consists of 51 waste storage tanks, 3 evaporators, 6 waste treatment operations, and 2 waste disposal facilities. It is estimated that processing wastes to clean up all tanks will take 30+ yr of operation. Integrating all the highly interactive facility operations through the entire life cycle in an optimal fashion, while meeting all the budgetary, regulatory, and operational constraints and priorities, is a complex and challenging planning task. The waste complex operating plan for the entire time span is periodically published as an SRS report. A computationally based integrated methodology has been developed that has streamlined the planning process while showing how to run the operations at economically and operationally optimal conditions. The integrated computational model replaced a host of disconnected spreadsheet calculations and the analysts' trial-and-error solutions using various scenario choices. This paper presents the important features of the integrated computational methodology and highlights the parameters that are core components of the planning process

  13. Vocalist - an international programme for the validation of constraint based methodology in structural integrity

    International Nuclear Information System (INIS)

    The pattern of crack-tip stresses and strains causing plastic flow and fracture in components is different to that in test specimens. This gives rise to the so-called constraint effect. Crack-tip constraint in components is generally lower than in test specimens. Effective toughness is correspondingly higher. The fracture toughness measured on test specimens is thus likely to underestimate that exhibited by cracks in components. A 36-month programme was initiated in October 2000 as part of the Fifth Framework of the European Atomic Energy Community (EURATOM), with the objective of achieving (i) an improved defect assessment methodology for predicting safety margins; (ii) improved lifetime management arguments. The programme VOCALIST (Validation of Constraint Based Methodology in Structural Integrity) is one of a 'cluster' of Fifth Framework projects in the area of Plant Life Management (Nuclear Fission). VOCALIST is also an associated project of NESC (Network for Evaluating Steel Components). The present paper describes the aims and objectives of VOCALIST, its interactions with NESC, and gives details of its various Work Packages. (authors)

  14. Rapid and Robust PCR-Based All-Recombinant Cloning Methodology.

    Science.gov (United States)

    Dubey, Abhishek Anil; Singh, Manika Indrajit; Jain, Vikas

    2016-01-01

    We report here a PCR-based cloning methodology that requires no post-PCR modifications such as restriction digestion and phosphorylation of the amplified DNA. The advantage of the present method is that it yields only recombinant clones, thus eliminating the need for screening. Two DNA amplification reactions are performed by PCR, wherein the first reaction amplifies the gene of interest from a source template, and the second fuses it with the designed expression vector fragments. These vector fragments carry the essential elements that are required for selection of the fusion product. The entire process can be completed in less than 8 hours. Furthermore, ligation of the amplified DNA by a DNA ligase is not required before transformation, although the procedure yields a greater number of colonies upon transformation if ligation is carried out. As a proof of concept, we show the cloning and expression of the GFP, adh, and rho genes. Using GFP production as an example, we further demonstrate that the E. coli T7 express strain can be used directly in our methodology for protein expression immediately after PCR. The expressed protein is without or with a 6xHistidine tag at either terminus, depending upon the chosen vector fragments. We believe that our method will find tremendous use in molecular and structural biology. PMID:27007922

  15. THE METHODOLOGY OF STUDENTS’ SYNERGETIC WORLD OUTLOOK DEVELOPMENT BASED ON THE TRANS-DISCIPLINARY APPROACH

    Directory of Open Access Journals (Sweden)

    Y. A. Solodova

    2015-03-01

    Full Text Available The paper discusses the present stage of the world educational system development influenced by the fast increasing flow of information and knowledge. The situation requires the adequate pedagogical technologies for compressing the learning information; one of them is the transdisciplinary technology based on the synergetic methodology identifying the order parameters and general conformities of organizing the academic content. The trans-disciplinary technologies incorporate the general laws of evolution, Bohr’s principle of complementarity, fundamental concepts of nonlinearity, fractality, actual and potential infinity, etc. As an illustration to the trans-disciplinary approach, the author analyzes the fundamental methodology principles of Aristotle and Newton’s mechanics. The author points out the equal importance of understanding the asymptotic adequacy principle by students of the natural sciences and humanities profiles; implementation of the trans-disciplinary approach being regarded as a way for the fundamental knowledge acquisition and the world outlook development. The research findings are addressed to the higher school academic staff for theoretical and practical applications

  16. THE METHODOLOGY OF STUDENTS’ SYNERGETIC WORLD OUTLOOK DEVELOPMENT BASED ON THE TRANS-DISCIPLINARY APPROACH

    Directory of Open Access Journals (Sweden)

    Y. A. Solodova

    2014-01-01

    Full Text Available The paper discusses the present stage of the world educational system development influenced by the fast increasing flow of information and knowledge. The situation requires the adequate pedagogical technologies for compressing the learning information; one of them is the transdisciplinary technology based on the synergetic methodology identifying the order parameters and general conformities of organizing the academic content. The trans-disciplinary technologies incorporate the general laws of evolution, Bohr’s principle of complementarity, fundamental concepts of nonlinearity, fractality, actual and potential infinity, etc. As an illustration to the trans-disciplinary approach, the author analyzes the fundamental methodology principles of Aristotle and Newton’s mechanics. The author points out the equal importance of understanding the asymptotic adequacy principle by students of the natural sciences and humanities profiles; implementation of the trans-disciplinary approach being regarded as a way for the fundamental knowledge acquisition and the world outlook development. The research findings are addressed to the higher school academic staff for theoretical and practical applications

  17. A methodology for evaluating potential KBS (Knowledge-Based Systems) applications

    Energy Technology Data Exchange (ETDEWEB)

    Melton, R.B.; DeVaney, D.M.; Whiting, M.A.; Laufmann, S.C.

    1989-06-01

    It is often difficult to assess how well Knowledge-Based Systems (KBS) techniques and paradigms may be applied to automating various tasks. This report describes the approach and organization of an assessment procedure that involves two levels of analysis. Level One can be performed by individuals with little technical expertise relative to KBS development, while Level Two is intended to be used by experienced KBS developers. The two levels review four groups of issues: goals, appropriateness, resources, and non-technical considerations. The criteria that are important at each step in the assessment are identified. A qualitative methodology for scoring the task relative to the assessment criteria is provided to allow analysts to make better-informed decisions with regard to the potential effectiveness of applying KBS technology. In addition to this documentation, the assessment methodology has been implemented for personal computer use with the HYPERCARD™ software on a Macintosh™ computer. This interactive mode facilitates small-group analysis of potential KBS applications and permits a non-sequential appraisal with provisions for automated note-keeping and question scoring. The results provide a useful tool for assessing the feasibility of using KBS techniques in performing tasks in support of treaty verification or IC functions. 13 refs., 3 figs.

  18. Assessment of bioenergy potential in Sicily: A GIS-based support methodology

    International Nuclear Information System (INIS)

    A Geographical Information System (GIS) supported methodology has been developed in order to assess the technical and economic potential of biomass exploitation for energy production in Sicily. The methodology was based on the use of agricultural, economic, climatic, and infrastructural data in a GIS. Data about land use, transportation facilities, urban cartography, regional territorial planning, terrain digital model, lithology, climatic types, and civil and industrial users have been stored in the GIS to define potential areas for gathering the residues coming from the pruning of olive groves, vineyards, and other agricultural crops, and to assess biomass available for energy cultivation. Further, it was possible to assess the potential of biodiesel production, supposing the cultivation of rapeseed in arable crop areas. For the biomass used for direct combustion purposes, the economic availability has been assessed assuming a price of the biomass and comparing it with other fuels. This assessment has shown the strong competitiveness of firewood in comparison with traditional fossil fuels when the collection system is implemented in an efficient way. Moreover, the economic potential of biodiesel was assessed considering the on-going financial regime for fuel. At the same time, the study has shown a significant competitiveness of the finished biomass (pellets), and good potential for a long-term development of this market. An important result was the determination of biofuel production potential in Sicily. An outcome of the study was to show the opportunities stemming from the harmonisation of Energy Policy with the Waste Management System and Rural Development Plan. (author)

  19. Ni-based Superalloy Development for VHTR - Methodology Using Design of Experiments and Thermodynamic Calculation

    International Nuclear Information System (INIS)

    The highly efficient generation of electricity and the massive production of hydrogen are possible using a very high temperature gas-cooled reactor (VHTR), one of the generation IV nuclear power plants. The structural material for an intermediate heat exchanger (IHX), among numerous components, should endure temperatures of up to 950 °C during long-term operation. Impurities inevitably introduced into the helium coolant facilitate material degradation by corrosion at high temperature. In this work, to develop novel structural materials for the IHX of a VHTR, a more systematic methodology using design of experiments (DOE) and thermodynamic calculations was proposed. For 32 sets of designs of Ni-Cr-Co-Mo alloys with minor additions of W and Ta, the mass fraction of TCP phases and the mechanical properties were calculated, and finally the chemical composition was optimized for further experimental studies by applying the proposed methodology.
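
    The DOE step can be illustrated by enumerating a two-level full factorial over the alloying additions, which gives 2^5 = 32 candidate compositions, the same number of design sets considered in the study; each row would then be screened with thermodynamic (CALPHAD-type) calculations for TCP-phase fraction before any experimental work. The element ranges below are illustrative assumptions, not the study's composition window.

```python
import itertools

# Two-level full factorial over assumed alloying-addition ranges (wt.%), Ni as balance.
ranges_wt_pct = {"Cr": (18, 24), "Co": (10, 20), "Mo": (6, 10), "W": (0, 4), "Ta": (0, 2)}

designs = []
for levels in itertools.product(*ranges_wt_pct.values()):
    comp = dict(zip(ranges_wt_pct.keys(), levels))
    comp["Ni"] = 100 - sum(comp.values())        # nickel makes up the balance
    designs.append(comp)

print(f"{len(designs)} candidate compositions")   # 2^5 = 32
print(designs[0])
```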

  20. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries was extracted using the Collection Evaluation application. The data was aggregated and filtered to assess how the sample’s titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample’s titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a Power Law distribution (80/20), so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories that were found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help to improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help to link subjective decision making with a scientifically based approach to managing knowledge
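
    The tiering analysis can be illustrated by sorting subject categories by their share of titles and locating the point where the cumulative share reaches 80%. The sketch below does this on a synthetic heavy-tailed distribution standing in for the aggregated WorldCat/Conspectus counts; the distribution parameters and category count are assumptions.

```python
import numpy as np

# Locate the subset of subject categories that holds 80% of the aggregated titles.
rng = np.random.default_rng(5)
counts = np.sort(rng.pareto(1.2, 500) + 1)[::-1]        # heavy-tailed category sizes
share = counts / counts.sum()
cumulative = np.cumsum(share)
k = int(np.searchsorted(cumulative, 0.80)) + 1           # first index reaching 80%
print(f"{k} of {len(counts)} categories ({k / len(counts):.0%}) hold 80% of titles")
```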

  1. Design and Development of a Maintenance Knowledge-Base System Based on Common KADS Methodology

    OpenAIRE

    Arab Maki, Alireza; Shariat Zadeh, Navid

    2010-01-01

    The objective of this thesis is to design and develop a knowledge base model to support the maintenance system structure. The aim of this model is to identify the failure modes, which are the heart of the maintenance system, through functional analysis, and then to serve as a decision support system to define the maintenance tasks and finally to implement a preventive maintenance task. This knowledge base management system is suitable for designing and developing a maintenance system since it encompasses al...

  2. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    Energy Technology Data Exchange (ETDEWEB)

    Becker, N.M. [Los Alamos National Lab., NM (United States); Vanta, E.B. [Wright Laboratory Armament Directorate, Eglin Air Force Base, FL (United States)

    1995-05-01

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980s at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had been transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments.

  3. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    International Nuclear Information System (INIS)

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980's at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments

  4. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Directory of Open Access Journals (Sweden)

    Gautam Biswas

    2012-12-01

    Full Text Available This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter approach in conjunction with an empirical state-based degradation model to predict the degradation of capacitor parameters through the life of the capacitor. Electrolytic capacitors are important components of systems that range from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their critical role in the system, they are good candidates for component level prognostics and health management. Prognostics provides a way to assess remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. This paper proposes an empirical degradation model and discusses experimental results for an accelerated aging test performed on a set of identical capacitors subjected to electrical stress. The data forms the basis for developing the Kalman-filter based remaining life prediction algorithm.
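
    As a rough illustration of the filter-plus-degradation-model idea described above (not the authors' actual model), the following sketch tracks a single degradation variable, here a notional capacitance-loss percentage with an assumed linear drift, using a scalar Kalman filter, and then extrapolates to a failure threshold to obtain a remaining-useful-life estimate. All numbers are invented for illustration.

```python
import numpy as np

dt, drift = 1.0, 0.4        # assumed time step and mean degradation rate (% per step)
q, r = 0.01, 0.25           # process and measurement noise variances (assumed)
threshold = 20.0            # assumed failure threshold: 20 % capacitance loss

x, p = 0.0, 1.0             # state estimate (degradation %) and its variance
true_x = 0.0
rng = np.random.default_rng(0)

for k in range(25):
    # Simulated aging measurement (stand-in for accelerated-aging data).
    true_x += drift + rng.normal(0, np.sqrt(q))
    z = true_x + rng.normal(0, np.sqrt(r))

    # Kalman predict step with the empirical degradation model x_k = x_{k-1} + drift*dt.
    x_pred, p_pred = x + drift * dt, p + q
    # Kalman update step with the new measurement.
    K = p_pred / (p_pred + r)
    x, p = x_pred + K * (z - x_pred), (1 - K) * p_pred

# Remaining useful life: steps until the filtered state reaches the threshold.
rul = max(threshold - x, 0.0) / drift
print(f"filtered degradation = {x:.2f} %, estimated RUL = {rul:.1f} steps")
```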

  5. Depletion analysis on long-term operation of the conceptual Molten Salt Actinide Recycler and Transmuter (MOSART) by using a special sequence based on SCALE6/TRITON

    International Nuclear Information System (INIS)

    Highlights: ► An automatic computation and control sequence has been developed for MSR neutronics and depletion analyses. ► The method was developed based on a series of stepwise SCALE6/TRITON calculations. ► A detailed reexamination of the MOSART operation over 30 years was performed. ► Clean-up scenarios of fission products have a significant impact on the MOSART operation. - Abstract: A special sequence based on SCALE6/TRITON was developed to perform fuel cycle analysis of the Molten Salt Actinide Recycler and Transmuter (MOSART), with emphasis on the simulation of its dynamic refueling and salt reprocessing scheme during long-term operation. MOSART is one of the conceptual designs in the molten salt reactor (MSR) category of the Generation-IV systems. This type of reactor is distinguished by the use of liquid fuel circulating in and out of the core, which offers many unique advantages but complicates the modeling and simulation of core behavior using conventional reactor physics codes. The TRITON control module in SCALE6 can perform reliable depletion and decay analysis for many reactor physics applications due to its problem-dependent cross-section processing and rigorous treatment of neutron transport. In order to accommodate a simulation of on-line refueling and reprocessing scenarios, several in-house programs together with a run script were developed to integrate a series of stepwise TRITON calculations; the result greatly facilitates the neutronics analyses of long-term MSR operation. Using this method, a detailed reexamination of the MOSART operation over 30 years was performed to investigate the neutronic characteristics of the core design, the change of fuel salt composition from start-up to equilibrium, the effects of various salt reprocessing scenarios, the performance of actinide transmutation, and the radiotoxicity reduction
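
    The stepwise coupling idea, running a short depletion interval and then adjusting the salt composition for on-line reprocessing and refueling before the next interval, can be sketched as a simple driver loop. The sketch below is not the SCALE/TRITON interface: `deplete_one_step` is a placeholder standing in for an external transport/depletion run, and the removal fractions and feed vector are purely illustrative.

```python
# Illustrative driver for stepwise depletion with on-line salt reprocessing/refueling.
# 'deplete_one_step' is a placeholder for an external transport/depletion code run.

def deplete_one_step(comp, days):
    """Stand-in for a depletion step: returns an updated composition dict."""
    burned = dict(comp)
    burned["U235"] *= 0.999 ** days          # notional fuel consumption
    burned["FP"] = burned.get("FP", 0.0) + comp["U235"] * (1 - 0.999 ** days)
    return burned

removal_fraction = {"FP": 0.10}              # fission-product fraction removed per step (assumed)
feed_per_step    = {"U235": 0.05}            # make-up fuel added per step, arbitrary units (assumed)

comp = {"U235": 100.0, "FP": 0.0}            # initial salt composition, arbitrary units
step_days, n_steps = 30, 12

for step in range(n_steps):
    comp = deplete_one_step(comp, step_days)               # 1) depletion interval
    for nuc, frac in removal_fraction.items():             # 2) on-line clean-up
        comp[nuc] *= (1.0 - frac)
    for nuc, amount in feed_per_step.items():              # 3) refueling feed
        comp[nuc] = comp.get(nuc, 0.0) + amount

print({k: round(v, 2) for k, v in comp.items()})
```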

  6. Failure detection and isolation methodology based on the sequential analysis and extended Kalman filter technique

    International Nuclear Information System (INIS)

    Nuclear power plant operation relies on an accurate and precise response of the monitoring system in order to assure a safe operational standard during the most predictable operational transients. The signals from the sensors are in general contaminated with noise and random fluctuations, making a precise plant assessment uncertain and thus opening the possibility of erroneous operator decisions or even false alarm actuation. In practice, a noisy environment can even mask a sensor malfunction, so that the plant operational status is misread. In the present work a new failure detection and isolation (FDI) algorithm has been developed based on sequential analysis and extended Kalman filter residue monitoring. The present methodology has been applied both to highly redundant monitoring systems and to non-redundant systems where high signal reliability is required. (C.M.)
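
    A minimal sketch of the residue-monitoring idea follows, here with a scalar (linear) Kalman filter rather than the extended filter used in the work, and with a sequential probability ratio test on the residuals that decides between "no fault" and "sensor bias" hypotheses. Noise levels, the bias magnitude and the thresholds are invented for illustration.

```python
import numpy as np

r, q = 0.04, 0.001                  # measurement / process noise variances (assumed)
bias = 0.5                          # fault hypothesis: residual mean shift (assumed)
A = np.log((1 - 0.01) / 0.01)       # SPRT upper threshold (alpha = beta = 1 %)
B = np.log(0.01 / (1 - 0.01))       # SPRT lower threshold

x, p, llr = 1.0, 1.0, 0.0           # state estimate, its variance, log-likelihood ratio
rng = np.random.default_rng(1)

for k in range(200):
    z = 1.0 + rng.normal(0, np.sqrt(r)) + (bias if k > 100 else 0.0)  # sensor bias appears at k=100

    # Kalman filter (constant-signal model) and its innovation (residual).
    p_pred = p + q
    s = p_pred + r
    residual = z - x
    K = p_pred / s
    x, p = x + K * residual, (1 - K) * p_pred

    # Sequential test on the residual: N(0, s) under H0 vs N(bias, s) under H1.
    llr += (bias * residual - 0.5 * bias**2) / s
    if llr >= A:
        print(f"fault declared at sample {k}")
        break
    llr = max(llr, B)               # hold the statistic at the lower barrier under H0
```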

  7. A Model-Based Methodology for Integrated Design and Operation of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted;

    2015-01-01

    Process intensification is a new approach that has the potential to improve existing processes as well as new process designs to achieve more profitable and sustainable production. However, many issues with respect to their implementation and operation are not clear; for example, the question of operability and controllability. Traditionally, process design and process control are considered as independent problems and are solved sequentially. The process design problem is usually solved to achieve the design objectives, and then the operability and process control issues are identified and analyzed... To illustrate the methodology, production of methyl-tert-butyl-ether (MTBE) using a reactive distillation column (RDC) is considered. Simple graphical design methods that are similar in concept to non-reactive distillation processes are used. The methods are based on the element concept, which is used to translate a ternary...

  8. A Methodology for Assessing Skill-Based Educational Outcomes in a Pharmacy Course.

    Science.gov (United States)

    Alston, Gregory L; Griffiths, Carrie L

    2015-09-25

    Objective. To develop a methodology for assessing skill development in a course while providing objective evidence of success and actionable data to improve instructional effectiveness. Design. Course objectives were recast as skills to be demonstrated. Confidence in these skills was surveyed before and after the course. Student skills were demonstrated using 4 work products and a multiple-choice examination. Assessment. The change from precourse survey to postcourse survey was analyzed with a paired t test. Quality of the student work product was assessed using scoring guides. All students demonstrated skill mastery by scoring 70% or above on the work product, and 87/88 demonstrated individual progress on the surveyed skills during the 15-week course. Conclusion. This assessment strategy is based on sound design principles and provides robust multi-modal evidence of student achievement in skill development, which is not currently available using traditional student course evaluation surveys. PMID:27168618

  9. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. And the fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on the fault tree analysis and dualistic contrast is achieved. An application on the emergency diesel generator in the nuclear power plant is given to illustrate the proposed method.

  10. AREVA LOCA and non-LOCA realistic methodology development strategy based on CATHARE

    International Nuclear Information System (INIS)

    The CATHARE code, developed since 1979 by AREVA, CEA, EDF and IRSN, is one of the major thermal-hydraulic system codes worldwide. The paper gives an overview of the CATHARE 2 Version 2.5 based realistic methodologies elaborated by AREVA for LOCA and non-LOCA analyses and of the underlying process (called DRM) applied for that purpose. It also describes the special features and improvements implemented in the code to handle additional needs and possible future requirements for industrial applications, such as the effect of high burn-up on fuel and cladding behaviour during LOCAs; coupling with core thermal-hydraulics, 3-dimensional core physics and instrumentation and control; the capability to account for asymmetric reactor coolant system flow transients by means of dedicated vessel mixing matrices; and a second order numerical resolution scheme for boron front propagation for non-LOCA transients. (Author)

  11. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2013-01-01

    Full Text Available In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP’s/SoC’s. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  12. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    Full Text Available In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP’s/SoC’s. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  13. Civil use of depleted uranium

    International Nuclear Information System (INIS)

    In this paper the civilian exploitation of depleted uranium is briefly reviewed. Different scenarios relevant to its use are discussed in terms of radiation exposure for workers and the general public. The case of the aircraft accident which occurred in Amsterdam in 1992, involving a fire, is discussed in terms of the radiological exposure to bystanders. All information given has been obtained on the basis of an extensive literature search and is not based on measurements performed at the Institute for Transuranium Elements

  14. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored

  15. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
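
    As an illustration of the test-case generation techniques mentioned (equivalence class partitioning and boundary value analysis), the sketch below derives test inputs for a hypothetical parameter that is valid in the range 10..100; the range and the system under test are assumptions for illustration, not PFBR values.

```python
def boundary_and_partition_cases(lo, hi):
    """Equivalence classes: below range, in range, above range.
    Boundary values: just outside, on, and just inside each boundary."""
    partitions = {"below_range": lo - 5, "in_range": (lo + hi) // 2, "above_range": hi + 5}
    boundaries = [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]
    return partitions, boundaries

partitions, boundaries = boundary_and_partition_cases(10, 100)
print("representative value per equivalence class:", partitions)
print("boundary value cases:", boundaries)
```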

  16. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for the neutronics and T-H coupled simulation of the pressurized water reactor. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. It is a challenge to validate the depletion capability because of insufficient measured data. One of the indirect methods to validate it is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.
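
    A code-to-code comparison ultimately reduces to comparing nuclide number densities from two codes at matching burnup points. The sketch below shows that bookkeeping with made-up values; it is an illustration of the comparison step, not part of the benchmark suite itself.

```python
# Hypothetical end-of-step number densities (atoms/b-cm) from two depletion codes.
code_a = {"U235": 3.21e-4, "U238": 2.19e-2, "Pu239": 1.05e-4}
code_b = {"U235": 3.18e-4, "U238": 2.20e-2, "Pu239": 1.08e-4}

for nuclide in sorted(code_a):
    ref, other = code_a[nuclide], code_b[nuclide]
    rel_diff_pct = 100.0 * (other - ref) / ref
    print(f"{nuclide:6s} reference={ref:.3e}  compared={other:.3e}  diff={rel_diff_pct:+.2f} %")
```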

  17. Application of Rejection Sampling based methodology to variance based parametric sensitivity analysis

    International Nuclear Information System (INIS)

    For estimating the effect of uncertain distribution parameters on the variance of the failure probability function (FPF), the map from distribution parameters to the FPF is built and a highly efficient approximation form is extended to solve the parametric variance-based sensitivity index. The parametric variance-based sensitivity index can first be expressed in terms of the moments of the FPF, and the FPF is approximated by a product of univariate functions of the distribution parameters, so that the moments of the FPF can be easily evaluated by Gaussian integration using the values of the FPF at the Gaussian nodes. Thus the primary task of evaluating the parametric variance-based sensitivity is transformed into calculating the FPF at the Gaussian nodes of the univariate functions, for which Monte Carlo (MC), Extended Monte Carlo (EMC) and Rejection Sampling (RS) are employed and compared here. Only one set of input samples is needed in either EMC or RS. Several numerical and engineering examples are presented to verify the accuracy and efficiency of the proposed approximate methods. Additionally, the results also reveal the virtue of RS, which can be more accurate and less restricted than EMC. - Highlights: • An efficient approximate form is applied for parametric sensitivity analysis. • Gaussian integration techniques are used to compute the moments of the FPF. • Extended Monte Carlo is used to compute the FPF with only one set of samples. • Rejection Sampling is applied to estimate the FPF by reusing the original samples
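
    The reuse idea behind RS can be illustrated generically: samples drawn once under a baseline distribution are filtered by rejection so that the accepted subset follows a distribution with perturbed parameters, without drawing new samples. The sketch below does this for a normal input whose mean and standard deviation are perturbed; it is a textbook rejection-sampling construction under assumed parameters, not the paper's FPF-specific algorithm.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# One (and only one) set of baseline samples: X ~ N(0, 1).
x = rng.normal(0.0, 1.0, size=100_000)

# Target: the same input with perturbed parameters, here N(0.2, 0.8).
f0, f1 = norm(0.0, 1.0), norm(0.2, 0.8)

# Envelope constant M >= sup f1(x)/f0(x); bounded numerically over a wide grid here.
grid = np.linspace(-10, 10, 20_001)
M = np.max(f1.pdf(grid) / f0.pdf(grid)) * 1.01

# Accept each stored sample with probability f1(x)/(M*f0(x)); accepted samples follow f1.
accept = rng.uniform(size=x.size) < f1.pdf(x) / (M * f0.pdf(x))
reused = x[accept]

print(f"acceptance rate = {accept.mean():.2f}, "
      f"reused-sample mean = {reused.mean():.3f}, std = {reused.std():.3f}")
```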

  18. A grey-based group decision-making methodology for the selection of hydrogen technologies in Life Cycle Sustainability perspective

    DEFF Research Database (Denmark)

    Manzardo, Alessandro; Ren, Jingzheng; Mazzi, Anna; Scipioni, Antonio

    2012-01-01

    The objective of this research is to develop a grey-based group decision-making methodology for the selection of the best renewable energy technology (including hydrogen) using a life cycle sustainability perspective. The traditional grey relational analysis has been modified to better address th...... using the proposed methodology, electrolysis of water technology by hydropower has been considered to be the best technology for hydrogen production according to the decision-making group....
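
    The grey relational analysis underlying the method can be sketched as follows: criteria scores are normalized, grey relational coefficients are computed against an ideal reference alternative, and a weighted grade ranks the alternatives. The sketch below is the standard (unmodified) GRA procedure with invented scores and weights, not the modified group-decision version or the data of the study.

```python
import numpy as np

# Rows: hypothetical hydrogen-production options; columns: criteria (larger-is-better scores).
scores = np.array([
    [0.70, 0.55, 0.80],   # e.g. electrolysis via hydropower
    [0.60, 0.75, 0.50],   # e.g. steam methane reforming
    [0.40, 0.60, 0.65],   # e.g. biomass gasification
])
weights = np.array([0.4, 0.3, 0.3])          # assumed criterion weights
rho = 0.5                                    # distinguishing coefficient (conventional value)

# Normalize each criterion to [0, 1] and form the ideal reference sequence.
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
reference = norm.max(axis=0)

# Grey relational coefficients and weighted grey relational grade per alternative.
delta = np.abs(reference - norm)
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = coeff @ weights

print(np.round(grades, 3), "-> best option index:", int(np.argmax(grades)))
```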

  19. A Methodology for Developing Web-based CAD/CAM Systems: Case Studies on Gear Shaper Cutters

    OpenAIRE

    Malahova, Anna

    2014-01-01

    The research establishes a methodology for developing Web-based CAD/CAM software systems to industrial quality standards in a time and cost effective manner. The methodology defines the scope of applicability, outlines major considerations and key principles to follow when developing this kind of software, describes an approach to requirements elicitation, resource allocation and collaboration, establishes strategies for overcoming uncertainty and describes the design concerns fo...

  20. Satellite based radar interferometry to estimate large-scale soil water depletion from clay shrinkage: possibilities and limitations

    NARCIS (Netherlands)

    Brake, te B.; Hanssen, R.F.; Ploeg, van der M.J.; Rooij, de G.H.

    2013-01-01

    Satellite-based radar interferometry is a technique capable of measuring small surface elevation changes at large scales and with a high resolution. In vadose zone hydrology, it has been recognized for a long time that surface elevation changes due to swell and shrinkage of clayey soils can serve as

  1. A GIS-based methodology for safe site selection of a building in a hilly region

    Directory of Open Access Journals (Sweden)

    Satish Kumar

    2016-03-01

    Full Text Available The importance of worker safety during construction is widely accepted, but the selection of safe sites for a building is generally not considered. Safe site selection (SSS) largely depends upon compiling, analyzing, and refining the information of an area where a building is likely to be located. The locational and topographical aspects of an area located in hilly regions play a major role in SSS, but are generally neglected in traditional and CAD-based systems used for site selection. Architects and engineers select a site based on their judgment, knowledge, and experience, but issues related to site safety are generally ignored. This study reviewed the existing literature on site selection techniques, building codes, and approaches of existing standards to identify various aspects crucial for SSS in hilly regions. A questionnaire survey was conducted to identify various aspects that construction professionals consider critical for SSS. This study explored the application of geographic information systems (GIS) in modeling the locational and topographical aspects to identify areas of suitability. A GIS-based methodology for locating a safe site that satisfies various spatial safety aspects was developed.

  2. Competency-based curriculum and active methodology: perceptions of nursing students.

    Science.gov (United States)

    Paranhos, Vania Daniele; Mendes, Maria Manuela Rino

    2010-01-01

    This study identifies the perceptions of undergraduate students at the University of São Paulo at Ribeirão Preto, Brazil, College of Nursing (EERP-USP) concerning the teaching-learning process in two courses: Integrated Seminar: Health-Disease/Care Process in Health Services Policies and Organization, which was offered to first-year students in 2005 and 2006, and Integrality in Health Care I and II, which was offered to second-year students in 2006. The courses' proposal was to adopt an active methodology and a competency-based curriculum. Data were collected from written tests submitted to 62 students at the end of the course, focusing on the tests' pertinence, development of performance, structure and pedagogical dynamics, organization and settings. Thematic analysis indicated that students enjoyed the courses, highlighted the role of the professor/facilitator at points of the pedagogical cycle and the learning recorded in students' portfolios. Students valued their experience in the Primary Health Care setting, which was based on, and has since the beginning of the program been based on, the theory-professional practice interlocution and closeness to the principles of the Unified Health System (SUS). PMID:20428705

  3. Methodology of Ni-base Superalloy Development for VHTR using Design of Experiments and Thermodynamic Calculation

    International Nuclear Information System (INIS)

    This work concerns a methodology of Ni-base superalloy development for a very high temperature gas-cooled reactor (VHTR) using design of experiments (DOE) and thermodynamic calculations. A total of 32 sets of Ni-base superalloys with various chemical compositions were formulated based on a fractional factorial design of DOE, and the thermodynamic stability of topologically close-packed (TCP) phases of those alloys was calculated by using the THERMO-CALC software. From the statistical evaluation of the effect of the chemical composition on the formation of the TCP phase up to a temperature of 950 °C, which should be suppressed for a prolonged service life when used as a structural component of the VHTR, 16 sets were selected for further calculation of the mechanical properties. Considering the yield and ultimate tensile strengths of the selected alloys estimated by using the JMATPRO software, the optimized chemical composition of the alloys for VHTR application, especially the intermediate heat exchanger, was proposed for a succeeding experimental study
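
    One way to obtain 32 two-level runs, matching the number of formulations above, is a half-fraction design in six factors (2^(6-1) = 32). The sketch below generates such a design, with the sixth factor aliased to the product of the other five; the factor list and coded levels are illustrative assumptions, not the study's actual composition ranges or design generators.

```python
from itertools import product

factors = ["Cr", "Co", "Mo", "W", "Ta", "C"]      # illustrative alloying factors
low, high = -1, +1                                # coded two-level settings

# Full 2^5 factorial in the first five factors; sixth generated as F = A*B*C*D*E.
runs = []
for levels in product((low, high), repeat=5):
    generated = 1
    for v in levels:
        generated *= v
    runs.append(dict(zip(factors, list(levels) + [generated])))

print(len(runs), "runs")                          # 32 coded alloy formulations
print(runs[0])                                    # first coded design point
```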

  4. Grouting design based on characterization of the fractured rock. Presentation and demonstration of a methodology

    International Nuclear Information System (INIS)

    The design methodology presented in this document is based on an approach that considers the individual fractures. The observations and analyses made during production enable the design to adapt to the encountered conditions. The document is based on previously published material and overview flow charts are used to show the different steps. Parts of or the full methodology has been applied for a number of tunneling experiments and projects. SKB projects in the Aespoe tunnel include a pillar experiment and pre-grouting of a 70 meter long tunnel (TASQ). Further, for Hallandsas railway tunnel (Skaane south Sweden), a field pre-grouting experiment and design and post-grouting of a section of 133 meters have been made. For the Nygard railway tunnel (north of Goeteborg, Sweden), design and grouting of a section of 86 meters (pre-grouting) and 60 meters (post-grouting) have been performed. Finally, grouting work at the Tornskog tunnel (Stockholm, Sweden) included design and grouting along a 100 meter long section of one of the two tunnel tubes. Of importance to consider when doing a design and evaluating the result are: - The identification of the extent of the grouting needed based on inflow requirements and estimates of tunnel inflow before grouting. - The selection of grout and performance of grouting materials including penetration ability and length. The penetration length is important for the fan geometry design. - The ungrouted compared to the grouted and excavated rock mass conditions: estimates of tunnel inflow and (if available) measured inflows after grouting and excavation. Identify if possible explanations for deviations. For the Hallandsas, Nygard and Tornskog tunnel sections, the use of a Pareto distribution and the estimate of tunnel inflow identified a need for sealing small aperture fractures (< 50 - 100 μm) to meet the inflow requirements. The tunneling projects show that using the hydraulic aperture as a basis for selection of grout is a good

  5. Research on Part Expression Methodology of Part Library Based on ISO13584

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    ISO13584 (PLIB) is an international standard, which is established to realize computer's identification of data expression and data exchange of part library. In this international standard, part expression methodology of part library is an important characteristic, which distinguishes itself from STEP. So, the methodology is a focus in the research of part library. This article describes the principles of part information expression of part library based on ISO13584, and the research results of the methodology in detail.

  6. Low-Cost Fault Tolerant Methodology for Real Time MPSoC Based Embedded System

    OpenAIRE

    2014-01-01

    We are proposing a design methodology for a fault tolerant homogeneous MPSoC having additional design objectives that include low hardware overhead and performance. We have implemented three different FT methodologies on MPSoCs and compared them against the defined constraints. The comparison of these FT methodologies is carried out by modelling their architectures in VHDL-RTL, on Spartan 3 FPGA. The results obtained through simulations helped us to identify the most relevant scheme in terms ...

  7. A Probabilistic Transmission Expansion Planning Methodology based on Roulette Wheel Selection and Social Welfare

    OpenAIRE

    Gupta, Neeraj; Shekhar, Rajiv; Kalra, Prem Kumar

    2012-01-01

    A new probabilistic methodology for transmission expansion planning (TEP) that does not require a priori specification of new/additional transmission capacities and uses the concept of social welfare has been proposed. Two new concepts have been introduced in this paper: (i) roulette wheel methodology has been used to calculate the capacity of new transmission lines and (ii) load flow analysis has been used to calculate expected demand not served (EDNS). The overall methodology has been imple...
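
    Roulette wheel selection itself is a standard fitness-proportional sampling scheme; a minimal version is sketched below with invented candidate capacities and weights, since the paper's exact formulation (how weights map to new line capacities) is not reproduced here.

```python
import random

random.seed(3)

# Hypothetical candidate transmission capacities (MW) and their selection weights.
candidates = [100, 200, 300, 400]
weights    = [0.1, 0.4, 0.3, 0.2]

def roulette_wheel(options, probs):
    """Pick one option with probability proportional to its weight."""
    r, cumulative = random.random() * sum(probs), 0.0
    for option, p in zip(options, probs):
        cumulative += p
        if r <= cumulative:
            return option
    return options[-1]

picks = [roulette_wheel(candidates, weights) for _ in range(10)]
print(picks)
```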

  8. Miedema model based methodology to predict amorphous-forming-composition range in binary and ternary systems

    Energy Technology Data Exchange (ETDEWEB)

    Das, N., E-mail: nirupamd@barc.gov.in [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Mittra, J. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Murty, B.S. [Department of Metallurgical and Materials Engineering, IIT Madras, Chennai 600 036 (India); Pabi, S.K. [Department of Metallurgical and Materials Engineering, IIT Kharagpur, Kharagpur 721 302 (India); Kulkarni, U.D.; Dey, G.K. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India)

    2013-02-15

    Highlights: • A methodology was proposed to predict amorphous forming compositions (AFCs). • Chemical contribution to enthalpy of mixing ∝ enthalpy of amorphous phase for AFCs. • Accuracy in the prediction of the AFC-range was noticed in the Al-Ni-Ti system. • Mechanical alloying (MA) results of Al-Ni-Ti followed the predicted AFC-range. • Earlier MA results of Al-Ni-Ti also conformed to the predicted AFC-range. - Abstract: From the earlier works on the prediction of the amorphous forming composition range (AFCR) using the Miedema based model and also on mechanical alloying experiments, it has been observed that all amorphous forming compositions of a given alloy system fall within a linear band when the chemical contribution to the enthalpy of the solid solution (ΔH^ss) is plotted against the enthalpy of mixing in the amorphous phase (ΔH^amor). On the basis of this observation, a methodology has been proposed in this article to identify the AFCR of a ternary system that is likely to be more precise than what can be obtained using the ΔH^amor - ΔH^ss < 0 criterion. MA experiments on various compositions of the Al-Ni-Ti system, producing amorphous, crystalline, and mixtures of amorphous plus crystalline phases, have been carried out and the phases have been characterized using X-ray diffraction and transmission electron microscopy techniques. Data from the present MA experiments and also from the literature have been used to validate the proposed approach. Also, the proximity of compositions producing a mixture of amorphous and crystalline phases to the boundary of the AFCR in the Al-Ni-Ti ternary has been found useful to validate the effectiveness of the prediction.

  9. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    Science.gov (United States)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on fluid and classical mechanics notions, we developed computation schemes enabling a dynamic vulnerability and risk analysis facing a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the elements at risk exposed. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to the economic loss values are discussed. In such a way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of the cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.

  10. Artificial Neural Network and Response Surface Methodology Modeling in Ionic Conductivity Predictions of Phthaloylchitosan-Based Gel Polymer Electrolyte

    Directory of Open Access Journals (Sweden)

    Ahmad Danial Azzahari

    2016-01-01

    Full Text Available A gel polymer electrolyte system based on phthaloylchitosan was prepared. The effects of process variables, such as lithium iodide, caesium iodide, and 1-butyl-3-methylimidazolium iodide were investigated using a distance-based ternary mixture experimental design. A comparative approach was made between response surface methodology (RSM and artificial neural network (ANN to predict the ionic conductivity. The predictive capabilities of the two methodologies were compared in terms of coefficient of determination R2 based on the validation data set. It was shown that the developed ANN model had better predictive outcome as compared to the RSM model.
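
    The comparison criterion used, the coefficient of determination R2 on a validation set, is easy to reproduce. The sketch below computes R2 for two sets of predictions (standing in for the RSM and ANN models) against the same validation measurements; all numbers are invented for illustration.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination R2 = 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical validation-set ionic conductivities (mS/cm) and model predictions.
measured = np.array([1.10, 1.45, 0.92, 1.80, 1.33])
rsm_pred = np.array([1.02, 1.50, 1.05, 1.65, 1.40])
ann_pred = np.array([1.08, 1.47, 0.95, 1.76, 1.30])

print(f"R2 (RSM-like predictions) = {r_squared(measured, rsm_pred):.3f}")
print(f"R2 (ANN-like predictions) = {r_squared(measured, ann_pred):.3f}")
```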

  11. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology adopts a Kalman filter approach in conjunction with an...

  12. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  13. Antarctic ozone depletion

    International Nuclear Information System (INIS)

    Antarctic ozone depletion is most severe during the southern hemisphere spring, when the local reduction in the column amount may be as much as 50 percent. The extent to which this ozone poor air contributes to the observed global ozone loss is a matter of debate, but there is some evidence that fragments of the 'ozone hole' can reach lower latitudes following its breakup in summer. Satellite data show the seasonal evolution of the ozone hole. A new dimension has been added to Antarctic ozone depletion with the advent of large volcanic eruptions such as that from Mount Pinatubo in 1991. (author). 5 refs., 1 fig

  14. Stratospheric ozone depletion.

    Science.gov (United States)

    Rowland, F Sherwood

    2006-05-29

    Solar ultraviolet radiation creates an ozone layer in the atmosphere which in turn completely absorbs the most energetic fraction of this radiation. This process both warms the air, creating the stratosphere between 15 and 50 km altitude, and protects the biological activities at the Earth's surface from this damaging radiation. In the last half-century, the chemical mechanisms operating within the ozone layer have been shown to include very efficient catalytic chain reactions involving the chemical species HO, HO2, NO, NO2, Cl and ClO. The NOX and ClOX chains involve the emission at Earth's surface of stable molecules in very low concentration (N2O, CCl2F2, CCl3F, etc.) which wander in the atmosphere for as long as a century before absorbing ultraviolet radiation and decomposing to create NO and Cl in the middle of the stratospheric ozone layer. The growing emissions of synthetic chlorofluorocarbon molecules cause a significant diminution in the ozone content of the stratosphere, with the result that more solar ultraviolet-B radiation (290-320 nm wavelength) reaches the surface. This ozone loss occurs in the temperate zone latitudes in all seasons, and especially drastically since the early 1980s in the south polar springtime-the 'Antarctic ozone hole'. The chemical reactions causing this ozone depletion are primarily based on atomic Cl and ClO, the product of its reaction with ozone. The further manufacture of chlorofluorocarbons has been banned by the 1992 revisions of the 1987 Montreal Protocol of the United Nations. Atmospheric measurements have confirmed that the Protocol has been very successful in reducing further emissions of these molecules. Recovery of the stratosphere to the ozone conditions of the 1950s will occur slowly over the rest of the twenty-first century because of the long lifetime of the precursor molecules. PMID:16627294

  15. A Methodology for Protective Vibration Monitoring of Hydropower Units Based on the Mechanical Properties.

    Science.gov (United States)

    Nässelqvist, Mattias; Gustavsson, Rolf; Aidanpää, Jan-Olov

    2013-07-01

    It is important to monitor the radial loads in hydropower units in order to protect the machine from harmful radial loads. Existing recommendations in the standards regarding the radial movements of the shaft and bearing housing in hydropower units, ISO-7919-5 (International Organization for Standardization, 2005, "ISO 7919-5: Mechanical Vibration-Evaluation of Machine Vibration by Measurements on Rotating Shafts-Part 5: Machine Sets in Hydraulic Power Generating and Pumping Plants," Geneva, Switzerland) and ISO-10816-5 (International Organization for Standardization, 2000, "ISO 10816-5: Mechanical Vibration-Evaluation of Machine Vibration by Measurements on Non-Rotating Parts-Part 5: Machine Sets in Hydraulic Power Generating and Pumping Plants," Geneva, Switzerland), have alarm levels based on statistical data and do not consider the mechanical properties of the machine. The synchronous speed of the unit determines the maximum recommended shaft displacement and housing acceleration, according to these standards. This paper presents a methodology for the alarm and trip levels based on the design criteria of the hydropower unit and the measured radial loads in the machine during operation. When a hydropower unit is designed, one of its design criteria is to withstand certain loads spectra without the occurrence of fatigue in the mechanical components. These calculated limits for fatigue are used to set limits for the maximum radial loads allowed in the machine before it shuts down in order to protect itself from damage due to high radial loads. Radial loads in hydropower units are caused by unbalance, shape deviations, dynamic flow properties in the turbine, etc. Standards exist for balancing and manufacturers (and power plant owners) have recommendations for maximum allowed shape deviations in generators. These standards and recommendations determine which loads, at a maximum, should be allowed before an alarm is sent that the machine needs maintenance. The radial

  16. The Analysis of Polish Economy’s Transformation to Knowledge Based Economy on the Basis of Knowledge Assessment Methodology

    OpenAIRE

    Sokołowska-Woźniak, Justyna

    2015-01-01

    The main aim of this paper is to analyze the transformation of Poland to a knowledge based economy on the basis of the World Bank’s Knowledge Assessment Methodology. The analysis of change will be used to compare the performance (strengths and weaknesses) of the Polish economy in 1994, 2004 and 2014 with regard to the main aspects of the knowledge based economy.

  17. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

    Full Text Available The complexity of today's embedded electronic systems as well as their demanding performance and reliability requirements are such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  18. Three-dimensional design methodologies for tree-based FPGA architecture

    CERN Document Server

    Pangracious, Vinod; Mehrez, Habib

    2015-01-01

    This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support designs, such as design for 3D, to manufacture high performance 3D integrated circuits and reconfigurable FPGA-based systems. This book was written as a text that covers the foundations of 3D integrated system design and FPGA architecture design. It was written for use in an elective or core course at the graduate level in the fields of Electrical Engineering, Computer Engineering and Doctoral Research programs. No previous background on 3D integration is required; nevertheless a fundamental understanding of 2D CMOS VLSI design is required. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI design, Digital System Design and Microelectronics Circuits being the most important. It is accessible for self-study by both senior students and profe...

  19. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul S [Los Alamos National Laboratory; Morgan, Keith S [Los Alamos National Laboratory; Caffrey, Michael P [Los Alamos National Laboratory

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
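
    The protective idea behind TMR and the fault-injection style of testing can both be shown in a few lines: three copies of a computation vote bitwise on the result, and a deliberately injected bit flip in one copy is out-voted. The sketch below is a conceptual Python illustration, not FPGA design code and not the paper's test infrastructure.

```python
def majority_vote(a, b, c):
    """Bitwise 2-out-of-3 majority, as a TMR voter would compute."""
    return (a & b) | (a & c) | (b & c)

def module(x):
    """One redundant copy of the user computation (illustrative)."""
    return (x * 3 + 1) & 0xFF

x = 0x2A
copies = [module(x), module(x), module(x)]

# Fault injection: flip one bit of one copy, emulating an upset in configuration/state memory.
copies[1] ^= 0x10

voted = majority_vote(*copies)
print(f"copies = {[hex(c) for c in copies]}, voted = {hex(voted)}, fault masked: {voted == module(x)}")
```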

  20. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Based upon all techniques, the higher accuracies are achieved by models (5) using Tmax - Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
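
    A minimal version of an SVR-rbf estimator like the one described can be set up with scikit-learn. The temperature and radiation values below are invented placeholders for the measured data, and no claim is made that the hyperparameters match the study's tuning.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical daily data: columns are Tmax - Tmin and Tmax (deg C); target is DHGSR (MJ/m2).
X = np.array([[10.5, 28.0], [12.0, 31.5], [8.0, 24.0], [14.5, 35.0], [9.5, 26.5], [13.0, 33.0]])
y = np.array([18.2, 21.5, 14.8, 24.9, 16.1, 23.0])

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)   # illustrative hyperparameters
model.fit(X, y)

new_day = np.array([[11.0, 29.0]])               # Tmax - Tmin = 11.0, Tmax = 29.0
print(f"estimated global solar radiation: {model.predict(new_day)[0]:.1f} MJ/m2")
```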

  1. A NOVEL METHODOLOGY FOR CONSTRUCTING RULE-BASED NAÏVE BAYESIAN CLASSIFIERS

    Directory of Open Access Journals (Sweden)

    Abdallah Alashqur

    2015-02-01

    Full Text Available Classification is an important data mining technique that is used by many applications. Several types of classifiers have been described in the research literature. Example classifiers are decision tree classifiers, rule-based classifiers, and neural network classifiers. Another popular classification technique is naïve Bayesian classification. Naïve Bayesian classification is a probabilistic classification approach that uses Bayes' theorem to predict the classes of unclassified records. A drawback of Naïve Bayesian Classification is that every time a new data record is to be classified, the entire dataset needs to be scanned in order to apply a set of equations that perform the classification. Scanning the dataset is normally a very costly step especially if the dataset is very large. To alleviate this problem, a new approach for using naïve Bayesian classification is introduced in this study. In this approach, a set of classification rules is constructed on top of a naïve Bayesian classifier. Hence we call this approach Rule-based Naïve Bayesian Classifier (RNBC). In RNBC, the dataset is scanned only once, off-line, at the time of building the classification rule set. Subsequent scanning of the dataset is avoided. Furthermore, this study introduces a simple three-step methodology for constructing the classification rule set.
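
    The single-scan idea can be illustrated as follows: one pass over a toy categorical dataset collects the counts a naïve Bayesian classifier needs, and the classifier is then converted into an explicit rule table (one rule per attribute-value combination), so later records are classified by rule lookup instead of rescanning the data. The sketch below is a simplified interpretation of that approach with a made-up dataset, not the authors' exact three-step construction.

```python
from collections import Counter, defaultdict
from itertools import product

# Toy training records: (outlook, windy) -> play
data = [("sunny", "no", "yes"), ("sunny", "yes", "no"), ("rain", "no", "yes"),
        ("rain", "yes", "no"), ("overcast", "no", "yes"), ("overcast", "yes", "yes")]

attrs = ["outlook", "windy"]
classes = {r[-1] for r in data}
values = [sorted({r[i] for r in data}) for i in range(len(attrs))]

# Single scan of the dataset: class priors and per-class attribute-value counts.
prior = Counter(r[-1] for r in data)
cond = defaultdict(Counter)                 # (attr_index, value) -> Counter over classes
for r in data:
    for i, v in enumerate(r[:-1]):
        cond[(i, v)][r[-1]] += 1

def nb_class(combo):
    """Naive Bayes decision with Laplace smoothing for one attribute-value combination."""
    best, best_score = None, -1.0
    for c in classes:
        score = prior[c] / len(data)
        for i, v in enumerate(combo):
            score *= (cond[(i, v)][c] + 1) / (prior[c] + len(values[i]))
        if score > best_score:
            best, best_score = c, score
    return best

# Build the rule table once; classification afterwards is a dictionary lookup.
rules = {combo: nb_class(combo) for combo in product(*values)}
print(rules[("rain", "yes")])               # predicted class for a new record, no rescan needed
```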

  2. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    Science.gov (United States)

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations. PMID:24386186

  3. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    Directory of Open Access Journals (Sweden)

    Maria Davidich

    Full Text Available Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.

  4. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia;

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are p...

  5. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan;

    2010-01-01

    . The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  6. Progressive design methodology for complex engineering systems based on multiobjective genetic algorithms and linguistic decision making

    NARCIS (Netherlands)

    Kumar, P.; Bauer, P.

    2008-01-01

    This work focuses on a design methodology that aids in design and development of complex engineering systems. This design methodology consists of simulation, optimization and decision making. Within this work a framework is presented in which modelling, multi-objective optimization and multi criteri

  7. The Application of Atlas.ti and NVivo Software in Conducting Researches Based on Grounded Theory Methodology

    Directory of Open Access Journals (Sweden)

    Jakub Niedbalski

    2014-05-01

    Full Text Available This article addresses the topic of specialist software applied to support the analysis of qualitative data in research which is based on the procedures of grounded theory methodology (GTM). The purpose of this article is to demonstrate what kind of relations occur between the methodological procedures of grounded theory and two popular programs of the CAQDA group: NVivo and Atlas.ti. This article is intended to show in what ways these two programs can be used to provide a GT-based analysis. In the article, there is a demonstration of both the technical and applicable possibilities of NVivo and Atlas.ti software. Moreover, this article points out the degree of adequacy of the technical solutions applied in both programs in meeting the requirements of grounded theory methodology, as well as some restrictions and barriers which can be encountered by a researcher who uses a particular computer program in GT-based research.

  8. Comparison of background ozone estimates over the western United States based on two separate model methodologies

    Science.gov (United States)

    Dolwick, Pat; Akhtar, Farhan; Baker, Kirk R.; Possiel, Norm; Simon, Heather; Tonnesen, Gail

    2015-05-01

    Two separate air quality model methodologies for estimating background ozone levels over the western U.S. are compared in this analysis. The first approach is a direct sensitivity modeling approach that considers the ozone levels that would remain after certain emissions are entirely removed (i.e., zero-out modeling). The second approach is based on an instrumented air quality model which tracks the formation of ozone within the simulation and assigns the source of that ozone to pre-identified categories (i.e., source apportionment modeling). This analysis focuses on a definition of background referred to as U.S. background (USB) which is designed to represent the influence of all sources other than U.S. anthropogenic emissions. Two separate modeling simulations were completed for an April-October 2007 period, both focused on isolating the influence of sources other than domestic manmade emissions. The zero-out modeling was conducted with the Community Multiscale Air Quality (CMAQ) model and the source apportionment modeling was completed with the Comprehensive Air Quality Model with Extensions (CAMx). Our analysis shows that the zero-out and source apportionment techniques provide relatively similar estimates of the magnitude of seasonal mean daily 8-h maximum U.S. background ozone at locations in the western U.S. when base case model ozone biases are considered. The largest differences between the two sets of USB estimates occur in urban areas where interactions with local NOx emissions can be important, especially when ozone levels are relatively low. Both methodologies conclude that seasonal mean daily 8-h maximum U.S. background ozone levels can be as high as 40-45 ppb over rural portions of the western U.S. Background fractions tend to decrease as modeled total ozone concentrations increase, with typical fractions of 75-100 percent on the lowest ozone days (<25 ppb) and typical fractions between 30 and 50% on days with ozone above 75 ppb. The finding that
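
    The key metric in the passage above, the seasonal mean of the daily maximum 8-h average (MDA8) ozone, and the derived background fraction can be illustrated with a short sketch. The hourly series below are synthetic stand-ins, and the simplified rolling-mean MDA8 here ignores the exact regulatory averaging conventions.

```python
# Illustrative sketch only: seasonal mean of the daily maximum 8-h average (MDA8)
# ozone and a U.S.-background fraction computed from hourly series. The synthetic
# data are placeholders, not values from the study.
import numpy as np
import pandas as pd

hours = pd.date_range("2007-04-01", "2007-10-31 23:00", freq="h")
rng = np.random.default_rng(1)
total_o3 = pd.Series(40 + 15 * rng.standard_normal(len(hours)), index=hours).clip(lower=0)
background_o3 = (total_o3 * rng.uniform(0.4, 0.9, len(hours))).clip(lower=0)

def mda8(hourly):
    """Daily maximum of the 8-hour running mean (simplified MDA8)."""
    run8 = hourly.rolling(window=8, min_periods=6).mean()
    return run8.resample("D").max()

mda8_total = mda8(total_o3)
mda8_background = mda8(background_o3)

print("Seasonal mean MDA8 (total):      %.1f ppb" % mda8_total.mean())
print("Seasonal mean MDA8 (background): %.1f ppb" % mda8_background.mean())
print("Mean background fraction:        %.0f %%" % (100 * (mda8_background / mda8_total).mean()))
```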

  9. Willpower depletion and framing effects

    OpenAIRE

    de Haan, Thomas; van Veldhuizen, Roel

    2013-01-01

    We investigate whether depleting people's cognitive resources (or willpower) affects the degree to which they are susceptible to framing effects. Recent research in social psychology and economics has suggested that willpower is a resource that can be temporarily depleted and that a depleted level of willpower is associated with self-control problems in a variety of contexts. In this study, we extend the willpower depletion paradigm to framing effects and argue that willpower depletion should...

  10. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and

  11. DESIGN METHODOLOGY OF NETWORKED SOFTWARE EVOLUTION GROWTH BASED ON SOFTWARE PATTERNS

    Institute of Scientific and Technical Information of China (English)

    Keqing HE; Rong PENG; Jing LIU; Fei HE; Peng LIANG; Bing LI

    2006-01-01

    Recently, new characteristics of complex networks have attracted the attention of scientists in different fields and led to many emerging research directions. So far, most of the research work has been limited to the discovery of complex network characteristics by structure analysis in large-scale software systems. This paper presents the theoretical basis, design method, algorithms and experiment results of the research. It firstly emphasizes the significance of a design method of evolution growth for the network topology of Object Oriented (OO) software systems, and argues that the selection and modulation of network models with various topology characteristics will have a non-negligible effect on the process of design and implementation of OO software systems. Then we analyze the similar discipline of "negation of negation and compromise" between the evolution of network models with different topology characteristics and the development of software modelling methods. According to the analysis of the growth features of software patterns, we propose an object-oriented software network evolution growth method and its algorithms in succession. In addition, we also propose parameter systems for OO software system metrics based on complex network theory. Based on these parameter systems, it is possible to analyze the features of various nodes, links and local-worlds, modulate the network topology and guide the software metrics. All these can be helpful for the detailed design, implementation and performance analysis. Finally, we focus on the application of the evolution algorithms and demonstrate it by a case study. Comparing the results from our early experiments with methodologies in empirical software engineering, we believe that the proposed software engineering design method is a computational software engineering approach based on complex network theory. We argue that this method should be greatly beneficial for the design, implementation, modulation and metrics of

  12. Developing More Insights on Sustainable Consumption in China Based on Q Methodology

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-10-01

    Full Text Available Being an important aspect of sustainable development, sustainable consumption has attracted great attention among Chinese politicians and academia, and Chinese governments have established policies that encourage sustainable consumption behaviors. However, unsustainable consumption behavior still remains predominant in China. This paper aims to classify consumers with similar traits, in terms of the characteristics of practicing sustainable consumption, into one group, so that their traits can be clearly understood, to enable governments to establish targeted policies for different groups of consumers. Q methodology, generally used to reveal the subjectivity of human beings involved in any situation, is applied in this paper to classify Chinese consumers based on Q sample design and data collection and analysis. Next, the traits of each group are analyzed in detail and comparison analyses are also conducted to compare the common and differentiating factors among the three groups. The results show that Chinese consumers can be classified into three groups: sustainable (Group 1), potentially sustainable (Group 2) and unsustainable consumers (Group 3), according to their values and attitudes towards sustainable consumption. As such, Group 1 cares for the environment and has strong environmental values. They understand sustainable consumption and its functions. Group 2 needs more enlightenment and external stimuli to motivate them to consume sustainably. Group 3 needs to be informed about and educated on sustainable consumption to enable them to change their consumption behavior from unsustainable to sustainable. Suggestions and implications for encouraging each group of consumers to engage in sustainable consumption are also provided.

  13. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations mean that they include scale variety and physical complexity, so that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to the deductive and inductive approaches. We have established its prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  14. a Risk Based Methodology to Assess the Energy Efficiency Improvements in Traditionally Constructed Buildings

    Science.gov (United States)

    Herrera, D.; Bennadji, A.

    2013-07-01

    In order to achieve the CO2 reduction targets set by the Scottish government, it will be necessary to improve the energy efficiency of existing buildings. Within the total Scottish building stock, historic and traditionally constructed buildings are an important proportion, in the order of 19 % (Curtis, 2010), and represent cultural, emotional and identity values that should be protected. However, retrofit interventions can be complex operations because of the many aspects involved in the hygrothermal performance of traditional buildings. Moreover, all these factors interact with each other and therefore need to be analysed as a whole. Upgrading the envelope of traditional buildings may produce severe changes to the moisture migration, leading to superficial or interstitial condensation and thus fabric decay and mould growth. Retrofit projects carried out in the past have failed because of the misunderstanding, or the lack of expert prediction, of the potential consequences associated with the envelope's alteration. The evaluation of potential risks, prior to any alteration of a building's physics in order to improve its energy efficiency, is critical to avoid future damage to the wall's performance or to occupants' health and well-being. The aim of this PhD research project is to point out the most critical aspects related to the energy efficiency improvement of traditional buildings and to develop a risk-based methodology that helps owners and practitioners during the decision-making process.

  15. Measuring resource inequalities. The concepts and methodology for an area-based Gini coefficient

    International Nuclear Information System (INIS)

    Although inequalities in income and expenditure are relatively well researched, comparatively little attention has been paid, to date, to inequalities in resource use. This is clearly a shortcoming when it comes to developing informed policies for sustainable consumption and social justice. This paper describes an indicator of inequality in resource use called the AR-Gini. The AR-Gini is an area-based measure of resource inequality that estimates inequalities between neighbourhoods with regard to the consumption of specific consumer goods. It is also capable of estimating inequalities in the emissions resulting from resource use, such as carbon dioxide emissions from energy use, and solid waste arisings from material resource use. The indicator is designed to be used as a basis for broadening the discussion concerning 'food deserts' to inequalities in other types of resource use. By estimating the AR-Gini for a wide range of goods and services we aim to enhance our understanding of resource inequalities and their drivers, identify which resources have highest inequalities, and to explore trends in inequalities. The paper describes the concepts underlying the construction of the AR-Gini and its methodology. Its use is illustrated by pilot applications (specifically, men's and boys' clothing, carpets, refrigerators/freezers and clothes washer/driers). The results illustrate that different levels of inequality are associated with different commodities. The paper concludes with a brief discussion of some possible policy implications of the AR-Gini. (author)
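
    At the core of the AR-Gini described above is a Gini coefficient computed over per-neighbourhood estimates of resource use. The sketch below shows only that core calculation on made-up figures; the paper's full methodology, which estimates neighbourhood consumption from commodity data before computing the index, is not reproduced.

```python
# Sketch of the core calculation behind an area-based inequality index: a Gini
# coefficient computed over per-neighbourhood resource-use estimates. The real
# AR-Gini additionally involves estimating those neighbourhood values from
# commodity data; the figures below are made up.
import numpy as np

def gini(x):
    """Gini coefficient of a 1-D array of non-negative values."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    total = x.sum()
    # Standard formula: G = (2 * sum_i i*x_i) / (n * sum x) - (n + 1) / n
    return (2 * np.sum(np.arange(1, n + 1) * x) / (n * total)) - (n + 1) / n

# Hypothetical annual CO2 emissions from household energy use (tonnes) per neighbourhood
emissions = [1.2, 2.5, 3.1, 0.8, 4.4, 2.2, 5.0, 1.9]
print(f"AR-Gini-style inequality index: {gini(emissions):.3f}")
```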

  16. Development of a risk monitoring system for nuclear power plants based on GO-FLOW methodology

    International Nuclear Information System (INIS)

    Highlights: • A method for developing Living PSA is proposed. • Living PSA is easy to update with online modification to system model file. • A risk monitoring system is designed and developed using the GO-FLOW. • The risk monitoring system is useful for plant daily operation risk management. - Abstract: The paper presents a risk monitoring system developed based on the GO-FLOW methodology, which is a success-oriented system reliability modeling technique for phased-mission as well as time-dependent problem analysis. The risk monitoring system is designed to receive information on plant configuration changes, whether from equipment failures, operator interventions, or maintenance activities, then update the Living PSA model with online modification to the system GO-FLOW model file, which contains all the functional modes of equipment represented by a proposed generalized GO-FLOW modeling structure, and display risk values graphically. The risk monitoring system can be used to assist safety engineers and plant operators in their maintenance management and daily operation risk management at NPPs

  17. ANN-GA hybrid methodology based optimization study for microbial production of CoQ10

    Directory of Open Access Journals (Sweden)

    Shruti Bajpai

    2015-01-01

    Full Text Available Ubiquinone-10, also known as CoQ10, is a potent antioxidant which is found in the membrane-bound electron transport system and has a wide range of therapeutic uses. Purpose: The purpose of this study was to implement fermentation process optimization for the production of CoQ10 using Pseudomonas diminuta NCIM 2865. Methods: Significant medium components with respect to CoQ10 production were identified using a Plackett-Burman design, and their interaction was studied using response surface methodology (RSM). CoQ10 production increased considerably from 10.8 to 18.57 mg/l when fermentation was carried out in the RSM-optimised medium. Further, production of CoQ10 was increased by using the predictive results of the ANN-GA (artificial neural network and genetic algorithm) hybrid method. Results and Conclusions: This increased the yield of CoQ10 from 18.57 to 27.9 mg/l. The experimental study using the ANN-GA based optimized medium in the presence of carrot juice as a precursor for CoQ10 production gave a yield of 34.4 mg/l, considerably higher than in the earlier studies.
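
    The ANN-GA idea mentioned above can be sketched as an ANN surrogate fitted to yield-versus-medium data, with a genetic algorithm searching for the composition that maximises the surrogate's prediction. This is a rough illustration only: the training data, variable bounds, GA operators and settings below are all hypothetical and do not come from the study.

```python
# Rough sketch of the ANN-GA idea (not the authors' implementation): an ANN is
# fitted as a surrogate of yield vs. medium composition, and a simple genetic
# algorithm searches the composition that maximises the surrogate prediction.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(60, 3))              # 3 medium components (scaled 0-1)
y = 15 + 10 * X[:, 0] - 8 * (X[:, 1] - 0.6) ** 2 + 5 * X[:, 2] + rng.normal(0, 0.5, 60)

ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X, y)

def ga_maximise(model, n_var=3, pop_size=40, n_gen=50, mut=0.1):
    pop = rng.uniform(0, 1, size=(pop_size, n_var))
    for _ in range(n_gen):
        fit = model.predict(pop)
        order = np.argsort(fit)[::-1]
        parents = pop[order[: pop_size // 2]]                            # truncation selection
        cross = rng.integers(0, len(parents), size=(pop_size - len(parents), 2))
        children = (parents[cross[:, 0]] + parents[cross[:, 1]]) / 2     # arithmetic crossover
        children += rng.normal(0, mut, children.shape)                   # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), 0, 1)
    best = pop[np.argmax(model.predict(pop))]
    return best, model.predict(best[None, :])[0]

best_medium, best_yield = ga_maximise(ann)
print("Predicted optimum medium composition:", np.round(best_medium, 2))
print("Predicted yield at optimum (arbitrary units):", round(float(best_yield), 1))
```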

  18. IIR filtering based adaptive active vibration control methodology with online secondary path modeling using PZT actuators

    Science.gov (United States)

    Boz, Utku; Basdogan, Ipek

    2015-12-01

    Structural vibration is a major cause of noise problems, discomfort and mechanical failures in aerospace, automotive and marine systems, which are mainly composed of plate-like structures. In order to reduce structural vibrations in these structures, active vibration control (AVC) is an effective approach. Adaptive filtering methodologies are preferred in AVC due to their ability to adjust themselves to the varying dynamics of the structure during operation. The filtered-X LMS (FXLMS) algorithm is a simple adaptive filtering algorithm widely implemented in active control applications. Proper implementation of FXLMS requires the availability of a reference signal to mimic the disturbance and a model of the dynamics between the control actuator and the error sensor, namely the secondary path. However, the controller output can interfere with the reference signal, and the secondary path dynamics may change during operation. The interference problem can be resolved by using an infinite impulse response (IIR) filter, which feeds one or more previous control signals back into the controller output, and the changing secondary path dynamics can be tracked using an online modeling technique. In this paper, an IIR filtering based filtered-U LMS (FULMS) controller is combined with an online secondary path modeling algorithm to suppress the vibrations of a plate-like structure. The results are validated through numerical and experimental studies, and show that the FULMS with online secondary path modeling approach has better vibration rejection capabilities and a higher convergence rate than its FXLMS counterpart.
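
    The filtered-reference idea behind FXLMS, which the FULMS controller above extends, can be illustrated in a few lines of code. This is a minimal single-channel sketch assuming a fixed, perfectly known secondary-path estimate; the IIR (FULMS) structure and the online secondary-path modeling of the paper are not reproduced, and the path coefficients and signals are toy values.

```python
# Minimal FXLMS sketch (FIR controller, offline secondary-path estimate).
import numpy as np

rng = np.random.default_rng(3)
N = 20000
x = np.sin(2 * np.pi * 0.01 * np.arange(N)) + 0.1 * rng.standard_normal(N)  # reference

P = np.array([0.0, 0.8, 0.3, -0.1])   # primary path (toy FIR)
S = np.array([0.0, 0.6, 0.2])         # secondary path (toy FIR)
S_hat = S.copy()                      # assume a perfect offline estimate

L = 16                                # adaptive FIR length
w = np.zeros(L)
mu = 0.01

d = np.convolve(x, P)[:N]             # disturbance at the error sensor
xbuf = np.zeros(L)                    # reference buffer
ybuf = np.zeros(len(S))               # controller-output buffer (through S)
sxbuf = np.zeros(len(S_hat))          # reference buffer (through S_hat)
fxbuf = np.zeros(L)                   # filtered-reference buffer
err = np.zeros(N)

for n in range(N):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
    y = w @ xbuf                                  # controller output
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d[n] - S @ ybuf                           # residual at the error sensor
    sxbuf = np.roll(sxbuf, 1); sxbuf[0] = x[n]
    fx = S_hat @ sxbuf                            # filtered reference sample
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx
    w += mu * e * fxbuf                           # FXLMS weight update
    err[n] = e

print("Mean squared error, first vs. last 1000 samples: "
      f"{np.mean(err[:1000]**2):.4f} -> {np.mean(err[-1000:]**2):.4f}")
```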

  19. Performance-based, cost- and time-effective PCB analytical methodology

    International Nuclear Information System (INIS)

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval

  20. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    Science.gov (United States)

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a Study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment. PMID:20131973

  1. A Preisach-Based Nonequilibrium Methodology for Simulating Performance of Hysteretic Magnetic Refrigeration Cycles

    Science.gov (United States)

    Brown, Timothy D.; Bruno, Nickolaus M.; Chen, Jing-Han; Karaman, Ibrahim; Ross, Joseph H.; Shamberger, Patrick J.

    2015-09-01

    In giant magnetocaloric effect (GMCE) materials a large entropy change couples to a magnetostructural first-order phase transition, potentially providing a basis for magnetic refrigeration cycles. However, hysteresis loss greatly reduces the availability of refrigeration work in such cycles. Here, we present a methodology combining a Preisach model for rate-independent hysteresis with a thermodynamic analysis of nonequilibrium phase transformations which, for GMCE materials exhibiting hysteresis, allows an evaluation of refrigeration work and efficiency terms for an arbitrary cycle. Using simplified but physically meaningful descriptors for the magnetic and thermal properties of a Ni45Co5Mn36.6In13.4 at.% single-crystal alloy, we relate these work/efficiency terms to fundamental material properties, demonstrating the method's use as a materials design tool. Following a simple two-parameter model for the alloy's hysteresis properties, we compute and interpret the effect of each parameter on the cyclic refrigeration work and efficiency terms. We show that hysteresis loss is a critical concern in cycles based on GMCE systems, since the resultant lost work can reduce the refrigeration work to zero; however, we also find that the lost work may be mitigated by modifying other aspects of the transition, such as the width over which the one-way transformation occurs.
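
    The rate-independent hysteresis ingredient of the methodology above is the classical scalar Preisach model: a weighted collection of relay hysterons with up/down switching thresholds. The sketch below uses a uniformly weighted hysteron grid driven by a simple field sweep; it is a toy illustration, not the calibrated Preisach model or the thermodynamic analysis of the paper.

```python
# Toy scalar Preisach model: a grid of relay hysterons with switching thresholds
# (alpha >= beta) and uniform weights, driven by a field sweep.
import numpy as np

n = 60
levels = np.linspace(-1.0, 1.0, n)
A, B = np.meshgrid(levels, levels, indexing="ij")   # A: up-switch, B: down-switch
mask = A >= B                                       # Preisach half-plane
state = -np.ones_like(A)                            # all hysterons start "down"

def step(h, state):
    """Update relay states for applied field h and return the normalised output."""
    state = state.copy()
    state[(A <= h) & mask] = 1.0      # switch up where alpha <= h
    state[(B >= h) & mask] = -1.0     # switch down where beta >= h
    return state, state[mask].mean()

# Field protocol: ramp up, down, and up again to trace a hysteresis loop
fields = np.concatenate([np.linspace(-1, 1, 100),
                         np.linspace(1, -1, 100),
                         np.linspace(-1, 1, 100)])
outputs = []
for h in fields:
    state, m = step(h, state)
    outputs.append(m)

print("Output at the end of the first up-sweep:        %+.2f" % outputs[99])
print("Output near zero field on the down-sweep:       %+.2f" % outputs[149])
```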

  2. Looking for phase-space structures in star-forming regions: an MST-based methodology

    Science.gov (United States)

    Alfaro, Emilio J.; González, Marta

    2016-03-01

    We present a method for analysing the phase space of star-forming regions. In particular we are searching for clumpy structures in the 3D sub-space formed by two position coordinates and radial velocity. The aim of the method is the detection of kinematically segregated radial velocity groups, that is, radial velocity intervals whose associated stars are spatially concentrated. To this end we define a kinematic segregation index, Λ̃(RV), based on the Minimum Spanning Tree graph algorithm, which is estimated for a set of radial velocity intervals in the region. When Λ̃(RV) is significantly greater than 1 we consider that this bin represents a grouping in the phase space. We split a star-forming region into radial velocity bins and calculate the kinematic segregation index for each bin, and then we obtain the spectrum of kinematic groupings, which enables a quick visualization of the kinematic behaviour of the region under study. We carried out numerical models of different configurations in the sub-space of the phase space formed by the position coordinates and the radial velocity, which various case studies illustrate. The analysis of the test cases demonstrates the potential of the new methodology for detecting different kinds of groupings in phase space.
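
    A segregation index of this kind can be sketched by comparing the mean edge length of the minimum spanning tree of the stars in one radial-velocity bin with that of random samples of the same size drawn from the whole region, so that values well above 1 flag a spatially concentrated RV group. The normalisation and uncertainty treatment used by the authors may differ, and the positions and velocities below are synthetic.

```python
# Sketch of an MST-based kinematic segregation index on synthetic data.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_mean_edge(xy):
    dist = squareform(pdist(xy))
    mst = minimum_spanning_tree(dist).toarray()
    return mst[mst > 0].mean()

def segregation_index(xy_all, in_bin, n_random=200, rng=None):
    rng = rng or np.random.default_rng()
    l_bin = mst_mean_edge(xy_all[in_bin])
    k = int(in_bin.sum())
    l_rand = [mst_mean_edge(xy_all[rng.choice(len(xy_all), k, replace=False)])
              for _ in range(n_random)]
    return np.mean(l_rand) / l_bin

rng = np.random.default_rng(4)
xy = rng.uniform(0, 10, size=(300, 2))            # star positions (e.g. pc)
rv = rng.normal(0, 3, size=300)                   # radial velocities (km/s)
# Plant a spatially concentrated group sharing similar radial velocities
xy[:30] = rng.normal([5, 5], 0.3, size=(30, 2))
rv[:30] = rng.normal(6.0, 0.2, size=30)

in_bin = (rv > 5.5) & (rv < 6.5)                  # one radial-velocity bin
print(f"Segregation index for this RV bin: {segregation_index(xy, in_bin, rng=rng):.2f}")
```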

  3. METHODOLOGICAL BASES OF THE OPTIMIZATION OF ORGANIZATIONAL MANAGEMENT STRUCTURE AT IMPLEMENTING THE MAJOR CONSTRUCTION ENTERPRISE STRATEGY

    Directory of Open Access Journals (Sweden)

    Rodionova Svetlana Vladimirovna

    2015-09-01

    Full Text Available Planning and implementing innovations at the micro level of management, and at higher levels, is a process of implementing a portfolio of innovative projects. Project management is aimed at a goal; therefore, defining the mission and aims of implementation is of primary importance. These are part of the notion of the development strategy of an enterprise. Creating a strategy for big construction holding companies is complicated by the necessity to account for the different factors affecting each business block and subsidiary company. The authors specify an algorithm for the development and implementation of the activity strategy of a big construction enterprise. The special importance of matching the organizational management structure to the implemented strategy is shown. The innovative character of organizational structure change is justified. The authors offer methods to optimize the organizational management structure based on a communication approach that uses elements of graph theory. The proposed methodological provisions are tested on the example of the Russian JSC "RZhDstroy".

  4. Development of evidence-based performance measures for bipolar disorder: overview of methodology.

    Science.gov (United States)

    Golden, William E; Hermann, Richard C; Jewell, Mark; Brewster, Cheryl

    2008-05-01

    The STAndards for BipoLar Excellence (STABLE) Project was organized in 2005 to improve quality of care for bipolar disorder by developing and testing a set of evidence-based clinical process performance measures related to identifying, assessing, managing, and coordinating care for bipolar disorder. This article first briefly reviews the literature on the science of performance measurement and the use of performance measures as a tool for quality improvement. It then presents a detailed overview of the methodology used to develop the STABLE performance measures. Steps included choosing a clinical area to be measured, selecting key aspects of care for measurement, designing specifications for the measures, developing a data collection strategy, testing the scientific strength (validity, reliability, feasibility) of the measures, and obtaining, analyzing, and reporting conformance findings for the measures. Five of the STABLE measures have been endorsed by the National Quality Forum as part of their Standardizing Ambulatory Care Performance Measures project: screening for bipolar mania/hypomania in patients diagnosed with depression, assessment for risk of suicide, assessment for substance use, screening for hyperglycemia when atypical antipsychotic agents are prescribed, and monitoring change in level of functioning in response to treatment. Additional STABLE measures will be submitted to appropriate national organizations in the future. It is hoped that these measures will be used in quality assessment activities and that the results will inform efforts to improve care for individuals with bipolar disorder. PMID:18677196

  5. Low-Cost Fault Tolerant Methodology for Real Time MPSoC Based Embedded System

    Directory of Open Access Journals (Sweden)

    Mohsin Amin

    2014-01-01

    Full Text Available We propose a design methodology for a fault-tolerant homogeneous MPSoC with the additional design objectives of low hardware overhead and high performance. We have implemented three different FT methodologies on MPSoCs and compared them against the defined constraints. The comparison of these FT methodologies is carried out by modelling their architectures in VHDL-RTL on a Spartan 3 FPGA. The results obtained through simulations helped us to identify the most suitable scheme in terms of the given design constraints.

  6. A Probabilistic Transmission Expansion Planning Methodology based on Roulette Wheel Selection and Social Welfare

    CERN Document Server

    Gupta, Neeraj; Kalra, Prem Kumar

    2012-01-01

    A new probabilistic methodology for transmission expansion planning (TEP) that does not require a priori specification of new/additional transmission capacities and uses the concept of social welfare has been proposed. Two new concepts have been introduced in this paper: (i) roulette wheel methodology has been used to calculate the capacity of new transmission lines and (ii) load flow analysis has been used to calculate expected demand not served (EDNS). The overall methodology has been implemented on a modified IEEE 5-bus test system. Simulations show an important result: addition of only new transmission lines is not sufficient to minimize EDNS.
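
    Roulette-wheel (fitness-proportionate) selection, the first of the two concepts introduced above, is a generic sampling technique and can be sketched in a few lines. How the paper maps candidate transmission-line capacities and their merits onto the wheel is not reproduced here; the capacities and weights below are invented.

```python
# Generic roulette-wheel (fitness-proportionate) selection sketch.
import numpy as np

def roulette_wheel(values, weights, rng=None):
    """Pick one of `values` with probability proportional to `weights`."""
    rng = rng or np.random.default_rng()
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()
    return values[rng.choice(len(values), p=p)]

candidate_capacities_mw = [100, 200, 300, 400]     # hypothetical new-line capacities
weights = [0.1, 0.4, 0.3, 0.2]                     # e.g. derived from expected benefit
rng = np.random.default_rng(5)
draws = [roulette_wheel(candidate_capacities_mw, weights, rng) for _ in range(10)]
print("Sampled capacities for new lines (MW):", draws)
```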

  7. Forecasting dose-time profiles of solar particle events using a dosimetry-based forecasting methodology

    Science.gov (United States)

    Neal, John Stuart

    2001-10-01

    A dosimetry-based Bayesian methodology for forecasting astronaut radiation doses in deep space due to radiologically significant solar particle event proton fluences is developed. Three non-linear sigmoidal growth curves (Gompertz, Weibull, logistic) are used with hierarchical, non-linear regression models to forecast solar particle event dose-time profiles from doses obtained early in the development of the event. Since there are no detailed measurements of dose versus time for actual events, surrogate dose data are provided by calculational methods. Proton fluence data are used as input to the deterministic, coupled neutron-proton space radiation computer code, BRYNTRN, for transporting protons and their reaction products (protons, neutrons, 2H, 3H, 3He, and 4He) through aluminum shielding material and water. Calculated doses and dose rates for ten historical solar particle events are used as the input data by grouping similar historical solar particle events, using asymptotic dose and maximum dose rate as the grouping criteria. These historical data are then used to lend strength to predictions of dose and dose rate-time profiles for new solar particle events. Bayesian inference techniques are used to make parameter estimates and predictive forecasts. Markov Chain Monte Carlo (MCMC) methods are used to sample from the posterior distributions. Hierarchical, non-linear regression models provide useful predictions of asymptotic dose and dose-time profiles for the November 8, 2000 and August 12, 1989 solar particle events. Predicted dose rate-time profiles are adequate for the November 8, 2000 solar particle event. Predictions of dose rate-time profiles for the August 12, 1989 solar particle event suffer due to a more complex dose rate-time profile. Forecasts provide a valuable tool to space operations planners when making recommendations concerning operations in which radiological exposure might jeopardize personal safety or mission completion. This work
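
    The three sigmoidal growth curves named above have simple closed forms, and the forecasting idea, fitting them to doses observed early in an event to extrapolate the asymptotic dose, can be illustrated with an ordinary least-squares fit. The sketch below uses synthetic dose data and does not reproduce the hierarchical Bayesian/MCMC machinery of the thesis.

```python
# Sketch of the Gompertz, Weibull and logistic dose growth curves and a simple
# early-time fit; data, parameter values and units are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, d_inf, b, k):
    return d_inf * np.exp(-b * np.exp(-k * t))

def weibull(t, d_inf, lam, k):
    return d_inf * (1.0 - np.exp(-(t / lam) ** k))

def logistic(t, d_inf, k, t0):
    return d_inf / (1.0 + np.exp(-k * (t - t0)))

rng = np.random.default_rng(6)
t = np.linspace(1, 30, 30)                          # hours after event onset
dose_true = gompertz(t, d_inf=2.0, b=5.0, k=0.25)   # cumulative dose (made-up asymptote)
dose_obs = dose_true + rng.normal(0, 0.02, t.size)

# Fit using only the first 12 hours to mimic an early forecast
popt, _ = curve_fit(gompertz, t[:12], dose_obs[:12], p0=[1.0, 3.0, 0.1], maxfev=10000)
print("Forecast asymptotic dose: %.2f (true value 2.00)" % popt[0])
```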

  8. An AFM-based methodology for measuring axial and radial error motions of spindles

    International Nuclear Information System (INIS)

    This paper presents a novel atomic force microscopy (AFM)-based methodology for measurement of axial and radial error motions of a high precision spindle. Based on a modified commercial AFM system, the AFM tip is employed as a cutting tool by which nano-grooves are scratched on a flat surface with the rotation of the spindle. By extracting the radial motion data of the spindle from the scratched nano-grooves, the radial error motion of the spindle can be calculated after subtracting the tilting errors from the original measurement data. Through recording the variation of the PZT displacement in the Z direction in AFM tapping mode during the spindle rotation, the axial error motion of the spindle can be obtained. Moreover the effects of the nano-scratching parameters on the scratched grooves, the tilting error removal method for both conditions and the method of data extraction from the scratched groove depth are studied in detail. The axial error motion of 124 nm and the radial error motion of 279 nm of a commercial high precision air bearing spindle are achieved by this novel method, which are comparable with the values provided by the manufacturer, verifying this method. This approach does not need an expensive standard part as in most conventional measurement approaches. Moreover, the axial and radial error motions of the spindle can both be obtained, indicating that this is a potential means of measuring the error motions of the high precision moving parts of ultra-precision machine tools in the future. (paper)

  9. Methodological Bases for Ranking the European Union Countries in Terms of Macroeconomic Security

    Directory of Open Access Journals (Sweden)

    Tymoshenko Olena V.

    2015-11-01

    Full Text Available The fundamental contradictions of existing methodical approaches to assessing the level of state economic security have been substantiated, and proposals on the introduction of a unified methodology for its assessment, which would be acceptable for use at the international level or for a specific cluster of countries, have been developed. Based on the conducted research it has been found that there are no unified criteria for such a classification of countries. To determine the most significant coefficients and critical values of the indicators of economic security, it is appropriate that the countries should be grouped in terms of the level of economic development proposed by the UN Commission and the IMF. Analysis of the economic security level has been conducted for the member countries of the European Union as a separate cluster of countries on the example of macroeconomic security indicators. Based on the evaluation it has been found that the proposed list of indicators and their critical values is economically sound and built on the principles of adequacy, representativeness and comprehensiveness. In 2004 the most secure countries of the EU, corresponding to the macroeconomic security standards, were Austria, Denmark, Sweden and Finland, while by 2014 the percentage of absolutely secure countries had decreased from 14.3 to 7.1%, and only Denmark and Sweden remained in the ranking. During the analyzed period Bulgaria and Croatia moved into the risk zone, while Estonia, Lithuania, Latvia and Romania were in the danger zone. In 2014 Ukraine, in terms of its macroeconomic security, was in a critical state, which testified to serious structural and systemic imbalances in its development.

  10. Combined methodology of optimization and life cycle inventory for a biomass gasification based BCHP system

    International Nuclear Information System (INIS)

    Biomass gasification based building cooling, heating, and power (BCHP) system is an effective distributed energy system to improve the utilization of biomass resources. This paper proposes a combined methodology of optimization method and life cycle inventory (LCI) for the biomass gasification based BCHP system. The life cycle models including biomass planting, biomass collection-storage-transportation, BCHP plant construction and operation, and BCHP plant demolition and recycle, are constructed to obtain economic cost, energy consumption and CO2 emission in the whole service-life. Then, the optimization model for the biomass BCHP system including variables, objective function and solution method are presented. Finally, a biomass BCHP case in Harbin, China, is optimized under different optimization objectives, the life-cycle performances including cost, energy and CO2 emission are obtained and the grey incidence approach is employed to evaluate their comprehensive performances of the biomass BCHP schemes. The results indicate that the life-cycle cost, energy efficiency and CO2 emission of the biomass BCHP system are about 41.9 $ MWh−1, 41% and 59.60 kg MWh−1 respectively. The optimized biomass BCHP configuration to minimize the life-cycle cost is the best scheme to achieve comprehensive benefit including cost, energy consumption, renewable energy ratio, steel consumption, and CO2 emission. - Highlights: • Propose the combined method of optimization and LCI for biomass BCHP system. • Optimize the biomass BCHP system to minimize the life-cycle cost, energy and emission. • Obtain the optimized life-cycle cost, energy efficiency and CO2 emission. • Select the best biomass BCHP scheme using grey incidence approach

  11. Monitoring and analysis of nuclear power plant signals based on nonlinear dynamical methodology

    Energy Technology Data Exchange (ETDEWEB)

    Suzudo, Tomoaki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; Tuerkcan, E.; Verhoef, H.

    1997-03-01

    The spatial correlation of a trajectory in the state space drawn by a dynamical system gives the information (fractal) dimension of the system, and enables the onset of limit-cycle (or persistent) oscillation to be examined. An on-line monitoring system based on this methodology was established for a PWR-type nuclear power plant (NPP). The potential use of this methodology in monitoring NPPs was tested in two actual situations in which anomalies developed. The spatial correlations of various NPP signals under normal operating conditions were calculated, yielding information unobtainable by conventional linear methodology. For example, the results indicated that the coolant pump vibration was not linearly stabilized, that is, it was a limit-cycle oscillation. A swelling oscillation in the pressurizer pressure was also discovered by this methodology. (author)
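
    The "spatial correlation of a trajectory in state space" referred to above is commonly estimated with a Grassberger-Procaccia style correlation sum, whose log-log slope approximates the correlation (fractal) dimension. The sketch below applies this to a toy signal with an arbitrary time-delay embedding; it is not the plant monitoring system itself.

```python
# Minimal correlation-dimension sketch: correlation sum C(r) over a time-delay
# embedding of a scalar signal and its log-log slope.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(signal, dim=3, lag=5, radii=None):
    # Time-delay embedding of a scalar signal
    n = len(signal) - (dim - 1) * lag
    emb = np.column_stack([signal[i * lag:i * lag + n] for i in range(dim)])
    d = pdist(emb)
    if radii is None:
        radii = np.logspace(np.log10(np.percentile(d, 2)),
                            np.log10(np.percentile(d, 50)), 10)
    c = np.array([(d < r).mean() for r in radii])   # correlation sum C(r)
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    return slope

t = np.linspace(0, 100, 2000)
signal = np.sin(t) + 0.5 * np.sin(3.1 * t)          # toy persistent oscillation
print(f"Estimated correlation dimension: {correlation_dimension(signal):.2f}")
```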

  12. Application of a new methodology based on the uncertainties statistical propagation for evaluation of DNB limits

    International Nuclear Information System (INIS)

    This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard thermal design procedure (STDP) method currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRAIIIC/MIT code, modified to Angra-1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author). 11 refs., 2 tabs

  13. Testing fully depleted CCD

    Science.gov (United States)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern may be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
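
    The photon transfer curve mentioned above relates the variance of a flat-field signal to its mean; in the shot-noise dominated regime the slope of that relation is the inverse gain, so fitting it yields the gain in e-/ADU. The sketch below runs on simulated flat-field pairs, not on PAU data, and the gain and read-noise values are invented.

```python
# Photon transfer curve (PTC) sketch: recover the gain from the mean-variance
# relation of simulated flat-field pairs.
import numpy as np

rng = np.random.default_rng(8)
true_gain = 1.5                                   # e-/ADU (hypothetical)
read_noise_adu = 4.0

means, variances = [], []
for electrons in np.linspace(500, 40000, 15):     # increasing flat-field levels
    # Two flats per level; differencing removes fixed-pattern noise
    f1 = rng.poisson(electrons, size=(200, 200)) / true_gain + rng.normal(0, read_noise_adu, (200, 200))
    f2 = rng.poisson(electrons, size=(200, 200)) / true_gain + rng.normal(0, read_noise_adu, (200, 200))
    means.append((f1.mean() + f2.mean()) / 2)
    variances.append(np.var(f1 - f2) / 2)          # variance of a single frame

slope, _ = np.polyfit(means, variances, 1)         # var = mean / gain + const
print(f"Recovered gain: {1 / slope:.2f} e-/ADU (true {true_gain})")
```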

  14. Rule-based Expert Systems for Selecting Information Systems Development Methodologies

    OpenAIRE

    Abdel Nasser H. Zaied; Samah Ibrahim Abdel Aal; Mohamed Monir Hassan

    2013-01-01

    Information Systems (IS) are increasingly regarded as crucial to an organization's success. Information Systems Development Methodologies (ISDMs) are used by organizations to structure the information system development process. ISDMs are essential for structuring project participants' thinking and actions; therefore ISDMs play an important role in achieving successful projects. There are different ISDMs and no methodology can claim that it can be applied to any organization. The probl...

  15. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    constraints) problem. Accordingly the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic...... satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights to the solution of IPDC problems in chemical engineering practice....

  16. Towards a common methodology to simulate tree mortality based on ring-width data

    Science.gov (United States)

    Cailleret, Maxime; Bigler, Christof; Bugmann, Harald; Davi, Hendrik; Minunno, Francesco; Peltoniemi, Mikko; Martínez-Vilalta, Jordi

    2015-04-01

    Individual mortality is a key process of population and community dynamics, especially for long-lived species such as trees. As rates of background vegetation mortality and of massive diebacks have accelerated during recent decades, and are expected to continue to rise with increasing temperature and drought, there is a growing demand for early warning signals that indicate when the likelihood of death is very high. Although physiological indicators have a high potential to predict tree mortality, their development requires intensive tree monitoring, which cannot currently be carried out on a representative sample of a population or across several species. An easier approach is to use radial growth data such as tree-ring width measurements. During the last decades, an increasing number of studies have aimed to derive such growth-mortality functions. However, because they followed different approaches concerning the choice of the sampling strategy (number of dead and living trees), the type of growth explanatory variables (growth level, growth trend variables, etc.), and the length of the time-window (number of rings before death) used to calculate them, it is difficult to compare results among studies and to draw biological interpretations from them. We detail a new methodology for assessing reliable tree-ring based growth-mortality relationships using binomial logistic regression models. As examples we used published tree-ring datasets from Abies alba growing in 13 different sites, and from Nothofagus dombeyi and Quercus petraea located in a single site. Our first approach, based on constant samplings, aims to (1) assess the dependency of growth-mortality relationships on the statistical sampling scheme used; (2) determine the best length of the time-window used to calculate each growth variable; and (3) reveal the presence of intra-specific shifts in growth-mortality relationships. We also followed a Bayesian approach to build the best multi-variable logistic model considering
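
    The binomial logistic regression at the heart of such growth-mortality functions can be sketched with two typical explanatory variables, a growth-level and a growth-trend term computed over the last n rings. The window length, the variables and the ring-width data below are illustrative choices, not the calibrated models of the study.

```python
# Sketch of a ring-width based growth-mortality model: binomial logistic
# regression of tree status (1 = died) on growth-level and growth-trend variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n_trees, n_years, window = 400, 50, 10

# Synthetic ring-width series (mm); dying trees get a late growth decline
rings = rng.gamma(shape=4.0, scale=0.5, size=(n_trees, n_years))
died = rng.random(n_trees) < 0.3
rings[died, -window:] *= np.linspace(1.0, 0.4, window)

recent = rings[:, -window:]
growth_level = recent.mean(axis=1)                                       # mean recent growth
years = np.arange(window)
growth_trend = np.array([np.polyfit(years, r, 1)[0] for r in recent])    # slope of recent growth

X = np.column_stack([growth_level, growth_trend])
model = LogisticRegression().fit(X, died.astype(int))
print("Coefficients (level, trend):", np.round(model.coef_[0], 2))
print("In-sample mortality classification accuracy: %.2f" % model.score(X, died))
```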

  17. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified to Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)
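
    The statistical propagation of uncertainties referred to in the title can be illustrated generically: uncertain inputs are sampled from best-estimate distributions, propagated through a DNBR model, and a limit is read from a low percentile of the resulting distribution. The sketch below is not the licensed mini-RTDP procedure; the surrogate DNBR response and every distribution in it are invented for illustration.

```python
# Generic Monte Carlo propagation of uncertainties to a DNBR-style limit.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical uncertain inputs (relative to nominal)
power = rng.normal(1.00, 0.02, n)         # core power measurement
flow = rng.normal(1.00, 0.03, n)          # coolant flow
t_inlet = rng.normal(1.00, 0.01, n)       # inlet temperature
corr = rng.normal(1.00, 0.05, n)          # CHF correlation uncertainty

# Placeholder DNBR response around a nominal value of 2.0 (not a real correlation)
dnbr = 2.0 * corr * (flow ** 0.8) / (power * t_inlet ** 1.5)

limit_95 = np.percentile(dnbr, 5)         # value exceeded with 95% probability
print(f"Nominal DNBR: 2.00, 95th-percentile-based limit: {limit_95:.2f}")
```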

  18. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren;

    The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400Å) of polystyrene as cushions between a crystalline carrier and biomimetic membranes deposited thereupon and exposed to bulk water. While monitoring the sequential build-up of the sandwiched composite structure by continuous neutron reflectivity experiments, the formation of an unexpected additional layer was detected (1). Located at the polystyrene surface, in between the polymer cushion and bulk water, the layer was attributed to water of reduced density and was called "depletion layer". Impurities or preparative artefacts were excluded as its origin. Later on, the formation of nanobubbles from this vapour-like water phase was initiated by tipping the...

  19. Depletion of intense fields

    CERN Document Server

    Seipt, D; Marklund, M; Bulanov, S S

    2016-01-01

    The interaction of charged particles and photons with intense electromagnetic fields gives rise to multi-photon Compton and Breit-Wheeler processes. These are usually described in the framework of the external field approximation, where the electromagnetic field is assumed to have infinite energy. However, the multi-photon nature of these processes implies the absorption of a significant number of photons, which scales as the external field amplitude cubed. As a result, the interaction of a highly charged electron bunch with an intense laser pulse can lead to significant depletion of the laser pulse energy, thus rendering the external field approximation invalid. We provide relevant estimates for this depletion and find it to become important in the interaction between fields of amplitude $a_0 \\sim 10^3$ and electron bunches with charges of the order of nC.

  20. Capital expenditure and depletion

    International Nuclear Information System (INIS)

    In the future, the increase in oil demand will be covered for the most part by non-conventional oils, but conventional sources will continue to represent a preponderant share of the world oil supply. Their depletion represents a complex challenge involving technological, economic and political factors. At the same time, there is reason for concern about the decrease in exploration budgets at the major oil companies. (author)

  1. Revised Design-Based Research Methodology for College Course Improvement and Application to Education Courses in Japan

    Science.gov (United States)

    Akahori, Kanji

    2011-01-01

    The author describes a research methodology for college course improvement, and applies the results to education courses. In Japan, it is usually difficult to carry out research on college course improvement, because faculty cannot introduce experimental design approaches based on control and treatment groupings of students in actual classroom…

  2. Fully Depleted Charge-Coupled Devices

    International Nuclear Information System (INIS)

    We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200-300 μm, with corresponding high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications

  3. Effective diagnosis of Alzheimer’s disease by means of large margin-based methodology

    Directory of Open Access Journals (Sweden)

    Chaves Rosa

    2012-07-01

    Full Text Available Abstract Background: Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. Methods: A novel combination of feature extraction techniques to improve the diagnosis of AD is proposed. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to be located within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the two latter also analysed with a LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. Results: Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: (i) linear transformation of the PLS or PCA reduced data, (ii) feature reduction technique, and (iii) classifier (with Euclidean, Mahalanobis or Energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when a NMSE-PLS-LMNN feature extraction method was used in combination with a SVM classifier, thus outperforming recently reported baseline methods. Conclusions: All the proposed methods turned out to be a valid solution for the presented problem. One of the advances is the robustness of the LMNN algorithm that not only provides higher separation rate between
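
    The generic back end of such a CAD pipeline, dimensionality reduction followed by a kernel SVM evaluated with k-fold cross-validation, can be sketched as follows. PCA stands in here for the feature-reduction step; the NMSE/ROI feature extraction and the LMNN metric learning of the paper are not reproduced, and the data are random stand-ins.

```python
# Sketch of a PCA + kernel-SVM classification pipeline with k-fold cross-validation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(10)
n_subjects, n_voxels = 90, 5000
X = rng.standard_normal((n_subjects, n_voxels))     # stand-in for image-derived features
y = rng.integers(0, 2, n_subjects)                  # 0 = control, 1 = AD
X[y == 1, :50] += 0.8                               # inject a weak group difference

clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=10)          # 10-fold cross-validation
print("Mean CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```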

  4. Innovative development methodology based on the Toyota Way; Innovative Entwicklungsmethodik basierend auf dem Toyota Way

    Energy Technology Data Exchange (ETDEWEB)

    Ueda, T. [Toyota Motor Corp., Aichi (Japan)]

    2007-07-01

    Since its foundation, Toyota has been innovating in each process of automobile production based on the ideas expressed in the Toyoda Precepts. These innovations are not only innovations in the methodology for various development processes, but also in the company philosophy models cultivated by development engineers during their daily work. These are known as the ''Toyota Way'' and its elements, which include ''Continuous Improvement'', ''Challenge'', ''Respect for People'', ''Genchi Genbutsu'' and ''Teamwork''. Various technological innovations have already progressed in the process from engine development to production. Examples are the use of CAD (computer aided design) and structure analysis based on 3-D CAD data in hardware development, CFD (computational fluid dynamics) and visualization techniques in combustion system development, MBD (model based development), the related DOE (design of experiments) and new algorithms in software development, as well as new production strategies linked to the new techniques mentioned above. The development of Toyota's direct injection gasoline engine (Toyota D-4 engine) originated from a pre-combustion chamber system in the 1970s, with the aim of purifying the exhaust gases at a time when the future of catalyst technology was unpromising. This technology could not be mass-produced, but it subsequently led on to the R and D of DISC (direct injection stratified charge). In 1996, the stratified charge lean combustion gasoline DI engine was launched as the first-generation Toyota D-4 engine. After the second-generation D-4 engine was developed, this technology evolved into the dual injection D-4 system (D-4S), which combines it with homogeneous charge stoichiometric combustion and port fuel injection. During this progress, innovative development methods have been used and improvements in production

  5. A study of polar ozone depletion based on sequential assimilation of satellite data from the ENVISAT/MIPAS and Odin/SMR instruments

    Directory of Open Access Journals (Sweden)

    J. D. Rösevall

    2007-01-01

    Full Text Available The objective of this study is to demonstrate how polar ozone depletion can be mapped and quantified by assimilating ozone data from satellites into the wind-driven transport model DIAMOND (Dynamical Isentropic Assimilation Model for OdiN Data). By assimilating a large set of satellite data into a transport model, ozone fields can be built up that are less noisy than the individual satellite ozone profiles. The transported fields can subsequently be compared to later sets of incoming satellite data so that the rates and geographical distribution of ozone depletion can be determined. By tracing the amounts of solar irradiation received by different air parcels in a transport model, it is furthermore possible to study the photolytic reactions that destroy ozone. In this study, the destruction of ozone that took place in the Antarctic winter of 2003 and in the Arctic winter of 2002/2003 has been examined by assimilating ozone data from the ENVISAT/MIPAS and Odin/SMR satellite instruments. Large-scale depletion of ozone was observed in the Antarctic polar vortex of 2003 when sunlight returned after the polar night. By mid October, ENVISAT/MIPAS data indicate vortex ozone depletion in the ranges 80–100% and 70–90% on the 425 and 475 K potential temperature levels, respectively, while the Odin/SMR data indicate depletion in the ranges 70–90% and 50–70%. The discrepancy between the two instruments has been attributed to systematic errors in the Odin/SMR data. Assimilated fields of ENVISAT/MIPAS data indicate ozone depletion in the range 10–20% on the 475 K potential temperature level (~19 km altitude) in the central regions of the 2002/2003 Arctic polar vortex. Assimilated fields of Odin/SMR data, on the other hand, indicate ozone depletion in the range 20–30%.

  6. A methodological approach to characterise Landslide Periods based on historical series of rainfall and landslide damage

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-10-01

    Full Text Available Landslide Periods (LPs) are defined as periods, shorter than a hydrological year, during which one or more landslide damage events occur in one or more sectors of a study area. In this work, we present a methodological approach, based on the comparative analysis of historical series of landslide damage and daily rainfall data, aiming to characterise the main types of LPs affecting selected areas. Cumulative rainfall preceding landslide activation is assessed for short (1, 2, 3, and 5 days), medium (7, 10, and 30 days) and long (60, 90, and 180 days) durations, and the corresponding Return Periods (RPs) are assessed and ranked into three classes (Class 1: RP=5–10 years; Class 2: RP=11–15 years; Class 3: RP>15 years). To assess landslide damage, the Simplified Damage Index (SDI) is introduced. This represents classified landslide losses and is obtained by multiplying the value of the damaged element and the percentage of damage affecting it. The comparison of the RP of rainfall and the SDI allows us to identify the different types of LPs that affected the study area in the past and that could affect it again in the future.

    The results of this activity can be used for practical purposes to define scenarios and strategies for risk management, to suggest priorities in policy towards disaster mitigation and preparedness, and to prepare defensive measures and civil protection plans ranked according to the types of LPs that must be managed.

    We present an application, performed for a 39-year series of rainfall/landslide damage data concerning a study area located in NE Calabria (Italy); in this case study, we identify four main types of LPs, which are ranked according to damage severity.
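    A minimal sketch of the two quantities this record defines, with invented input values, is shown below: the Simplified Damage Index as the product of element value and damage percentage, and the ranking of a rainfall return period into the three classes listed above.

```python
# Minimal sketch of the quantities described in the record: the Simplified
# Damage Index (value of the damaged element x percentage of damage) and the
# three return-period classes. All input values below are invented examples.

def sdi(element_value, damage_fraction):
    """Simplified Damage Index for one damaged element."""
    return element_value * damage_fraction

def rp_class(return_period_years):
    """Class 1: RP 5-10 yr, Class 2: RP 11-15 yr, Class 3: RP > 15 yr."""
    if 5 <= return_period_years <= 10:
        return 1
    if 10 < return_period_years <= 15:
        return 2
    if return_period_years > 15:
        return 3
    return None  # below the lowest class

# Example: an element of relative value 3 damaged at 40%, after rainfall whose
# 30-day cumulative value has an estimated 12-year return period.
print(sdi(3, 0.40), rp_class(12.0))
```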

  7. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probability (HEP) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSFs), which have a direct effect on human behavior and thus shape the HEP according to the specific environmental conditions and the personal characteristics that influence these actions. This PSF dependence raises a serious data availability problem, as it renders the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs affecting actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference tabulated data from the current literature in order to show the coherence and validity of the method. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient identification of potential improvements in plant safety conditions, operational procedures and local working conditions (author)
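    The record does not give the rule base or membership functions, so the following is only a generic illustration of the fuzzy machinery involved: triangular memberships for a single PSF score and a weighted defuzzification into an HEP value. All shapes, anchor probabilities and numbers are invented.

```python
# Generic illustration of a fuzzy-inference style HEP estimate. The membership
# functions, linguistic terms and anchor probabilities are invented
# placeholders, not the ones used in the paper.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_hep(psf_score):
    """Map a single PSF score in [0, 10] to an HEP via weighted anchors."""
    weights = {                       # membership of the score in each term
        "good": tri(psf_score, -1, 0, 5),
        "moderate": tri(psf_score, 2, 5, 8),
        "poor": tri(psf_score, 5, 10, 11),
    }
    anchors = {"good": 1e-4, "moderate": 1e-3, "poor": 1e-2}   # invented HEP anchors
    total = sum(weights.values())
    return sum(weights[k] * anchors[k] for k in anchors) / total if total else None

print(estimate_hep(6.5))   # example PSF score for a degraded working environment
```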

  8. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    The purpose of this thesis is to provide a theoretical framework, and its evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. The thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors with four on-the-job operating groups and one expert group who knew the accident sequences. Simulated results for team dynamic task performance, together with the behavior of key reference plant parameters, the team-specific organizational center of gravity and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  9. A methodology for incorporating geomechanically-based fault damage zones models into reservoir simulation

    Science.gov (United States)

    Paul, Pijush Kanti

    In the fault damage zone modeling study for a field in the Timor Sea, I present a methodology to incorporate geomechanically-based fault damage zones into reservoir simulation. In the studied field, production history suggests that the mismatch between actual production and model prediction is due to preferential fluid flow through the damage zones associated with the reservoir scale faults, which is not included in the baseline petrophysical model. I analyzed well data to estimate stress heterogeneity and fracture distributions in the reservoir. Image logs show that stress orientations are homogeneous at the field scale, with a strike-slip/normal faulting stress regime and maximum horizontal stress oriented in the NE-SW direction. Observed fracture zones in wells are mostly associated with well-scale faults and bed boundaries. These zones do not show any anomalies in production logs or well test data, because most of the fractures are not optimally oriented to the present-day stress state, and matrix permeability is high enough to mask any small anomalies from the fracture zones. However, I found that fracture density increases towards the reservoir scale faults, indicating high fracture density zones or damage zones close to these faults, which is consistent with the preferred flow direction indicated by interference and tracer tests done between the wells. It is well known from geologic studies that there is a concentration of secondary fractures and faults in a damage zone adjacent to larger faults. Because there is usually inadequate data to incorporate damage zone fractures and faults into reservoir simulation models, in this study I utilized the principles of dynamic rupture propagation from earthquake seismology to predict the nature of fractured/damage zones associated with reservoir scale faults. The implemented workflow can be used to more routinely incorporate damage zones into reservoir simulation models. Applying this methodology to a real reservoir utilizing

  10. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

    Full Text Available Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basics of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on-campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  11. Ozone-depleting Substances (ODS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This site includes all of the ozone-depleting substances (ODS) recognized by the Montreal Protocol. The data include ozone depletion potentials (ODP), global...

  12. Methodological quality of systematic reviews and clinical trials on women's health published in a Brazilian evidence-based health journal

    OpenAIRE

    Cristiane Rufino Macedo; Rachel Riera; Maria Regina Torloni

    2013-01-01

    OBJECTIVES: To assess the quality of systematic reviews and clinical trials on women's health recently published in a Brazilian evidence-based health journal. METHOD: All systematic reviews and clinical trials on women's health published in the last five years in the Brazilian Journal of Evidence-based Health were retrieved. Two independent reviewers critically assessed the methodological quality of reviews and trials using AMSTAR and the Cochrane Risk of Bias Table, respectively. RESULTS: Sy...

  13. A New Genetic Algorithm Methodology for Design Optimization of Truss Structures: Bipopulation-Based Genetic Algorithm with Enhanced Interval Search

    OpenAIRE

    Tugrul Talaslioglu

    2009-01-01

    A new genetic algorithm (GA) methodology, Bipopulation-Based Genetic Algorithm with Enhanced Interval Search (BGAwEIS), is introduced and used to optimize the design of truss structures with various complexities. The results of BGAwEIS are compared with those obtained by the sequential genetic algorithm (SGA) utilizing a single population, a multipopulation-based genetic algorithm (MPGA) proposed for this study and other existing approaches presented in literature. This study has two goals: o...

  14. Validation of 2DH hydrodynamic and morphological mathematical models. A methodology based on SAR imaging

    Science.gov (United States)

    Canelas, Ricardo; Heleno, Sandra; Pestana, Rita; Ferreira, Rui M. L.

    2014-05-01

    The objective of the present work is to devise a methodology to validate 2DH shallow-water models suitable for simulating flow hydrodynamics and channel morphology. For this purpose, a 2DH mathematical model, assembled at CEHIDRO, IST, is employed to model Tagus river floods over a 70 km reach, and Synthetic Aperture Radar (SAR) images are collected to retrieve planar inundation extents. The model is suited for highly unsteady discontinuous flows over complex, time-evolving geometries, employing a finite-volume discretization scheme based on a flux-splitting technique incorporating a revised version of the Roe Riemann solver. Novel closure terms for the non-equilibrium sediment transport model are included. New boundary conditions are employed, based on the Riemann variables associated with the outgoing characteristic fields, coping with the provided hydrographs in a mathematically coherent manner. A high-resolution Digital Elevation Model (DEM) is used and levee structures are considered as fully erodible elements. Spatially heterogeneous roughness characteristics are derived from land-use databases such as CORINE LandCover 2006. SAR satellite imagery of the floods is available and is used to validate the simulation results, with particular emphasis on the 2000/2001 flood. The delimited areas from the satellite and the simulations are superimposed. The quality of the adjustment depends on the calibration of the roughness coefficients and on the representation of small structures with lengths of the order of the spatial discretization. Flow depths and registered discharges are recovered from the simulation and compared with data from a measuring station in the domain, with the comparison revealing remarkably high accuracy, both in terms of amplitudes and phase. Further inclusion of topographical detail should improve the comparison of flood extents with the satellite data. The validated model was then employed to simulate 100-year floods in the same reach. The
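    The record mentions superimposing the SAR-derived and simulated inundation areas but does not state the overlap measure used; an intersection-over-union fit index on binary inundation masks, assumed here purely for illustration, is one common choice:

```python
# Sketch of a fit index between two binary inundation maps (1 = wet, 0 = dry).
# Intersection-over-union is used here as an assumed overlap measure; the
# record does not specify the authors' metric. The grids are toy examples.
import numpy as np

def fit_index(observed, simulated):
    """|A intersection B| / |A union B| for two boolean inundation masks."""
    a = np.asarray(observed, dtype=bool)
    b = np.asarray(simulated, dtype=bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

sar = np.array([[0, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 0, 0]])
model = np.array([[0, 1, 1, 1],
                  [0, 1, 1, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 0]])
print(round(fit_index(sar, model), 2))   # 0.75 for these toy masks
```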

  15. Safety assessment of a vault-based disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-term Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to test the ISAM safety assessment methodology using data that were as realistic as possible. One of the Test Cases, the Vault Test Case (VTC), related to the disposal of low level radioactive waste (LLW) in a hypothetical facility comprising a set of above-surface vaults. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the VTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  16. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  17. Reliability analysis of repairable system based on GO-FLOW methodology

    International Nuclear Information System (INIS)

    A quantitative analysis method named GO-FLOW is introduced to analyze the reliability of systems with maintenance priorities and a limited number of repairmen. An approximate formula model that can be applied in the GO-FLOW calculation is derived for the reliability parameters of a repairable assembly. The model's feasibility is then validated and its error is analyzed. An example of a redundant pump component is presented, and the result achieved by GO-FLOW is compared with that obtained by the GO methodology. The results show that the GO-FLOW methodology can be used for quantitative analysis of this sort of repairable system, that the GO-FLOW model is effective, and that its algorithm is more convenient than that of the GO methodology. (authors)

  18. Parametric Identification of Solar Series based on an Adaptive Parallel Methodology

    Indian Academy of Sciences (India)

    Juan A. Gómez Pulido; Miguel A. Vega Rodríguez; Juan M. Sánchez Pérez

    2005-03-01

    In this work we present an adaptive parallel methodology to optimize the identification of time series through parametric models, applying it to the case of sunspot series. We employ high-precision computation of system identification algorithms, and use recursive least squares processing and ARMAX (AutoRegressive Moving Average with eXogenous inputs) parametric modelling. This methodology could be very useful when high-precision mathematical modelling of dynamic complex systems is required. After explaining the proposed heuristics and the tuning of its parameters, we show the results we have found for several solar series using different implementations. Thus, we demonstrate how the result precision improves.
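    As a compact illustration of the recursive least squares ingredient mentioned above (an AR structure is used instead of a full ARMAX model for brevity, and the series, orders and forgetting factor are placeholders rather than the authors' settings):

```python
# Sketch of recursive least squares identification of a simple AR(2) model,
# the kind of building block the record refers to. A full ARMAX model adds
# moving-average and exogenous terms; data, orders and settings are placeholders.
import numpy as np

rng = np.random.default_rng(1)
true_a = np.array([1.5, -0.7])                 # "true" AR coefficients
y = np.zeros(300)
for t in range(2, 300):                        # synthetic stand-in for a solar series
    y[t] = true_a @ [y[t-1], y[t-2]] + 0.1 * rng.normal()

theta = np.zeros(2)                            # parameter estimate [a1, a2]
P = np.eye(2) * 1e3                            # covariance of the estimate
lam = 0.99                                     # forgetting factor
for t in range(2, 300):
    phi = np.array([y[t-1], y[t-2]])           # regressor
    k = P @ phi / (lam + phi @ P @ phi)        # gain
    theta = theta + k * (y[t] - phi @ theta)   # update with the prediction error
    P = (P - np.outer(k, phi @ P)) / lam       # update covariance

print("estimated AR coefficients:", theta)     # should approach [1.5, -0.7]
```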

  19. Environmental external effects from wind power based on the EU ExternE methodology

    DEFF Research Database (Denmark)

    Ibsen, Liselotte Schleisner; Nielsen, Per Sieverts

    1998-01-01

    The European Commission has launched a major study project, ExternE, to develop a methodology to quantify externalities. A “National Implementation Phase” was started under the Joule II programme with the purpose of implementing the ExternE methodology in all member states. The main objective of...... the Danish part of the project is to implement the framework for externality evaluation, for three different power plants located in Denmark. The paper will focus on the assessment of the impacts of the whole fuel cycles for wind, natural gas and biogas. Priority areas for environmental impact...

  20. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    Directory of Open Access Journals (Sweden)

    Górny Z.

    2015-04-01

    Full Text Available Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of the rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the methodology and the results of exploratory research. It also discloses the methodology used in the creation of a knowledge base.

  1. Assessing Source Zone Mass Depletion in Heterogeneous Media: Application of a Multi-Rate-Mass-Transfer Approach Based on a Geostatistical Medium Description

    Science.gov (United States)

    Elenius, M. T.; Miller, E. L.; Abriola, L. M.

    2014-12-01

    Chlorinated solvents tend to persist for long periods in heterogeneous porous media, in part due to sustained sources of contaminant sequestered in lower permeability zones and sorbed to the soil matrix. Sharp contrasts in soil properties have been modeled successfully using Markov Chain / Transition Probability (MC/TP) methods. This statistical approach provides a means of generating permeability fields that are consistent with prior knowledge concerning the frequency and relative positioning of different strata. When assessing source zone mass depletion in a suite of such geological realizations, the large computational burden may prohibit the use of direct numerical simulations. One alternative approach is the application of a multi-rate-mass-transfer (MRMT) method, an extension of the dual-domain concept that was first developed in the soil science literature for sorption modeling. In MRMT, rather than discretizing immobile regions, such as clay layers, the concentration in these regions is treated by explicit state variables, and the transport between mobile and immobile regions is modeled by first-order exchange terms. However, in the implementation of this approach, fine-scale simulations on subdomains are often necessary to develop appropriate parameters. Such simulations are tedious, especially when attempting to account for uncertainty in the geological description. In this work, the link between characteristics of MC/TP and transfer parameters in the MRMT is evaluated by regression based on fine-scale simulations, using the simulator MODFLOW/MT3DMS. Upscaled simulation results are obtained with the same simulator, linked to an MRMT module. The results facilitate efficient assessment of reactive transport in domains with sharp contrasts in soil properties and limit the need for fine-scale numerical simulations.
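    The first-order exchange that MRMT generalizes can be written as dC_im/dt = α(C_m − C_im); a minimal two-box sketch with invented parameters (ignoring advection–dispersion, and using a single immobile domain where MRMT would use several) is:

```python
# Minimal two-box sketch of first-order mobile/immobile mass transfer,
# dC_im/dt = alpha * (C_m - C_im), with a first-order flush of the mobile zone
# standing in for advective removal. Parameters are invented; MRMT uses several
# immobile domains, each with its own rate and capacity.
alpha = 0.05      # mobile/immobile exchange rate [1/day]
flush = 0.2       # removal rate from the mobile zone [1/day]
beta = 1.0        # immobile-to-mobile capacity ratio
dt, n = 0.1, 2000 # time step [day], number of steps

c_m, c_im = 1.0, 1.0          # start with both zones fully contaminated
for _ in range(n):            # explicit Euler integration
    exch = alpha * (c_m - c_im)
    c_m += dt * (-flush * c_m - beta * exch)
    c_im += dt * exch

print("mobile and immobile concentrations after %.0f days: %.4f %.4f" % (n * dt, c_m, c_im))
```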

  2. Stratospheric ozone depletion

    International Nuclear Information System (INIS)

    The amount of stratospheric ozone and the reduction of the ozone layer vary according to season and latitude. At present, total and vertical ozone is monitored over all of Austria. The mean monthly ozone levels between 1994 and 2000 are presented. Data on stratospheric ozone and UV-B radiation are published daily on the home page http://www.lebesministerium.at. The use of ozone-depleting substances such as chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs) is reported, along with the national measures taken to reduce their use. Figs. 2, Tables 2. (nevyjel)

  3. A neutronics methodology for the NIST research reactor based on MCNPX

    International Nuclear Information System (INIS)

    A methodology for calculating inventories for the NBSR has been developed using the MCNPX computer code with the BURN option. A major advantage of the present methodology over the previous methodology, where MONTEBURNS and MCNP5 were used, is that more materials can be included in the model. The NBSR has 30 fuel elements each with a 17.8 cm (7 in) gap in the middle of the fuel. In the startup position, the shim control arms are partially inserted in the top half of the core. During the 38.5 day cycle, the shim arms are slowly removed to their withdrawn (horizontal) positions. This movement of shim arms causes asymmetries between the burnup of the fuel in the upper and lower halves and across the line of symmetry for the fuel loading. With the MONTEBURNS analyses there was a limitation to the number of materials that could be analyzed so 15 materials in the top half of the core and 15 materials in the bottom half of the core were used, and a half-core (east-west) symmetry was assumed. Since MCNPX allows more materials, this east-west symmetry was not necessary and the core was represented with 60 different materials. The methodology for developing the inventories is presented along with comparisons of neutronic parameters calculated with the previous and present sets of inventories. (author)

  4. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  5. A Neutronics Methodology for the NIST Research Reactor Based on MCNPX

    International Nuclear Information System (INIS)

    A methodology for calculating inventories for the NBSR has been developed using the MCNPX computer code with the BURN option. A major advantage of the present methodology over the previous methodology, where MONTEBURNS and MCNP5 were used, is that more materials can be included in the model. The NBSR has 30 fuel elements each with a 17.8 cm (7 in) gap in the middle of the fuel. In the startup position, the shim control arms are partially inserted in the top half of the core. During the 38.5 day cycle, the shim arms are slowly removed to their withdrawn (horizontal) positions. This movement of shim arms causes asymmetries between the burnup of the fuel in the upper and lower halves and across the line of symmetry for the fuel loading. With the MONTEBURNS analyses there was a limitation to the number of materials that could be analyzed so 15 materials in the top half of the core and 15 materials in the bottom half of the core were used, and a half-core (east-west) symmetry was assumed. Since MCNPX allows more materials, this east-west symmetry was not necessary and the core was represented with 60 different materials. The methodology for developing the inventories is presented along with comparisons of neutronic parameters calculated with the previous and present sets of inventories.

  6. Power estimation and optimization methodologies for VLIW-based embedded systems

    CERN Document Server

    Zaccaria, Vittorio; Sciuto, Donatella; Silvano, Cristina

    2007-01-01

    Power Estimation Methods.- Background.- Instruction-Level Power Estimation for VLIW Processor Cores.- Software Power Estimation of the LX Core: A Case Study.- System-Level Power Estimation for the LX Architecture.- Microprocessor Abstraction Levels.- Power Optimization Methods.- Background.- A Micro-Architectural Optimization for Low Power.- A Design Space Exploration Methodology.- Conclusions and future work.

  7. Spatial and model-order based reactor signal analysis methodology for BWR core stability evaluation

    International Nuclear Information System (INIS)

    A new methodology for boiling water reactor core stability evaluation from measured noise signals has recently been developed and adopted at the Paul Scherrer Institut (PSI). This methodology consists of a general reactor noise analysis in which as much as possible of the information recorded during the tests is investigated prior to determining core representative stability parameters, i.e. the decay ratio (DR) and the resonance frequency, along with an associated estimate of the uncertainty range. A central part of this approach is that the evaluation of the core stability parameters is performed not only for a few but for ALL recorded neutron flux signals, thereby allowing the assessment of signal-related uncertainties. In addition, for each signal, three different model-order optimization methods are systematically employed to take into account the sensitivity to the model order. The methodology is then applied to the evaluation of the core stability measurements performed at the Leibstadt NPP, Switzerland, during cycles 10, 13 and 19. The results show that as the core becomes very stable, the method-related uncertainty becomes the major contributor to the overall uncertainty range, while for intermediate DR values the signal-related uncertainty becomes dominant. However, as core stability deteriorates, the method-related and signal-related spreads contribute similarly to the overall uncertainty, and both are found to be small. The PSI methodology identifies the origin of the different contributions to the uncertainty. Furthermore, in order to assess the results obtained with the current methodology, a comparative study is carried out for completeness with respect to results from previously developed and applied procedures. The results show good agreement between the current method and the other methods
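    One common way to obtain a decay ratio and resonance frequency from a neutron-flux noise signal, assumed here only as a generic illustration (the PSI procedure additionally optimizes the model order by three methods and quantifies signal- and method-related uncertainties), is to fit an autoregressive model and read both quantities from its dominant complex pole:

```python
# Illustrative decay-ratio estimation from a noise signal via a least-squares
# AR fit and its dominant complex pole. The AR order, sampling interval and the
# synthetic signal are placeholders; the record's methodology additionally
# optimizes the model order and quantifies signal/method uncertainties.
import numpy as np

def decay_ratio(signal, order=10, dt=0.08):
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    # least-squares AR fit: x[t] = sum_k a_k * x[t-k] + e[t]
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    # poles of the fitted model: roots of z^p - a1*z^(p-1) - ... - ap
    poles = np.roots(np.concatenate(([1.0], -a)))
    osc = poles[np.abs(poles.imag) > 1e-6]
    z = osc[np.argmax(np.abs(osc))]                     # dominant oscillatory pole
    dr = np.abs(z) ** (2 * np.pi / abs(np.angle(z)))    # amplitude ratio over one period
    freq = abs(np.angle(z)) / (2 * np.pi * dt)          # resonance frequency [Hz]
    return dr, freq

# Simulate a lightly damped second-order noise process as a stand-in for a
# neutron flux (APRM) signal: resonance ~0.5 Hz, decay ratio 0.6.
rng = np.random.default_rng(2)
dt = 0.08
theta = 2 * np.pi * 0.5 * dt
r = 0.6 ** (theta / (2 * np.pi))          # pole radius giving a decay ratio of 0.6
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 2 * r * np.cos(theta) * x[t-1] - r**2 * x[t-2] + rng.normal()
print(decay_ratio(x))                      # should be close to (0.6, 0.5)
```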

  8. Review of Seismic Evaluation Methodologies for Nuclear Power Plants Based on a Benchmark Exercise

    International Nuclear Information System (INIS)

    quantification of the effect of different analytical approaches on the response of the piping system under single and multi-support input motions), the spent fuel pool (to estimate the sloshing frequencies, maximum wave height and spilled water amount, and predict free surface evolution), and the pure water tank (to predict the observed buckling modes of the pure water tank). Analyses of the main results include comparison between different computational models, variability of results among participants, and comparison of analysis results with recorded ones. This publication addresses state of the practice for seismic evaluation and margin assessment methodologies for SSCs in NPPs based on the KARISMA benchmark exercise. As such, it supports and complements other IAEA publications with respect to seismic safety of new and existing nuclear installations. It was developed within the framework of International Seismic Safety Centre activities. It provides detailed guidance on seismic analysis, seismic design and seismic safety re-evaluation of nuclear installations and will be of value to researchers, operating organizations, regulatory authorities, vendors and technical support organizations

  9. Applying distance-to-target weighing methodology to evaluate the environmental performance of bio-based energy, fuels, and materials

    International Nuclear Information System (INIS)

    The enhanced use of biomass for the production of energy, fuels, and materials is one of the key strategies towards sustainable production and consumption. Various life cycle assessment (LCA) studies demonstrate the great potential of bio-based products to reduce both the consumption of non-renewable energy resources and greenhouse gas emissions. However, the production of biomass requires agricultural land and is often associated with adverse environmental effects such as eutrophication of surface and ground water. Decision making in favor of or against bio-based and conventional fossil product alternatives therefore often requires weighing of environmental impacts. In this article, we apply distance-to-target weighing methodology to aggregate LCA results obtained in four different environmental impact categories (i.e., non-renewable energy consumption, global warming potential, eutrophication potential, and acidification potential) to one environmental index. We include 45 bio- and fossil-based product pairs in our analysis, which we conduct for Germany. The resulting environmental indices for all product pairs analyzed range from -19.7 to +0.2 with negative values indicating overall environmental benefits of bio-based products. Except for three options of packaging materials made from wheat and cornstarch, all bio-based products (including energy, fuels, and materials) score better than their fossil counterparts. Comparing the median values for the three options of biomass utilization reveals that bio-energy (-1.2) and bio-materials (-1.0) offer significantly higher environmental benefits than bio-fuels (-0.3). The results of this study reflect, however, subjective value judgments due to the weighing methodology applied. Given the uncertainties and controversies associated not only with distance-to-target methodologies in particular but also with weighing approaches in general, the authors strongly recommend using weighing for decision finding only as a
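    The record does not state the aggregation formula, but distance-to-target weighing is commonly implemented by weighting each normalized impact difference by the ratio of the current level to its policy target; the sketch below uses that assumed form with invented numbers only.

```python
# Generic distance-to-target aggregation sketch. The weighting form
# w_i = current_level_i / target_level_i, the normalization by the current
# level, and every number below are assumptions made only for illustration.
categories = ["energy", "GWP", "eutrophication", "acidification"]
current = {"energy": 14000, "GWP": 1000, "eutrophication": 15, "acidification": 60}   # current national levels
target  = {"energy": 12000, "GWP": 750,  "eutrophication": 10, "acidification": 40}   # policy targets
bio     = {"energy": 20, "GWP": 1.1, "eutrophication": 0.004, "acidification": 0.006} # per functional unit
fossil  = {"energy": 45, "GWP": 3.0, "eutrophication": 0.002, "acidification": 0.005}

index = sum((current[c] / target[c]) * (bio[c] - fossil[c]) / current[c] for c in categories)
print("environmental index (negative favours the bio-based product): %.5f" % index)
```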

  10. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable number of studies argue that vulnerability assessment should focus on the identification of the variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  11. The IAEA collaborating centre for neutron activation based methodologies of research reactors

    International Nuclear Information System (INIS)

    The Reactor Institute Delft of the Delft University of Technology houses the Netherlands' only academic nuclear research reactor, with associated instrumentation and laboratories, for scientific education and research with ionizing radiation. The Institute's swimming pool type research reactor reached first criticality in 1963 and is currently operated at 2 MW thermal power on a 100 h/week basis. The reactor is equipped with neutron mirror guides serving ultra-modern neutron beam physics instruments and with a very bright positron facility. Fully automated gamma-ray spectrometry systems are used by the laboratory for neutron activation analysis, providing large-scale services under an ISO/IEC 17025:2005 compliant management system, being (since 1993) the first accredited laboratory of its kind in the world. For several years already, this laboratory has been sustainable by rendering these services to both the public and the private sector. The prime user of the Institute's facilities is the scientific Research Department of Radiation, Radionuclide and Reactors of the Faculty of Applied Sciences, housed inside the building. All reactor facilities are also made available for use by, or for services to, external clients (industry, government, private sector, other (inter)national research institutes and universities). The Reactor Institute Delft was inaugurated in May 2009 as a new IAEA Collaborating Centre for Neutron Activation Based Methodologies of Research Reactors. The collaboration involves education, research and development in (I) production of reactor-produced, no-carrier-added radioisotopes of high specific activity via neutron activation; (II) neutron activation analysis with emphasis on automation as well as analysis of large samples, and radiotracer techniques; and, as a cross-cutting activity, (III) quality assurance and management in research and application of research reactor based techniques and in research reactor operations. This collaboration will

  12. Microscopic to macroscopic depletion model development for FORMOSA-P

    International Nuclear Information System (INIS)

    Microscopic depletion has been gaining popularity with regard to employment in reactor core nodal calculations, mainly attributed to the superiority of microscopic depletion in treating spectral history effects during depletion. Another trend is the employment of loading pattern optimization computer codes in support of reload core design. Use of such optimization codes has significantly reduced design efforts to optimize reload core loading patterns associated with increasingly complicated lattice designs. A microscopic depletion model has been developed for the FORMOSA-P pressurized water reactor (PWR) loading pattern optimization code. This was done for both fidelity improvements and to make FORMOSA-P compatible with microscopic-based nuclear design methods. Needless to say, microscopic depletion requires more computational effort compared with macroscopic depletion. This implies that microscopic depletion may be computationally restrictive if employed during the loading pattern optimization calculation because many loading patterns are examined during the course of an optimization search. Therefore, the microscopic depletion model developed here uses combined models of microscopic and macroscopic depletion. This is done by first performing microscopic depletions for a subset of possible loading patterns from which 'collapsed' macroscopic cross sections are obtained. The collapsed macroscopic cross sections inherently incorporate spectral history effects. Subsequently, the optimization calculations are done using the collapsed macroscopic cross sections. Using this approach allows maintenance of microscopic depletion level accuracy without substantial additional computing resources
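    The "collapsed" macroscopic cross sections referred to above follow the standard relation Σ = Σ_i N_i σ_i over the depleted nuclide number densities; a small sketch with invented one-group values is:

```python
# Sketch of collapsing microscopic data to a macroscopic cross section,
# Sigma = sum_i N_i * sigma_i, using nuclide densities from a microscopic
# depletion step. The one-group numbers below are invented for illustration.

# number densities [atoms/(barn*cm)] and one-group absorption cross sections [barn]
number_density = {"U235": 4.5e-4, "U238": 2.1e-2, "Pu239": 6.0e-5, "Xe135": 5.0e-9}
sigma_abs      = {"U235": 45.0,   "U238": 0.9,    "Pu239": 120.0,  "Xe135": 1.2e6}

sigma_macro = sum(number_density[n] * sigma_abs[n] for n in number_density)  # [1/cm]
print("collapsed macroscopic absorption cross section: %.4f 1/cm" % sigma_macro)
```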

  13. Evaluating the impact of a locality based social policy intervention on mental health: conceptual and methodological issues.

    Science.gov (United States)

    Rogers, A; Huxley, P; Thomas, R; Robson, B; Evans, S; Stordy, J; Gately, C

    2001-01-01

    Urban regeneration initiatives provide an opportunity for examining the impact of changes in socio-economic circumstances on the mental health of different groups and individuals within localities. This paper sets out the conceptual and methodological bases for evaluating the impact of a population based social policy intervention on mental health. We suggest the need to integrate a range of disciplinary and methodological developments in research on health inequalities in exploring the impact of urban regeneration on mental health. A combination of multi-level modelling, subjective indicators and narrative accounts of individuals about mental health in the context of locality and personal changes are central for developing theories and methods appropriate for exploring the action and interaction of effects operating between structural and individual/agency levels. PMID:11694057

  14. A biocompatible microchip and methodology for efficiently trapping and positioning living cells into array based on negative dielectrophoresis

    Science.gov (United States)

    Guo, Xiaoliang; Zhu, Rong

    2015-06-01

    We present a microchip and trapping methodology based on negative dielectrophoresis (nDEP), whereby living cells were manipulated and positioned into an array with high trapping efficiency while maintaining good viability. The main factors that ensured good viability of cells were investigated including temperature of medium, extra transmembrane potential on cells, and electrolysis effect in DEP-based trapping. Optimum DEP conditions for the microchip were determined by considering both biocompatibility and trapping efficiency. Experiments demonstrated that under a voltage of 3.6-4 Vpp and at a frequency of 100 kHz, HeLa cells could be trapped and positioned into an array in less than 10 s while maintaining good viability. The normal adherence morphology and fluorescence of the cells, dyed with propidium iodide and Calcein-AM, were observed and verified the biocompatibility of the microchip and trapping methodology.

  15. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a detailed description of the procedure for collecting business requirements and following the business strategy.

  16. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    Science.gov (United States)

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15 %). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30 % can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants. PMID:27153828

  17. Methodology based on the value of lost load for evaluating economical losses due to disturbances in the power quality

    International Nuclear Information System (INIS)

    The paper focuses on the evaluation of the economic losses in the industrial sector due to a lack of power quality. The term power quality covers, in general, a set of boundary conditions which allow electrical systems connected to the grid to function as planned. Operation of the power system outside these boundaries therefore directly impacts the economic performance of the whole system. The evaluation of the economic losses derived from this fact has been the subject of several studies, mostly applied to individual cases, with calculation methodologies that are either too complicated to be applicable, require a large amount of input data (often subject to confidentiality issues), or are based on particular cases and impossible to extrapolate. The objective of this paper is to provide decision makers, and the authorities responsible for enforcing compensation for losses derived from power quality issues, with a relatively easy-to-apply methodology for calculating the real magnitude of these losses. An example of the application of this methodology and an analysis of the results is also provided. The validation of the methodology and its use for different comparison analyses were also carried out. Conclusions are given at the end of the work and further research is proposed.

  18. A contract-based methodology for aircraft electric power system design

    OpenAIRE

    Nuzzo, P; H. Xu; Ozay, N; Finn, JB; Sangiovanni-Vincentelli, AL; Murray, RM; Donzé, A; Seshia, SA

    2014-01-01

    In an aircraft electric power system, one or more supervisory control units actuate a set of electromechanical switches to dynamically distribute power from generators to loads, while satisfying safety, reliability, and real-time performance requirements. To reduce expensive redesign steps, this control problem is generally addressed by minor incremental changes on top of consolidated solutions. A more systematic approach is hindered by a lack of rigorous design methodologies that allow estim...

  19. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    OpenAIRE

    Gáll Zs. Sz.; Balint L.

    2015-01-01

    The scientific substrate of the study relies upon the concept of mirror neurons. Unlike other neurons, these are characterized by an imitation feature. They play an important role in learning processes – especially during childhood, enabling the imitation of motions and determining the primary acquirement thereof. Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool educ...

  20. Groundwater Interactive: Interdisciplinary Web-Based Software Incorporating New Learning Methodologies and Technologies

    OpenAIRE

    Mendez, Eduardo

    2002-01-01

    Groundwater related courses are offered through several colleges at Virginia Tech. These classes enroll a diverse group of students with varied academic backgrounds and educational levels. Though these classes emphasize different aspects of groundwater resources, they lack a unified approach in instructional materials and learning methodologies for knowledge they do share. The goals of this research are to lessen the impact of variable student backgrounds and to better integrate the course...

  1. Risk-based methodology for parameter calibration of a reservoir flood control model

    OpenAIRE

    Bianucci, P.; A. Sordo-Ward; Pérez, J. I.; J. García-Palacios; L. Mediero; L. Garrote

    2013-01-01

    Flash floods are of major relevance in natural disaster management in the Mediterranean region. In many cases, the damaging effects of flash floods can be mitigated by adequate management of flood control reservoirs. This requires the development of suitable models for optimal operation of reservoirs. A probabilistic methodology for calibrating the parameters of a reservoir flood control model (RFCM) that takes into account the stochastic variability of flood events is presented. This study a...

  2. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimization model for concurrent parameter and tolerance design is proposed in which the response mean equals its target in the target-is-best case. The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is illustrated in order to compare the differences between the developed model and parameter design with minimum variance. The results show ...

  3. The rationale behind the development of an airline operations control centre using Gaia-based methodology.

    OpenAIRE

    Castro, Antonio J.M.; Eugenio Oliveira

    2008-01-01

    In this paper, we report how we complemented Gaia methodology to analyse and design a multi-agent system for an airline company operations control centre. Besides showing the rationale behind the analysis, design and implementation of our system, we also present how we mapped the abstractions used in agent-oriented design to specific constructs in JADE. The advantages of using a goal-oriented early requirements analysis and its influence on subsequent phases of analysis and design are also pr...

  4. Optimization of Immobilization Conditions of Candida antarctica Lipase Based on Response Surface Methodology

    OpenAIRE

    Liu, J.-H.; Zhang, Y.-Y; Xia, Y.-M.; Su, F

    2010-01-01

    The conditions, including mass ratio of PEG4000 to lipase, pH, and mass ratio of diatomites to lipase, for immobilization of Candida antarctica lipase with PEG non-covalent modification were optimized by means of the response surface methodology (RSM). The immobilized lipase specific activity in the reaction of transesterification was selected as the response value. A mathematical model was developed to investigate the influences of various immobilization parameters and to predict the optimum...

  5. Optimization design of the stratospheric airship's power system based on the methodology of orthogonal experiment

    Institute of Scientific and Technical Information of China (English)

    Jian LIU; Quan-bao WANG; Hai-tao ZHAO; Ji-an CHEN; Ye QIU; Deng-ping DUAN

    2013-01-01

    The optimization design of the power system is essential for stratospheric airships with paradoxical requirements of high reliability and low weight. The methodology of orthogonal experiment is presented to deal with the problem of the optimization design of the airship's power system. Mathematical models of the solar array, regenerative fuel cell, and power management subsystem (PMS) are presented. The basic theory of the method of orthogonal experiment is discussed, and the selection of factors and levels of the experiment and the choice of the evaluation function are also revealed. The proposed methodology is validated in the optimization design of the power system of the ZhiYuan-2 stratospheric airship. Results show that the optimal configuration is easily obtained through this methodology. Furthermore, the optimal configuration and three sub-optimal configurations are in the Pareto frontier of the design space. Sensitivity analyses for the weight and reliability of the airship's power system are presented.
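    As a small illustration of the orthogonal-experiment machinery referred to above: the L9(3^4) array below is the standard nine-run design for four factors at three levels, while the factor names and the evaluation function are invented placeholders, not the ZhiYuan-2 design data.

```python
# Sketch of an L9(3^4) orthogonal experiment: nine runs cover four factors at
# three levels each, and main effects are read off as per-level means of the
# evaluation function. Factor names and the objective are invented examples.
import numpy as np

L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])
factors = ["solar-array area", "fuel-cell capacity", "PMS topology", "bus voltage"]

def objective(run):
    # Invented stand-in for the weight/reliability evaluation function.
    return float(np.array([3.0, 2.0, 1.5, 0.5]) @ run) + 0.1 * run[0] * run[1]

scores = np.array([objective(run) for run in L9])
for j, name in enumerate(factors):
    means = [scores[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
    print(name, "level means:", np.round(means, 2))
```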

  6. Open Space Evaluation Methodology and Three Dimensional Evaluation Model as a Base for Sustainable Development Tracking

    Directory of Open Access Journals (Sweden)

    Melita Rozman Cafuta

    2015-10-01

    Full Text Available Sustainable development, as a concept of controlled development, is a management characteristic. Adaptation to progress is important to achieve sustainability. The research focus here is on developing an evaluation methodology for determining the characteristics of urban open space. A method was designed for use in the comparative analysis of environmental perception evaluation between different time sequences. It allows us to compare results before and after spatial interventions or to track spatial development over time. The newly developed SEC model (suitable for everyone, environmentally accepted, and cost-effective) was an essential element used in the research methodology. The model was designed using the systematic principle, the top–down approach, and the decomposition method. Three basic dimensions were divided into six factors. Factors were divided into eighteen indicators that are either quantitatively or qualitatively assessed. Indicators were divided into several aspects. An instrument (questionnaire) was developed to support the evaluation methodology of urban open space characteristics. Every aspect corresponds to a question in the questionnaire. The applicability of the SEC model was demonstrated in two case studies. Evaluation took place during two different time sequences, once during the day and once at night. The obtained results provide useful information on the current spatial situation needed for sustainable development strategy preparation.

  7. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available Uniaxial compressive strength (UCS) describes a material's ability to withstand axially directed compressive forces and is considered one of rock materials' most important mechanical properties. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations have therefore been proposed for predicting UCS as a function of rocks' index properties. An analytic hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data-sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative for the quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.
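    A minimal version of the multiple-regression step described above is sketched below; the index properties, sample values and resulting coefficients are invented, and the AHP weighting of index properties is not reproduced.

```python
# Sketch of the multiple-regression step: predict UCS from rock index
# properties by ordinary least squares. All sample values are invented, and
# the AHP step that ranks the index properties is not reproduced here.
import numpy as np

# columns: porosity [%], P-wave velocity [km/s], point-load index [MPa]
X = np.array([[2.1, 5.9, 5.2],
              [4.5, 5.1, 4.1],
              [7.8, 4.3, 3.0],
              [1.2, 6.2, 6.0],
              [9.5, 3.9, 2.4],
              [5.3, 4.8, 3.6]])
ucs = np.array([118.0, 92.0, 61.0, 131.0, 48.0, 78.0])   # measured UCS [MPa]

A = np.column_stack([np.ones(len(ucs)), X])               # add intercept
coef, *_ = np.linalg.lstsq(A, ucs, rcond=None)
pred = A @ coef
print("coefficients:", np.round(coef, 2))
print("correlation between predicted and measured UCS:", round(np.corrcoef(pred, ucs)[0, 1], 2))
```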



  8. Mindfulness-based intervention for prodromal sleep disturbances in older adults: Design and methodology of a randomized controlled trial

    OpenAIRE

    Black, David S.; O’Reilly, Gillian A.; Olmstead, Richard; Breen, Elizabeth C.; Irwin, Michael R.

    2014-01-01

    Sleep problems are prevalent among older adults, often persist untreated, and are predictive of health detriments. Given the limitations of conventional treatments, non-pharmacological treatments such as mindfulness-based interventions (MBIs) are gaining popularity for sleep ailments. However, nothing is yet known about the impact of MBIs on sleep in older adults with prodromal sleep disturbances. This article details the design and methodology of a 6-week parallel-group RCT calibrated to tes...

  9. Subclassification of small bowel Crohn's disease using magnetic resonance enterography: a review using evidence-based medicine methodology.

    Science.gov (United States)

    Murphy, D J; Smyth, A E; McEvoy, S H; Gibson, D J; Doherty, G A; Malone, D E

    2015-12-01

    Magnetic resonance enterography (MRE) has a growing role in imaging small bowel Crohn's disease (SBCD), both in diagnosis and assessment of treatment response. Certain SBCD phenotypes respond well to biologic therapy and others require surgery; MRE has an expanding role in triaging these patients. In this review, we evaluate the MRE signs that subclassify SBCD using evidence-based medicine (EBM) methodology and provide a structured approach to MRE interpretation. PMID:26372328

  10. Structural Studies on Flexible Small Molecules Based on NMR in Oriented Media. Methodology and Application to Natural Products

    OpenAIRE

    Trigo Mouriño, Pablo

    2013-01-01

    This thesis describes the development and application of structural elucidation methodologies based on NMR in aligned media. Nuclear magnetic resonance is arguably the most important technique for the structural analysis of organic molecules in solution. In the last decade, Residual Dipolar Coupling (RDC) analysis emerged as a powerful tool for the determination of the three-dimensional structure of organic molecules in solution, complementing and even outperforming the approac...

  11. Hybrid methodology for social & digital space design: user experience & interaction models design based on human science & user-centered

    OpenAIRE

    Bollini, L

    2012-01-01

    The spread of the social dynamics of web 2.0 and of SPIME devices (Sterling 2006), which allow georeferenced consultation of internet data in mobile contexts, has introduced new challenges to classic (web) interface and interactive system design models. The approaches of the design disciplines – based on problem-solving, analytical & synthetic models (see Munari 1981; Potter 2002 and Bollini 2007) or already mediated by cognitive science methodologies such as user-personas...

  12. The New MCNP6 Depletion Capability

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael Lorne [Los Alamos National Laboratory; James, Michael R. [Los Alamos National Laboratory; Hendricks, John S. [Los Alamos National Laboratory; Goorley, John T. [Los Alamos National Laboratory

    2012-06-19

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.

  13. Post flood damage data collection and assessment in Albania based on DesInventar methodology

    Science.gov (United States)

    Toto, Emanuela; Massabo, Marco; Deda, Miranda; Rossello, Laura

    2015-04-01

    In 2013, a collection of disaster loss data based on DesInventar was implemented in Albania. The DesInventar system consists of a methodology and a software tool that lead to the systematic collection, documentation and analysis of loss data on disasters. The main sources of information about disasters used for the Albanian database were the Albanian Ministry of Internal Affairs, the National Library and the State Archive. Specifically for floods, the database created contains nearly 900 datasets covering a period of 148 years (from 1865 to 2013). The data are georeferenced on the administrative units of Albania: regions, provinces and municipalities. The datasets describe the events by reporting the date of occurrence, the duration, the localization in administrative units and the cause. Additional information covers the effects and damage that the event caused to people (deaths, injured, missing, affected, relocated, evacuated, victims) and to houses (houses damaged or destroyed). Other quantitative indicators are the losses in local currency or US dollars, the damage to roads, the crops affected, the lost cattle and the involvement of social elements over the territory such as education and health centers. Qualitative indicators simply register the sectors (e.g., transportation, communications, relief, agriculture, water supply, sewerage, power and energy, industry, education, health sector, other sectors) that were affected. Through queries and analysis of the data collected it was possible to identify the most affected areas, the economic loss, the damage to agriculture, the houses and people affected and many other variables. The most vulnerable regions for past floods in Albania were identified, as well as the rivers that cause the most damage in the country. Other analyses help to estimate the damage and losses during the main flood events of recent years, which occurred in 2010 and 2011, and to recognize the most affected sectors. The database was

  14. Depleted uranium: Metabolic disruptor?

    International Nuclear Information System (INIS)

    The presence of uranium in the environment can lead to long-term contamination of the food chain and of water intended for human consumption and thus raises many questions about the scientific and societal consequences of this exposure on population health. Although the biological effects of chronic low-level exposure are poorly understood, results of various recent studies show that contamination by depleted uranium (DU) induces subtle but significant biological effects at the molecular level in organs including the brain, liver, kidneys and testicles. For the first time, it has been demonstrated that DU induces effects on several metabolic pathways, including those metabolizing vitamin D, cholesterol, steroid hormones, acetylcholine and xenobiotics. This evidence strongly suggests that DU might well interfere with many metabolic pathways. It might thus contribute, together with other man-made substances in the environment, to increased health risks in some regions. (authors)

  15. Riddle of depleted uranium

    International Nuclear Information System (INIS)

    Depleted uranium (DU) is the waste product of uranium enrichment in the manufacturing of fuel rods for nuclear reactors in nuclear power plants and nuclear-powered ships. DU may also result from the reprocessing of spent nuclear reactor fuel. DU potentially has both chemical and radiological toxicity, the two important target organs being the kidneys and the lungs. DU is made into a metal and, owing to its availability, low price, high specific weight, density and melting point, as well as its pyrophoricity, it has a wide range of civilian and military applications. Because of the use of DU in recent years, reports have appeared in the press on health hazards alleged to be due to DU. In this paper the properties, applications, and potential environmental and health effects of DU are briefly reviewed

  16. Novel methodology for 3D reconstruction of carotid arteries and plaque characterization based upon magnetic resonance imaging carotid angiography data.

    Science.gov (United States)

    Sakellarios, Antonis I; Stefanou, Kostas; Siogkas, Panagiotis; Tsakanikas, Vasilis D; Bourantas, Christos V; Athanasiou, Lambros; Exarchos, Themis P; Fotiou, Evangelos; Naka, Katerina K; Papafaklis, Michail I; Patterson, Andrew J; Young, Victoria E L; Gillard, Jonathan H; Michalis, Lampros K; Fotiadis, Dimitrios I

    2012-10-01

    In this study, we present a novel methodology that allows reliable segmentation of the magnetic resonance images (MRIs) for accurate fully automated three-dimensional (3D) reconstruction of the carotid arteries and semiautomated characterization of plaque type. Our approach uses active contours to detect the luminal borders in the time-of-flight images and the outer vessel wall borders in the T(1)-weighted images. The methodology incorporates the connecting components theory for the automated identification of the bifurcation region and a knowledge-based algorithm for the accurate characterization of the plaque components. The proposed segmentation method was validated in randomly selected MRI frames analyzed offline by two expert observers. The interobserver variability of the method for the lumen and outer vessel wall was -1.60%±6.70% and 0.56%±6.28%, respectively, while the Williams Index for all metrics was close to unity. The methodology implemented to identify the composition of the plaque was also validated in 591 images acquired from 24 patients. The obtained Cohen's k was 0.68 (0.60-0.76) for lipid plaques, while the time needed to process an MRI sequence for 3D reconstruction was only 30 s. The obtained results indicate that the proposed methodology allows reliable and automated detection of the luminal and vessel wall borders and fast and accurate characterization of plaque type in carotid MRI sequences. These features render the currently presented methodology a useful tool in the clinical and research arena. PMID:22617149

  17. A Methodological Review for the Analysis of Divide and Conquer Based Sorting/ Searching Algorithms

    Directory of Open Access Journals (Sweden)

    Deepak Abhyankar

    2011-09-01

    Full Text Available This paper develops a practical methodology for the analysis of sorting/searching algorithms. To achieve this objective, an analytical study of Quicksort and the searching problem was undertaken. This work explains that asymptotic analysis can be misleading if applied carelessly. The study provides a fresh insight into the working of Quicksort and binary search, and it also presents an exact analysis of Quicksort. Our study finds that asymptotic analysis is a kind of approximation and may hide many useful facts. It was shown that infinitely many inefficient algorithms can easily be classified together with a few efficient algorithms using the asymptotic approach.
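    To make the contrast between exact and asymptotic analysis concrete, the sketch below counts the comparisons actually performed by a simple Quicksort and by binary search on one random input, rather than quoting only O(n log n) and O(log n). It is an illustrative toy, not the exact analysis developed in the paper.

        # Count comparisons of Quicksort and binary search (Python).
        import random

        def quicksort(a, counter):
            if len(a) <= 1:
                return a
            pivot, rest = a[0], a[1:]
            counter[0] += len(rest)            # one comparison per partitioned element
            left = [x for x in rest if x < pivot]
            right = [x for x in rest if x >= pivot]
            return quicksort(left, counter) + [pivot] + quicksort(right, counter)

        def binary_search(a, key, counter):
            lo, hi = 0, len(a) - 1
            while lo <= hi:
                mid = (lo + hi) // 2
                counter[0] += 1
                if a[mid] == key:
                    return mid
                if a[mid] < key:
                    lo = mid + 1
                else:
                    hi = mid - 1
            return -1

        data = random.sample(range(10000), 1000)
        qc, bc = [0], [0]
        s = quicksort(data, qc)
        binary_search(s, s[123], bc)
        print("quicksort comparisons:", qc[0])      # grows like n log n on average
        print("binary search comparisons:", bc[0])  # at most about log2(n) + 1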

  18. A Methodology for Platform Based High-Level System-on-Chip Verification

    Institute of Scientific and Technical Information of China (English)

    GAO Feng; LIU Peng; YAO Qingdong

    2003-01-01

    The time-to-market challenge has increased the need for shortening the co-verification time in system-on-chip development. In this article, a new methodology for high-level hardware/software co-verification is introduced. With the help of the real-time operating system, the application program can easily be migrated from the software simulator to the hardware emulation board. The hierarchical architecture can be used to separate the application program from the implementation of the platform during the verification process. The high-level verification platform was successfully used in developing an HDTV decoding chip.

  19. A problem-based approach to teaching research methodology to medical graduates in Iran

    Directory of Open Access Journals (Sweden)

    Mehrdad Jalalian Hosseini

    2009-08-01

    Full Text Available Physicians are reticent to participate in research projects for a variety of reasons. Facilitating the active involvement of doctors in research projects is a high priority for the Iranian Blood Transfusion Organization (IBTO). A one-month training course on research methodology was conducted for a group of physicians in Mashhad, in northeast Iran. The participants were divided into ten groups. They prepared a research proposal under the guidance of a workshop leader. The quality of the research proposals, which were prepared by all participants, went beyond our expectations. All of the research proposals were relevant to blood safety. In this brief report we describe our approach.

  20. Emotion Extractor-AI based methodology to implement prosody features in Speech Synthesis

    Directory of Open Access Journals (Sweden)

    M B Chandak

    2011-07-01

    Full Text Available This paper presents a methodology to extract emotion from text in real time and add the corresponding expression to the document contents during speech synthesis. To understand the existence of emotions, a self-assessment test was carried out on a set of documents, and preliminary rules were formulated for three basic emotions: Pleasure, Arousal and Dominance. These rules are used in an automated procedure that assigns emotional state values to document contents. These values are then used by the speech synthesizer to add emotion to the speech. The system is language independent and content free.
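    As a purely hypothetical illustration of how rule values might be attached to text, the sketch below assigns Pleasure/Arousal/Dominance triples to keywords and averages them over a document; the keyword list and numbers are invented here and are not the rules formulated in the paper.

        # Hypothetical keyword-based PAD scoring (Python).
        PAD_RULES = {
            "happy":  (0.8, 0.5, 0.4),
            "angry":  (-0.6, 0.8, 0.3),
            "calm":   (0.4, -0.6, 0.2),
            "afraid": (-0.5, 0.7, -0.6),
        }

        def score_text(text):
            hits = [PAD_RULES[w] for w in text.lower().split() if w in PAD_RULES]
            if not hits:
                return (0.0, 0.0, 0.0)   # neutral state passed to the synthesizer
            n = len(hits)
            return tuple(round(sum(v[i] for v in hits) / n, 2) for i in range(3))

        print(score_text("She was happy but a little afraid of the storm"))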

  1. Methodological issues in analyzing small populations using CCHS cycles based on the official language minority studies.

    Science.gov (United States)

    Makvandi, Ewa; Bouchard, Louise; Bergeron, Pierre-Jerôme; Sedigh, Golnaz

    2013-01-01

    Statistical analyses for small populations or small domains of interest can be challenging. To obtain reliable estimates, only very large surveys such as the Canadian Community Health Survey can be considered. However, despite its good geographical and temporal coverage, the analysis of small populations in smaller regions (e.g., health regions) and in regards to specific health issues remains challenging. We will look at the methodological issues in analysis of small populations in relation to sampling and non-sampling survey errors that affect the precision and accuracy of the estimates. Francophone minorities in Canada will be used to illustrate the issues throughout the paper. PMID:24300323

  2. The Toxicity of Depleted Uranium

    OpenAIRE

    Wayne Briner

    2010-01-01

    Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a c...

  3. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Science.gov (United States)

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology - the unified modeling language (UML) - has been used for this development. This research has demonstrated the feasibility of the development of agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place. PMID:10957742

  4. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    Science.gov (United States)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  5. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  6. Depleted zinc: Properties, application, production

    International Nuclear Information System (INIS)

    The addition of ZnO, depleted in the Zn-64 isotope, to the water of boiling water nuclear reactors lessens the accumulation of Co-60 on the reactor interior surfaces, reduces radioactive wastes and increases the reactor service life because of the inhibitory action of zinc on inter-granular stress corrosion cracking. To the same effect, depleted zinc in the form of acetate dihydrate is used in pressurized water reactors. The gas centrifuge isotope separation method is applied for the production of depleted zinc on an industrial scale. More than 20 years of depleted zinc application history demonstrates its benefits for reducing NPP personnel radiation exposure and combating corrosion of construction materials.

  7. Preparation of guides, on mechanics of fluids, for the physics teaching based on the investigatory methodology

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Loreto Mora; Buzzo, Ricardo; Martinez-Mardones, Javier; Romero, Angel [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso Av. Brasil 2950, Valparaiso (Chile)], E-mail: jmartine@ucv.cl

    2008-11-01

    The challenges in the present educational reform emphasize the professional character of the teacher in the planning of classes, the execution of activities and the evaluation of learning. A set of planned activities is no more a science class than a pile of bricks is a house; that is, if the students' preconceptions, the daily realities they face and the expectations they bring to science classes are not taken into account, the proposed objectives cannot be achieved. The well-known investigatory method applied to science education approaches conceptual content through easily reproduced practical activities that are effectively guided towards the topics to be taught. The OPPS guides (Operation Primary Physics Science) of Louisiana University are excellent examples of the application of this methodology, both in the material intended for the students and in the material guiding the learning activities (called the class Leader). This international experience, within the framework of the Plans and Programs of the Ministry of Education of Chile (MINEDUC), is the main axis of this work, which considers the preparation of guides with this methodology, addressing the content of one unit of the common physics curriculum for the third year of high school.

  8. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    In this work, a particle swarm optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology, which allows the use of flexible intervals between maintenance interventions instead of fixed periods (as usual), allows a better adaptation of the schedule to the failure rates of aging components. Because of this flexibility, however, the planning of preventive maintenance becomes a more difficult task. Motivated by the fact that PSO has proved to be very competitive compared with other optimization tools, this work investigates the use of PSO as an alternative optimization tool. Considering that PSO works in a real and continuous space, it is a challenge to use it for discrete optimization, in which a schedule may comprise a variable number of maintenance interventions. The PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as: the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, the costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves, the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding optimum preventive maintenance policies for the HPIS. (author)
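    The sketch below shows the bare mechanics of a continuous particle swarm search over candidate maintenance intervals, with a placeholder cost function standing in for the reliability/cost model of the paper (which is not reproduced here); the bounds, coefficients and cost expression are assumptions for illustration only.

        # Minimal PSO over maintenance intervals (Python/NumPy).
        import numpy as np

        rng = np.random.default_rng(1)

        def cost(x):
            # Hypothetical stand-in: penalize both very short and very long
            # intervals (in days) between preventive interventions.
            return float(np.sum((x - 180.0) ** 2 / 1e4 + 50.0 / x))

        dim, n_particles, iters = 4, 30, 200
        pos = rng.uniform(30.0, 720.0, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
        gbest = pbest[np.argmin(pbest_val)].copy()

        w, c1, c2 = 0.7, 1.5, 1.5
        for _ in range(iters):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 30.0, 720.0)
            vals = np.array([cost(p) for p in pos])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[np.argmin(pbest_val)].copy()

        print("best intervals (days):", np.round(gbest, 1), "cost: %.3f" % cost(gbest))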

  9. Preparation of guides, on mechanics of fluids, for the physics teaching based on the investigatory methodology

    International Nuclear Information System (INIS)

    The challenges in the present educational reform emphasize the professional character of the teacher in the planning of classes, the execution of activities and the evaluation of learning. A set of planned activities is no more a science class than a pile of bricks is a house; that is, if the students' preconceptions, the daily realities they face and the expectations they bring to science classes are not taken into account, the proposed objectives cannot be achieved. The well-known investigatory method applied to science education approaches conceptual content through easily reproduced practical activities that are effectively guided towards the topics to be taught. The OPPS guides (Operation Primary Physics Science) of Louisiana University are excellent examples of the application of this methodology, both in the material intended for the students and in the material guiding the learning activities (called the class Leader). This international experience, within the framework of the Plans and Programs of the Ministry of Education of Chile (MINEDUC), is the main axis of this work, which considers the preparation of guides with this methodology, addressing the content of one unit of the common physics curriculum for the third year of high school.

  10. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    Science.gov (United States)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.

  11. A Methodology for Mapping Meanings in Text-Based Sustainability Communication

    Directory of Open Access Journals (Sweden)

    Mark Brown

    2013-06-01

    Full Text Available In moving society towards more sustainable forms of consumption and production, social learning must play an important role. Making the assumption that it occurs as a consequence of changes in understanding, this article presents a methodology for mapping meanings in sustainability communication texts. The methodology uses techniques from corpus linguistics and framing theory. Two large databases of text were constructed by copying material from the websites of two different groups of social actors: (i) environmental NGOs and (ii) British green business, and saving it as .txt files. The findings on individual words show that the NGOs and business use them very differently. Focusing on words expressing concern for the natural environment, it is proposed that the two actors also conceptualize their concern differently. Green business's cognitive system of concern has two well-developed frames: good intentions and risk management. However, three frames (concern for the natural environment, perception of the damage, and responsibility) are light on detail. In contrast, within the NGOs' system of concern, the frames of concern for the natural environment, perception of the damage and responsibility contain words making detailed representations.

  12. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    International Nuclear Information System (INIS)

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different countries of the E.C.

  13. Health monitoring methodology based on exergetic analysis for building mechanical systems

    International Nuclear Information System (INIS)

    Exergetic analysis is not often performed in the context of retrocommissioning (RCX); this research provides insight into the benefits of incorporating this approach. Data collected from a previously developed RCX test for an air handling unit (AHU) on a college campus are used in an advanced thermodynamic analysis. The operating data is analyzed using the first and second laws and retrofit design solutions are recommended for improved system performance; the second law analysis is particularly helpful because it requires few additional calculations or data collections. The thermodynamic methodology is extended to a building's cooling plant, which uses a vapor compression refrigeration cycle (VCRC) chiller. Existing chiller data collected for the design of automated fault detection and diagnosis methodology is used. As with the AHU analysis, the second law analysis locates irreversibilities that would not be determined from a first law analysis alone. Plant data representing both normal and faulty operation is used to develop a chiller model for assessing performance and health monitoring. Data is analyzed to determine the viability of health monitoring by performing an exergy analysis on existing data. Conclusions are drawn about the usefulness of exergetic analysis for improving system operations of energy intensive building mechanical systems.

  14. Information Systems Planning based on the BSP methodology: the DETRAN/AL case

    Directory of Open Access Journals (Sweden)

    Adiel Teixeira de Almeida

    2008-12-01

    Full Text Available The ability to generate, process and transmit information is the first step of a production process that finishes with its application in the process of adding value to products and services. However, in order to provide access to the necessary information, investments in technology are not enough for organizations: they must also invest in the information infrastructure. In this context, Information Systems planning should ensure that investments in Information Systems are aligned with the organizational strategy. This article presents the results of the application of the Business System Planning (BSP) methodology to the Information Systems planning of the administrative and financial area of the Transit Department of the Alagoas State. The choice of the BSP methodology was motivated by its emphasis on business processes, and it provided the identification of the strategic vision of the organization, the business processes of the administrative and financial area and the groups of information necessary for the processes. Additionally, through the use of prioritization software it was possible to establish groups of information services modules to be developed in a first horizon.

  15. Artificial Neural Network based γ-hadron segregation methodology for TACTIC telescope

    Energy Technology Data Exchange (ETDEWEB)

    Dhar, V.K., E-mail: veer@barc.gov.in [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Tickoo, A.K.; Koul, M.K.; Koul, R. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Dubey, B.P. [Electronics and Instrumentation Services Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Rannot, R.C.; Yadav, K.K.; Chandra, P.; Kothari, M.; Chanchalani, K.; Venugopal, K. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India)

    2013-04-21

    The sensitivity of a Cherenkov imaging telescope is strongly dependent on the rejection of the cosmic-ray background events. The methods which have been used to achieve the segregation between γ-rays from the source and the background cosmic rays include Supercuts/Dynamic Supercuts, maximum likelihood classifiers, kernel methods, fractals, wavelets and random forests. While the segregation potential of the neural network classifier has been investigated in the past with modest results, the main purpose of this paper is to study the γ/hadron segregation potential of various ANN algorithms, some of which are supposed to be more powerful in terms of better convergence and lower error compared to the commonly used backpropagation algorithm. The results obtained suggest that the Levenberg–Marquardt method outperforms all other methods in the ANN domain. Applying this ANN algorithm to ∼101.44h of Crab Nebula data collected by the TACTIC telescope, during November 10, 2005–January 30, 2006, yields an excess of ∼(1141±106) with a statistical significance of ∼11.07σ, as against an excess of ∼(928±100) with a statistical significance of ∼9.40σ obtained with the Dynamic Supercuts selection methodology. The main advantage accruing from the ANN methodology is that it is more effective at higher energies, and this has allowed us to re-determine the Crab Nebula energy spectrum in the energy range ∼1–24TeV.
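    For readers unfamiliar with the classifier side of such studies, the sketch below trains a small feed-forward network by plain gradient descent on synthetic two-parameter "image" data to separate two classes; it is a toy on invented data, and it does not implement the Levenberg–Marquardt training or the image-parameter pipeline used in the paper.

        # Toy two-class separation with a small neural network (Python/NumPy).
        import numpy as np

        rng = np.random.default_rng(2)
        n = 2000
        gammas = rng.normal([0.25, 0.12], 0.05, (n, 2))    # hypothetical parameters
        hadrons = rng.normal([0.40, 0.22], 0.09, (n, 2))
        X = np.vstack([gammas, hadrons])
        y = np.hstack([np.ones(n), np.zeros(n)])

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # One tanh hidden layer (8 units), sigmoid output, cross-entropy loss.
        W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
        W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
        lr = 0.5
        for _ in range(2000):
            H = np.tanh(X @ W1 + b1)
            p = sigmoid(H @ W2 + b2).ravel()
            g_out = (p - y)[:, None] / len(y)          # gradient at the output pre-activation
            g_hid = (g_out @ W2.T) * (1.0 - H ** 2)    # backpropagated to the hidden layer
            W2 -= lr * H.T @ g_out;  b2 -= lr * g_out.sum(0)
            W1 -= lr * X.T @ g_hid;  b1 -= lr * g_hid.sum(0)

        p = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
        print("training accuracy on synthetic data: %.3f" % ((p > 0.5) == y).mean())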

  16. Spintronic logic design methodology based on spin Hall effect-driven magnetic tunnel junctions

    Science.gov (United States)

    Kang, Wang; Wang, Zhaohao; Zhang, Youguang; Klein, Jacques-Olivier; Lv, Weifeng; Zhao, Weisheng

    2016-02-01

    Conventional complementary metal-oxide-semiconductor (CMOS) technology is now approaching its physical scaling limits to enable Moore’s law to continue. Spintronic devices, as one of the potential alternatives, show great promise to replace CMOS technology for next-generation low-power integrated circuits in nanoscale technology nodes. Until now, spintronic memory has been successfully commercialized. However spintronic logic still faces many critical challenges (e.g. direct cascading capability and small operation gain) before it can be practically applied. In this paper, we propose a standard complimentary spintronic logic (CSL) design methodology to form a CMOS-like logic design paradigm. Using the spin Hall effect (SHE)-driven magnetic tunnel junction (MTJ) device as an example, we demonstrate CSL implementation, functionality and performance. This logic family provides a unified design methodology for spintronic logic circuits and partly solves the challenges of direct cascading capability and small operation gain in the previously proposed spintronic logic designs. By solving a modified Landau-Lifshitz-Gilbert equation, the magnetization dynamics in the free layer of the MTJ is theoretically described and a compact electrical model is developed. With this electrical model, numerical simulations have been performed to evaluate the functionality and performance of the proposed CSL design. Simulation results demonstrate that the proposed CSL design paradigm is rather promising for low-power logic computing.
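    The abstract's reference to a modified Landau-Lifshitz-Gilbert (LLG) equation can be made concrete with the commonly quoted form including a spin-Hall (Slonczewski-like) torque term; prefactors and sign conventions vary between papers, so the expression below is a generic sketch rather than the specific equation solved in this work:

        \frac{d\mathbf{m}}{dt} = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
          + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
          + \frac{\gamma\hbar\,\theta_{SH}\,J_c}{2 e \mu_0 M_s t_F}\,
            \mathbf{m}\times(\boldsymbol{\sigma}\times\mathbf{m})

    Here m is the unit magnetization of the free layer, H_eff the effective field, alpha the Gilbert damping, theta_SH the spin Hall angle, J_c the charge current density, M_s the saturation magnetization, t_F the free-layer thickness, and sigma the spin polarization direction set by the heavy-metal strip.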

  17. A Novel Depletion-Mode MOS Gated Emitter Shorted Thyristor

    Institute of Scientific and Technical Information of China (English)

    张鹤鸣; 戴显英; 张义门; 马晓华; 林大松

    2000-01-01

    A novel MOS-gated thyristor, the depletion-mode MOS gated emitter shorted thyristor (DMST), and its two structures are proposed. In the DMST, the channel of the depletion-mode MOS inherently shorts the thyristor emitter-base junction. The operation of the device is controlled by the interruption and recovery of the depletion-mode MOS P-channel. The excellent properties have been demonstrated by 2-D numerical simulations and by tests on the fabricated chips.

  18. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    OpenAIRE

    Górny Z.; Kluska-Nawarecka S.; Wilk-Kołodziejczyk D.; Regulski K.

    2015-01-01

    Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of the rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the methodology and the results of exploratory r...

  19. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    International Nuclear Information System (INIS)

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30-bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is used to determine the retail electricity price for the end users. The factor Risk Adjusted Return on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting a proposal for the electricity tariff to the regulatory authority. (author)
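    The two quantities named in this record follow textbook definitions, sketched below with hypothetical numbers: the CAPM expected return r = r_f + beta (r_m - r_f) used when setting the retail tariff, and a RAROC ratio comparing risk-adjusted net return to the economic capital held against adverse price movements. Definitions of RAROC vary slightly between references, and the IEEE 30-bus application of the paper is not reproduced here.

        # Sketch of CAPM return and a simple RAROC ratio (Python).
        def capm_return(risk_free, beta, market_return):
            return risk_free + beta * (market_return - risk_free)

        def raroc(expected_revenue, expected_cost, risk_free, economic_capital):
            # One common convention: add the risk-free return earned on the
            # capital itself to the expected net margin.
            net = expected_revenue - expected_cost + risk_free * economic_capital
            return net / economic_capital

        r_req = capm_return(risk_free=0.06, beta=1.2, market_return=0.12)
        ratio = raroc(expected_revenue=10.5e6, expected_cost=9.8e6,
                      risk_free=0.06, economic_capital=2.0e6)
        print("CAPM required return: %.1f%%" % (100 * r_req))
        print("RAROC: %.1f%% (compare against the required return)" % (100 * ratio))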

  20. Towards a Tool-based Development Methodology for Pervasive Computing Applications

    CERN Document Server

    Cassou, Damien; Consel, Charles; Balland, Emilie; 10.1109/TSE.2011.107

    2012-01-01

    Despite much progress, developing a pervasive computing application remains a challenge because of a lack of conceptual frameworks and supporting tools. This challenge involves coping with heterogeneous devices, overcoming the intricacies of distributed systems technologies, working out an architecture for the application, encoding it in a program, writing specific code to test the application, and finally deploying it. This paper presents a design language and a tool suite covering the development life-cycle of a pervasive computing application. The design language allows the definition of a taxonomy of area-specific building blocks, abstracting over their heterogeneity. This language also includes a layer to define the architecture of an application, following an architectural pattern commonly used in the pervasive computing domain. Our underlying methodology assigns roles to the stakeholders, providing separation of concerns. Our tool suite includes a compiler that takes design artifacts written in our language as...

  1. An Effective Approach Based on Response Surface Methodology for Predicting Friction Welding Parameters

    Science.gov (United States)

    Celik, Sare; Deniz Karaoglan, Aslan; Ersozlu, Ismail

    2016-03-01

    The joining of dissimilar metals is one of the essential needs of industry. Manufacturing with joints of alloy steel and plain carbon steel is used in production because it decreases raw material cost. The friction welding process parameters, such as friction pressure, friction time, upset pressure, upset time and rotating speed, play the major roles in determining the strength and microstructure of the joints. In this study, response surface methodology (RSM), a well-known design-of-experiments approach, is used to model the mathematical relation between the responses (tensile strength and maximum temperature) and the friction welding parameters with a minimum number of experiments. The results show that RSM is an effective method for developing models and making predictions for this type of problem.
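    The core RSM step, fitting a second-order response surface to a designed set of experiments, can be sketched as below; the coded design points and response values are synthetic stand-ins rather than the paper's welding data, and the factor names are illustrative assumptions.

        # Second-order response-surface fit for two coded factors (Python/NumPy).
        import numpy as np

        # Coded levels, e.g. friction pressure (x1) and friction time (x2),
        # roughly following a central composite layout; synthetic response.
        x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0])
        x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41])
        y = (520 + 25 * x1 + 15 * x2 - 18 * x1**2 - 10 * x2**2 + 6 * x1 * x2
             + np.random.default_rng(3).normal(0, 3, x1.size))   # tensile strength, MPa

        # Model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("fitted coefficients:", np.round(b, 2))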

  2. A methodology for defining shock tests based on shock response spectra and temporal moments

    Energy Technology Data Exchange (ETDEWEB)

    Cap, J.S.; Smallwood, D.O. [Sandia National Labs., Albuquerque, NM (United States). Mechanical and Thermal Environments Dept.

    1997-11-01

    Defining acceptable tolerances for a shock test has always been a problem due in large part to the use of Shock Response Spectra (SRS) as the sole description of the shock. While SRS do contain a wealth of information if one knows what to look for, it is commonly accepted that different agencies can generate vastly different time domain test inputs whose SRS all satisfy the test requirements within a stated tolerance band. At an even more basic level, the laboratory test specifications often fail to resemble the field environment even though the SRS appear to be similar. A concise means of bounding the time domain inputs would be of great benefit in reducing the variation in the resulting shock tests. This paper describes a methodology that uses temporal moments to improve the repeatability of shock test specifications.
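    As a reference for what the temporal moments of a shock time history are, the sketch below computes the i-th moment m_i = integral of t^i x(t)^2 dt for a synthetic pulse and derives the energy, centroid time and RMS duration from it; this is the generic definition rather than the specific tolerance-setting procedure of the paper.

        # Temporal moments of a synthetic shock pulse (Python/NumPy).
        import numpy as np

        dt = 1e-4
        t = np.arange(0.0, 0.1, dt)
        x = np.exp(-60.0 * t) * np.sin(2 * np.pi * 400.0 * t)   # decaying oscillation

        def moment(i, time, signal, step):
            return np.sum(time**i * signal**2) * step

        m0 = moment(0, t, x, dt)                            # "energy" of the pulse
        tau = moment(1, t, x, dt) / m0                      # centroid time
        rms_dur = np.sqrt(moment(2, t - tau, x, dt) / m0)   # RMS duration about the centroid
        print("energy %.3e, centroid %.4f s, RMS duration %.4f s" % (m0, tau, rms_dur))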

  3. Distribution characteristics of the soils in Henan province of central China based on pedodiversity methodology

    International Nuclear Information System (INIS)

    A newly developed pedodiversity methodology was used to analyze the distribution characteristics of the soils in Henan province of central China. The rare soil types and representative soil types were defined after three soil parameters (soil patch numbers, total area and spatial distribution diversity on a 2 km × 2 km grid scale) were calculated. Results show that there are positive correlations between soil patch numbers, total area and spatial distribution diversity; the regression equation between spatial distribution diversity and total area is y = 0.086 ln(x) - 0.021 (R2 = 0.992), and that between spatial distribution diversity and patch numbers is y = 0.106 ln(x) + 0.161 (R2 = 0.921). The value distribution pattern of soil spatial distribution diversity fits the normal distribution. More attention needs to be paid to protecting the endangered rare soil types. The number of soil types per km2 differs under different local conditions

  4. Methodology, results and experience of independent brachytherapy plan verifications based on DICOM standard

    International Nuclear Information System (INIS)

    The use of a high dose rate source together with afterloading treatment delivery in brachytherapy allows dose modulation while minimizing dose to staff. An independent verification of the data exported to the treatment station is required by local regulations (and is also a widely accepted recommendation in the international literature [2]). We have developed a methodology, using home-brewed code, to import DICOM treatment data into an Excel spreadsheet that is able to calculate dose at given reference points using the TG-43 formalism of the AAPM [3-5]. It employs analytic fits of the anisotropy factor and the radial dose function for different sources [6-8]. The end-point implementation we present here allows an independent verification and a treatment printout to be merged in one step. The use of the DICOM standard makes our code versatile and provides greater compatibility with current treatment planning systems. (Author)
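    For reference, the AAPM TG-43 dose-rate equation that such spreadsheets typically implement has the well-known general (line-source) form below; the analytic fits of the radial dose function and anisotropy factor mentioned in the abstract are source-specific and are not reproduced here:

        \dot{D}(r,\theta) = S_K\,\Lambda\,
          \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)

    where S_K is the air-kerma strength, Lambda the dose-rate constant, G_L the line-source geometry function, g_L the radial dose function, F the 2D anisotropy function, and (r_0, theta_0) = (1 cm, 90 degrees) the reference point.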

  5. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud

    2010-10-04

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 pC/N, respectively. The d33 value is in agreement with that obtained from the Berlincourt method, which gave a d33 value of 500 pC/N. In addition, the d31 value is in agreement with the value obtained from the optical method, which gave a d31 value of 223 pC/V. These results suggest that the proposed method is a viable way to quickly estimate piezoelectric coefficients of bulk unclamped samples. © 2010 The Electrochemical Society.

  6. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using response surface methodology (RSM), an optimization model for concurrent parameter and tolerance design is proposed for the target-is-best case, in which the response mean equals its target. The objective function of the model is the sum of the quality loss and the tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is presented to compare the differences between the developed model and parameter design with minimum variance. The results show that the proposed method not only achieves robustness but also greatly reduces cost. The objectives of high quality and low cost of product and process can be achieved simultaneously by the application of six sigma concurrent parameter and tolerance design.
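    The trade-off this record describes can be sketched numerically: widening a tolerance lowers manufacturing cost but inflates the response variance and hence the quadratic quality loss. In the toy below the capability link is modelled by assuming the component standard deviation equals tol/6, and all coefficients are hypothetical; it illustrates the shape of the objective, not the paper's model.

        # Quality-loss vs tolerance-cost trade-off (Python/SciPy).
        from scipy.optimize import minimize_scalar

        k_loss = 40.0        # quality-loss coefficient
        sensitivity = 2.5    # |d(response)/d(component)| at the chosen parameter point

        def total_cost(tol):
            sigma_y = sensitivity * tol / 6.0      # response std. dev. implied by tolerance
            quality_loss = k_loss * sigma_y ** 2   # mean assumed on target (target-is-best)
            tolerance_cost = 12.0 / tol            # tighter tolerance costs more
            return quality_loss + tolerance_cost

        res = minimize_scalar(total_cost, bounds=(0.05, 3.0), method="bounded")
        print("optimal tolerance %.3f, total cost %.3f" % (res.x, res.fun))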

  7. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Karandikar, R.G.; Khaparde, S.A.; Kulkarni, S.V. [Electrical Engineering Department, Indian Institute of Technology Bombay, Mumbai 400 076 (India)

    2007-12-15

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30-bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is used to determine the retail electricity price for the end users. The factor Risk Adjusted Return on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer when submitting a proposal for the electricity tariff to the regulatory authority. (author)

  8. A chemical status predictor. A methodology based on World-Wide sediment samples.

    Science.gov (United States)

    Gredilla, A; Fdez-Ortiz de Vallejuelo, S; de Diego, A; Arana, G; Stoichev, T; Amigo, J M; Wasserman, J C; Botello, A V; Sarkar, S K; Schäfer, J; Moreno, C; de la Guardia, M; Madariaga, J M

    2015-09-15

    As a consequence of the limited resources of underdeveloped countries and the limited interest of developed ones, the assessment of the chemical quality of entire water bodies around the world is a utopia for the near future. The methodology described here may serve as a first approach for the fast identification of water bodies that do not meet the good chemical status demanded by the European Water Framework Directive (WFD). It also allows the natural background (or reference concentration values) of the areas under study to be estimated using a simple criterion. The starting point is the calculation of the World-Wide Natural Background Levels (WWNBLs) and World-Wide Threshold Values (WWTVs), two indexes that depend on the concentrations of seven elements present in sediments. These elements, As, Cd, Cr, Cu, Ni, Pb and Zn, were selected taking into account the recommendations of the UNEP (United Nations Environment Programme) and USEPA (United States Environmental Protection Agency), which describe them as elements of concern with respect to environmental toxicity. The methodology is exemplified in a case study that includes 134 sediment samples collected in 11 transitional water bodies from 7 different countries and 4 different continents. Six of the water bodies considered met the good chemical status demanded by the WFD. The rest of them exceeded the reference WWTVs for at least one of the elements. The estuaries of the Nerbioi-Ibaizabal (Basque Country) and Cavado (Portugal), the sea inlet of Río San Pedro (Spain), the Sepetiba Bay (Brazil) and the Yucateco lagoon (Mexico) belong to that group. PMID:26143082

  9. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Analog I and C systems have been replaced by digital I and C systems because digital systems offer many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drift, have higher data handling and storage capabilities, and provide improved performance in terms of accuracy and computational capability. In addition, analog replacement parts have become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research efforts to develop methodologies for software reliability assessment are still proceeding in safety-critical areas such as nuclear systems, aerospace and medical devices. Among them, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified and requirements for its application to the digital I and C systems are considered in this study

  10. Expanding the Parameters for Excellence in Patient Assignments: Is Leveraging an Evidence-Data-Based Acuity Methodology Realistic?

    Science.gov (United States)

    Gray, Joel; Kerfoot, Karlene

    2016-01-01

    Finding the balance of equitable assignments continues to be a challenge for health care organizations seeking to leverage evidence-based leadership practices. Ratios and subjective acuity strategies for nurse-patient staffing continue to be the dominant approach in health care organizations. In addition to ratio-based assignments and acuity-based assignment models driven by financial targets, more emphasis on using evidence-based leadership strategies to manage and create science for effective staffing is needed. In particular, nurse leaders are challenged to increase the sophistication of management of patient turnover (admissions, discharges, and transfers) and integrate tools from Lean methodologies and quality management strategies to determine the effectiveness of nurse-patient staffing. PMID:26636229

  11. Depletable resources and the economy.

    NARCIS (Netherlands)

    Heijman, W.J.M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts in relation to resource depletion are discussed: steady state, ti

  12. The Toxicity of Depleted Uranium

    Directory of Open Access Journals (Sweden)

    Wayne Briner

    2010-01-01

    Full Text Available Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a clear and defined set of symptoms. Chronic low-dose, or subacute, exposure to depleted uranium alters the appearance of milestones in developing organisms. Adult animals that were exposed to depleted uranium during development display persistent alterations in behavior, even after cessation of depleted uranium exposure. Adult animals exposed to depleted uranium demonstrate altered behaviors and a variety of alterations to brain chemistry. Despite its reduced level of radioactivity, evidence continues to accumulate that depleted uranium, if ingested, may pose a radiologic hazard. The current state of knowledge concerning DU is discussed.

  13. Benchmarking Public Policy : Methodological Insights from Measurement of School Based Management

    OpenAIRE

    Parandekar, Suhas D.

    2014-01-01

    This working paper presents a benchmarking analysis of School Based Management (SBM) using empirical data from the Philippines. School based management is widely used as a policy tool in many countries that seek to improve the quality of service delivery through decentralization. School based management typically takes many years to have an impact on educational outcomes, but policy makers...

  14. Methodology for qualification of wood-based ash according to REACH - prestudy

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeblom, Rolf (Tekedo AB, Nykoeping (Sweden)); Tivegaard, Anna-Maria (SSAB Merox AB, Oxeloesund (Sweden))

    2010-02-15

    The new European Union framework directive on waste is to be implemented during the year 2010. According to this directive, much of what is today regarded as waste will instead be assessed as by-products and will in many cases fall under the new European Union regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). REACH applies in conjunction with the new European Union regulation CLP (Classification, Labelling and Packaging of substances and mixtures). There are introductory periods for both of these regulations, and in the case of CLP this concerns the transition from the present and previous rules under the dangerous substances and dangerous preparations directives (DSD and DPD, respectively). Similarly, the new framework directive on waste supersedes the previous directive and some other statements. There is a connection between the directives on waste and the rules for classification and labelling, in that the classification of waste (into the categories hazardous and non-hazardous) builds on (but is not identical to) the rules for labelling. Similarly, the national Swedish rules for acceptance of recycled material (waste) for use in geotechnical constructions relate to the provisions in REACH on the assessment of chemical safety, in that both request that the risk be assessed to be small and that the same or similar methodologies can be applied to verify this. There is a 'reference alternative' in REACH that implies substantial testing prior to registration. Registration is the key to use of a substance, whether the substance is used as such, in a mixture, or is to be released from an article. However, REACH as well as CLP contain a number of provisions for using literature data, data on similar chemicals, etc., in order to avoid unnecessary testing. This especially applies to testing on humans and vertebrate animals. Vaermeforsk, through its Programme on Environmentally Friendly Use of Non-Coal Ashes, has developed methodologies and

  15. A Platform-Based Design Methodology With Contracts and Related Tools for the Design of Cyber-Physical Systems

    OpenAIRE

    Nuzzo, P; Sangiovanni-Vincentelli, AL; Bresolin, D; Geretti, L; Villa, T

    2015-01-01

    © 2015 IEEE. We introduce a platform-based design methodology that uses contracts to specify and abstract the components of a cyber-physical system (CPS), and provide formal support to the entire CPS design flow. The design is carried out as a sequence of refinement steps from a high-level specification to an implementation built out of a library of components at the lower level. We review formalisms and tools that can be used to specify, analyze, or synthesize the design at different levels ...

  16. On-line new event detection and clustering using the concepts of the cover coefficient-based clustering methodology

    OpenAIRE

    Vural, Ahmet

    2002-01-01

    Cataloged from PDF version of article. In this study, we use the concepts of the cover coefficient-based clustering methodology (C3M) for on-line new event detection and event clustering. The main idea of the study is to use the seed selection process of the C3M algorithm for the purpose of detecting new events. Since C3M works in a retrospective manner, we modify the algorithm to work in an on-line environment. Furthermore, in order to prevent producing oversize...

  17. Bases of general calculation thermohydrodynamic means (codes) verification and validation methodology for accident analysis at nuclear power plants

    International Nuclear Information System (INIS)

    Based on an analysis of previously known approaches, a generalised verification/validation (V/V) methodology for thermohydraulic calculation tools (codes) used in the analysis of accident and transient processes at NPPs is offered in this article. Taking into account the formulated requirements and principles, the basic V/V procedures, their interrelation and order are substantiated and considered. The procedure includes forming a criteria system for assessing the applicability of calculation tools, analysing the adequacy of the mathematical models to real processes, developing test data bases (including an analysis of the adequacy of test stands to full-scale conditions), and generalising the results into final assessments of the applicability of the calculation tools for specific tasks on specific equipment

  18. A methodology for obtaining the control rods patterns in a BWR using systems based on ants colonies

    International Nuclear Information System (INIS)

    In this work the AZCATL-PBC system, based on an ant-colony technique for the search of control rod patterns for the reactors of the Laguna Verde Nuclear Power Station (CNLV), is presented. The technique was applied to a transition cycle and to an equilibrium cycle. For both cycles, the keff values obtained with a Haling calculation were compared with those of the control rod pattern proposed by AZCATL-PBC at a fixed burnup. It was found that the methodology is able to extend the length of the cycle with respect to the Haling prediction while keeping the reactor safe. (Author)

  19. Estimation of internal radiation dose in human based on animal data. Application of methodology in drug metabolism and pharmacokinetics

    International Nuclear Information System (INIS)

    Before conducting a human study on a radiolabeled drug, the internal radiation dose is evaluated based on animal data. Generally, however, species differences in the elimination process of radioactivity, mostly in hepatic metabolism, are ignored. A correction methodology is described for drugs that are eliminated mostly by hepatic metabolism. We showed the validity of the method in which the hepatic clearances in animal and human are constructed from the hepatic blood flow, the protein-unbound fraction and the in vitro metabolic rate, and the calculated internal radiation exposure is corrected by the animal/human ratio of the hepatic clearance. (author)

  20. Visual methodologies and participatory action research: Performing women's community-based health promotion in post-Katrina New Orleans.

    Science.gov (United States)

    Lykes, M Brinton; Scheib, Holly

    2016-01-01

    Recovery from disaster and displacement involves multiple challenges including accompanying survivors, documenting effects, and rethreading community. This paper demonstrates how African-American and Latina community health promoters and white university-based researchers engaged visual methodologies and participatory action research (photoPAR) as resources in cross-community praxis in the wake of Hurricane Katrina and the flooding of New Orleans. Visual techniques, including but not limited to photonarratives, facilitated the health promoters': (1) care for themselves and each other as survivors of and responders to the post-disaster context; (2) critical interrogation of New Orleans' entrenched pre- and post-Katrina structural racism as contributing to the racialised effects of and responses to Katrina; and (3) meaning-making and performances of women's community-based, cross-community health promotion within this post-disaster context. This feminist antiracist participatory action research project demonstrates how visual methodologies contributed to the co-researchers' cross-community self- and other caring, critical bifocality, and collaborative construction of a contextually and culturally responsive model for women's community-based health promotion post 'unnatural disaster'. Selected limitations as well as the potential for future cross-community antiracist feminist photoPAR in post-disaster contexts are discussed. PMID:27080253

  1. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    Directory of Open Access Journals (Sweden)

    Gáll Zs. Sz.

    2015-08-01

    Full Text Available The scientific substrate of the study relies upon the concept of mirror neurons. Unlike other neurons, these are characterized by an imitation feature. They play an important role in learning processes – especially during childhood, enabling the imitation of motions and determining their primary acquisition. Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool education, resorting to demonstration procedures as the main teaching-learning method. We thus expect mirror neuron reactivity to be engaged more thoroughly, with a view to enhancing the subjects' psychomotor development in terms of body scheme, self-image and performance of basic postures and motions. For the research, an experimental group and a control group were set up, and the children's psychomotor development level was assessed both before the application of the independent variable and after its effects upon the experimental group. Once the planned procedure was completed, the experimental group members showed a significant improvement in the investigated psychomotor fields as compared to the control group.

  2. Study on an ISO 15926 based data modeling methodology for nuclear power industry

    International Nuclear Information System (INIS)

    The scope is therefore data integration and data to support the whole life of a plant. This representation is specified by a generic, conceptual Data Model (DM) that is independent of any particular application, but that is able to record data from the applications used in plant design, fabrication and operation. The data model is designed to be used in conjunction with Reference Data (RD): standard instances of the DM that represent information common to a number of users, plants, or both. This paper introduces a high-level description of the structure of ISO 15926 and how this can be adapted to the nuclear power plant industry in particular. This paper introduces the ISO 15926 methodology and how to extend the existing RDL for the nuclear power industry. As the ISO 15926 representation is independent of applications, interfaces to existing or future applications have to be developed. Such interfaces are provided by Templates that take input from external sources and 'lift' it into an ISO 15926 repository, and/or 'lower' the data into other applications. This is similar to the process defined by the W3C. Data exchange can be done using e.g. XML messages, but the modelling is independent of the technology used for the exchange.

  3. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and for street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight into and better understanding of the regional development model in Greece, and forms the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
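    As a rough illustration of the cluster-and-outlier step mentioned above, the sketch below computes the Anselin local Moran's I statistic with plain NumPy on a hand-made contiguity matrix. The prefecture layout, demand figures and weighting scheme are invented for the example and are not the study's data or code.

    import numpy as np

    def local_morans_i(values, W):
        """values: variable per areal unit (e.g. annual demand per prefecture).
        W: binary contiguity matrix (1 where two prefectures share a border)."""
        z = (values - values.mean()) / values.std()
        Wr = W / W.sum(axis=1, keepdims=True)   # row-standardised weights
        return z * (Wr @ z)                     # I_i > 0: similar to neighbours (cluster)

    # Toy example: four hypothetical prefectures arranged in a chain 0-1-2-3.
    demand = np.array([120.0, 135.0, 40.0, 45.0])   # GWh, illustrative only
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(local_morans_i(demand, W))

    A hot spot analysis in the Getis-Ord Gi* sense would replace the cross-product with a neighbourhood sum of the raw values, but the data-preparation steps are the same.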

  4. Study on an ISO 15926 based data modeling methodology for nuclear power industry

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, Yang Ho; Park, Byeong Ho; Park, Seong Chan; Kim, Eun Kee [KEPCO E-C, Yongin (Korea, Republic of)

    2014-10-15

    The scope is therefore data integration and data to support the whole life of a plant. This representation is specified by a generic, conceptual Data Model (DM) that is independent of any particular application, but that is able to record data from the applications used in plant design, fabrication and operation. The data model is designed to be used in conjunction with Reference Data (RD): standard instances of the DM that represent information common to a number of users, plants, or both. This paper introduces a high-level description of the structure of ISO 15926 and how this can be adapted to the nuclear power plant industry in particular. This paper introduces the ISO 15926 methodology and how to extend the existing RDL for the nuclear power industry. As the ISO 15926 representation is independent of applications, interfaces to existing or future applications have to be developed. Such interfaces are provided by Templates that take input from external sources and 'lift' it into an ISO 15926 repository, and/or 'lower' the data into other applications. This is similar to the process defined by the W3C. Data exchange can be done using e.g. XML messages, but the modelling is independent of the technology used for the exchange.

  5. Vibrational Study and Force Field of the Citric Acid Dimer Based on the SQM Methodology

    Directory of Open Access Journals (Sweden)

    Laura Cecilia Bichara

    2011-01-01

    Full Text Available We have carried out a structural and vibrational theoretical study of the citric acid dimer. The Density Functional Theory (DFT) method at the B3LYP/6-31G∗ and B3LYP/6-311++G∗∗ levels has been used to study its structure and vibrational properties. Then, in order to get a good assignment of the IR and Raman spectra of the dimer in the solid phase, the best possible fit between the calculated and recorded frequencies was carried out and the force fields were scaled using the Scaled Quantum Mechanical Force Field (SQMFF) methodology. An assignment of the observed spectral features is proposed. A band of medium intensity at 1242 cm−1, together with a group of weak bands previously not assigned to the monomer, was in this case assigned to the dimer. Furthermore, the analysis of the Natural Bond Orbitals (NBOs) and of the topological properties of the electronic charge density, employing Bader's Atoms in Molecules (AIM) theory, was carried out for the dimer to study the charge transfer interactions of the compound.

  6. A radioisotope based methodology for plant-fungal interactions in the rhizosphere

    Energy Technology Data Exchange (ETDEWEB)

    Weisenberger, A. G.; Bonito, G.; Lee, S.; McKisson, J. E.; Gryganskyi, A.; Reid, C. D.; Smith, M. F.; Vaidyanathan, G.; Welch, B.

    2013-10-01

    In plant ecophysiology research there is interest in studying the biology of the rhizosphere because of its importance in plant nutrient interactions. The rhizosphere is the zone of soil surrounding a plant's root system where microbes (such as fungi) are influenced by the root and the roots by the microbes. We are investigating a methodology for imaging the distribution of molecular compounds of interest in the rhizosphere without disturbing the root or soil habitat. Our intention is to develop a single photon emission computed tomography (SPECT) system (PhytoSPECT) to image the bio-distribution of fungi in association with a host plant's roots. The technique we are exploring makes use of radioactive isotopes as tracers to label molecules that bind to fungal-specific compounds of interest and to image the fungal distribution in the plant and/or soil. We report on initial experiments designed to test the ability of fungal-specific compounds labeled with an iodine radioisotope to bind to chitin monomers (N-acetylglucosamine). Chitin is a compound not found in roots but in fungal cell walls. We will test the ability to label the compound with radioactive isotopes of iodine (125I and 123I).

  7. Predicting Pedestrian Flow: A Methodology and a Proof of Concept Based on Real-Life Data

    OpenAIRE

    Maria Davidich; Gerta Köster

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For in...

  8. Towards a cognitive robotics methodology for reward-based decision-making: dynamical systems modelling of the Iowa Gambling Task

    Science.gov (United States)

    Lowe, Robert; Ziemke, Tom

    2010-09-01

    The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.

  9. Case-Based Methodology as an Instructional Strategy for Understanding Diversity: Preservice Teachers' Perceptions

    Science.gov (United States)

    Butler, Malcolm B.; Lee, Seungyoun; Tippins, Deborah J.

    2006-01-01

    Case-based pedagogy focuses on teachers' problem solving, decision making, reflective practices and their own personalized theory about teaching and learning. Research has shown that case-based pedagogy enables teachers to improve their actions in teaching and learning from multiple perspectives, reflective thinking, active participation and…

  10. Brain-Based Learning and Classroom Practice: A Study Investigating Instructional Methodologies of Urban School Teachers

    Science.gov (United States)

    Morris, Lajuana Trezette

    2010-01-01

    The purpose of this study was to examine the implementation of brain-based instructional strategies by teachers serving at Title I elementary, middle, and high schools within the Memphis City School District. This study was designed to determine: (a) the extent to which Title I teachers applied brain-based strategies, (b) the differences in…

  11. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001), such as: individual covariates, Bayesian methods, and multi–state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi–state models. Until recently, the only goodness–of–fit procedures for multistate models were ad hoc, and non optimal, involving use of standard tests for single state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  12. A Methodology for Recognition of Emotions Based on Speech Analysis, for Applications to Human-Robot Interaction. An Exploratory Study

    Directory of Open Access Journals (Sweden)

    Rabiei Mohammad

    2014-01-01

    Full Text Available A system for recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. In this paper, we carry out an exploratory study on the possibility of using the proposed methodology to recognize basic emotions (sadness, surprise, happiness, anger, fear and disgust) based on phonetic and acoustic properties of emotive speech with the minimal use of signal processing algorithms. We set up an experimental test consisting of three types of speakers, namely: (i) five adult European speakers, (ii) five Asian (Middle East) adult speakers and (iii) five adult American speakers. The speakers had to repeat 6 sentences in English (with durations typically between 1 s and 3 s) in order to emphasize rising-falling intonation and pitch movement. Intensity, peak and range of pitch and speech rate have been evaluated. The proposed methodology consists of generating and analyzing a graph of formant, pitch and intensity, using the open-source PRAAT program. From the experimental results, it was possible to recognize the basic emotions in most of the cases.
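    The methodology described above amounts to mapping a handful of prosodic measurements (intensity, pitch peak and range, speech rate) onto the six basic emotions. The toy decision function below only illustrates that idea; the feature names, thresholds and decision order are hypothetical and are not taken from the paper.

    # Illustrative only: thresholds and rules are invented, not the paper's.
    def classify_emotion(pitch_mean_hz, pitch_range_hz, intensity_db, speech_rate_syl_s):
        if pitch_mean_hz > 250 and pitch_range_hz > 120 and intensity_db > 70:
            return "anger" if speech_rate_syl_s > 5 else "surprise"
        if pitch_mean_hz > 220 and intensity_db > 65:
            return "happiness"
        if pitch_mean_hz < 160 and intensity_db < 60 and speech_rate_syl_s < 3.5:
            return "sadness"
        if pitch_range_hz > 100 and speech_rate_syl_s > 5.5:
            return "fear"
        return "disgust"   # fallback bucket in this toy decision tree

    print(classify_emotion(270, 140, 72, 5.4))   # -> "anger"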

  13. A stale challenge to the philosophy of science: commentary on "Is psychology based on a methodological error?" by Michael Schwarz.

    Science.gov (United States)

    Ruck, Nora; Slunecko, Thomas

    2010-06-01

    In his article "Is psychology based on a methodological error?", and on a quite convincing empirical basis, Michael Schwarz offers a methodological critique of one of mainstream psychology's key test-theoretical axioms, i.e., that of the in-principle normal distribution of personality variables. It is characteristic of this paper--and at first seems to be a strength of it--that the author positions his critique within a frame of philosophy of science, particularly placing himself in the tradition of Karl Popper's critical rationalism. When scrutinizing Schwarz's arguments, however, we find Schwarz's critique profound only as an immanent critique of test-theoretical axioms. We raise doubts, however, as to Schwarz's alleged 'challenge' to the philosophy of science, because the author does not at all seem to be in touch with the state of the art of contemporary philosophy of science. Above all, we question the universalist undercurrent that Schwarz's 'bio-psycho-social model' of human judgment boils down to. In contrast to such a position, we close our commentary with a plea for a context- and culture-sensitive philosophy of science. PMID:20369392

  14. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
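    As a hedged illustration of the renewal-model bookkeeping behind such elastic-rebound probabilities, the sketch below evaluates a time-dependent conditional rupture probability from a lognormal recurrence model. The choice of a lognormal (rather than, say, a BPT distribution) and all parameter values are assumptions made for the example, not the paper's.

    import numpy as np
    from scipy.stats import lognorm

    def conditional_prob(t_since, dt, mean_ri, aperiodicity):
        """P(t < T <= t+dt | T > t) for recurrence interval T (years)."""
        sigma = np.sqrt(np.log(1.0 + aperiodicity**2))   # lognormal shape from the CV
        scale = mean_ri / np.exp(0.5 * sigma**2)          # chosen so that E[T] = mean_ri
        F = lambda x: lognorm.cdf(x, s=sigma, scale=scale)
        return (F(t_since + dt) - F(t_since)) / (1.0 - F(t_since))

    # 150 years since the last event, 30-year forecast window, 200-year mean interval.
    print(conditional_prob(t_since=150.0, dt=30.0, mean_ri=200.0, aperiodicity=0.5))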

  15. A proposal for transmission pricing methodology in Thailand based on electricity tracing and long-run average incremental cost

    International Nuclear Information System (INIS)

    Although there is no universally accepted methodology for restructuring the electricity supply industry, the transformations often involve separation of generation and transmission. Such separation results in a need for a transmission service charge to be levied on the system users. The National Energy Policy Office (NEPO) of Thailand has commissioned PricewaterhouseCoopers (PwC) to propose a transmission service charge that is to be used during the market reform for the transmission business unit of the Electricity Generating Authority of Thailand (EGAT). Although PwC's transmission use of system charge (TUOS), based on the long-run average incremental cost (LRAIC) and average transmission loss, can satisfy the financial requirements, the charge allocations are not economically efficient since they do not provide any locational signal which could reflect costs imposed on the system by locating a system user in a particular geographical location. This paper describes the TUOS methodology suggested by PwC and makes a comparison with a transmission pricing method based on a combination of electricity tracing and LRAIC. The results indicate that, with electricity tracing, the charge allocations are improved in terms of fairness, as the charge reflects the geographical location and system conditions.

  16. Project Management Methodology for the Development of M-Learning Web Based Applications

    Directory of Open Access Journals (Sweden)

    Adrian VISOIU

    2010-01-01

    Full Text Available M-learning web based applications are a particular case of web applications designed to be operated from mobile devices. Also, their purpose is to implement learning aspects. Project management of such applications takes into account the identified peculiarities. M-learning web based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influences over the analysis, design, construction and testing phases. Activities building up a work breakdown structure for development of m-learning web based applications are presented. Project monitoring and control techniques are proposed. Resources required for projects are discussed.

  17. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    Science.gov (United States)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and sector-based approach are presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

  18. Associations - Communities - Residents. Building together a citizen-based project of renewable energies - Methodological guide

    International Nuclear Information System (INIS)

    This guide first outlines the challenges and stakes of citizen-based renewable energies: the example of a necessary energy transition in Brittany, the interest of a local production of renewable energies, examples in other European countries, and the emergence of a citizen-based energy movement in France. The second part presents the four main phases of such a project (diagnosis, development, construction, and exploitation), the main issues to be addressed, and the main steps of a citizen-based renewable energy project (technical, legal and financial, and citizen-related aspects during the different phases). The third part describes how to elaborate a citizen-based project: by addressing the project dimensions, defining a legal specification, performing a provisional business model, choosing an appropriate legal structure, creating a project company, and mobilizing local actors. The last part addresses how to finance the project: by building up own funds, by asking banks for support, and through citizen participation in the investment

  19. Developing a Methodology for Supplier Base Reduction : A Case Study at Dynapac GmbH

    OpenAIRE

    Böris, Elin; Hall, Vendela

    2015-01-01

    Dynapac GmbH is a manufacturer of road construction equipment and has historically been acquired and merged with several companies, resulting in an expansion of their supplier base. Currently, they are experiencing a large supplier base within direct material causing a decrease in the effectiveness and efficiency in the management of the suppliers. Dynapac GmbH therefore wishes to lower the number of suppliers in order to obtain desired effects, such as cost savings, reduction of administrati...

  20. Project Management Methodology for the Development of M-Learning Web Based Applications

    OpenAIRE

    Adrian VISOIU

    2010-01-01

    M-learning web based applications are a particular case of web applications designed to be operated from mobile devices. Also, their purpose is to implement learning aspects. Project management of such applications takes into account the identified peculiarities. M-learning web based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influences over the analysis, d...

  1. Comparison of Different Ground-Based NDVI Measurement Methodologies to Evaluate Crop Biophysical Properties

    OpenAIRE

    Rossana Monica Ferrara; Costanza Fiorentino; Nicola Martinelli; Pasquale Garofalo; Gianfranco Rana

    2010-01-01

    The usage of vegetation indices such as the Normalized Difference Vegetation Index (NDVI) calculated by means of remote sensing data is widely spread for describing vegetation status on large space scale. However, a big limitation of these indices is their inadequate time resolution for agricultural purposes. This limitation could be overcome by the ground-based vegetation indices that could provide an interesting tool for integrating satellite-based value. In this work, three techniques to c...

  2. A test of a physically-based strong ground motion prediction methodology with the 26 September 1997, Mw = 6.0 Colfiorito (Umbria-Marche sequence), Italy earthquake.

    OpenAIRE

    Scognamiglio, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia; Hutchings, L.; Lawrence Berkeley National Laboratory

    2009-01-01

    We test the physically-based ground motion hazard prediction methodology of Hutchings et al. [Hutchings, L., Ioannidou, E., Kalogeras, I., Voulgaris, N., Savy, J., Foxall, W., Scognamiglio, L., and Stavrakakis, G., (2007). A physically-based strong ground motion prediction methodology; Application to PSHA and the 1999 M = 6.0 Athens Earthquake. Geophys. J. Int. 168, 569–680.] through an a posteriori prediction of the 26 September 1997, Mw 6.0 Colfiorito (Umbria–Marche, Italy) earthquake at fo...

  3. A physically based strong ground-motion prediction methodology; application to PSHA and the 1999 Mw = 6.0 Athens earthquake

    OpenAIRE

    Hutchings, L.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Ioannidou, E.; Department of Geophysics-Geothermics, University of Athens, Athens 15783, Greece; Foxall, W.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Voulgaris, N.; Department of Geophysics-Geothermics, University of Athens, Athens 15783, Greece; Savy, J.; Lawrence Livermore National Laboratory, Hazards Mitigation Center, PO Box 808, L-201, Livermore, CA 94551-0808, USA.; Kalogeras, I.; Institute of Geodynamics, National Observatory of Athens, Athens, Greece; Scognamiglio, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia; Stavrakakis, G.; Institute of Geodynamics, National Observatory of Athens, Athens, Greece

    2007-01-01

    We present a physically based methodology to predict the range of ground-motion hazard for earthquakes along specific faults or within specific source volumes, and we demonstrate how to incorporate this methodology into probabilistic seismic hazard analyses (PSHA). By ‘physically based,’ we refer to ground-motion syntheses derived from physics and an understanding of the earthquake process. This approach replaces the aleatory uncertainty that current PSHA studies estimate by re...

  4. Producing K indices by the interactive method based on the traditional hand-scaling methodology - preliminary results

    Science.gov (United States)

    Valach, Fridrich; Váczyová, Magdaléna; Revallo, Miloš

    2016-01-01

    This paper reports on an interactive computer method for producing K indices. The method is based on the traditional hand-scaling methodology that had been practised at Hurbanovo Geomagnetic Observatory until the end of 1997. Here, the performance of the method was tested on the data of the Kakioka Magnetic Observatory. We have found that in some ranges of the K-index values our method might be a beneficial supplement to the computer-based methods approved and endorsed by IAGA. This result was achieved for both very low (K=0) and high (K ≥ 5) levels of geomagnetic activity. The method incorporated an interactive procedure of selecting quiet days by a human operator (observer). This introduces a certain amount of subjectivity, just as the traditional hand-scaling method does.
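    For readers unfamiliar with the quantity being produced, the sketch below shows the standard conversion of a 3-hour disturbance range (with the quiet-day variation already removed) into a K index on the quasi-logarithmic Bartels scale. The K = 9 lower limit of 500 nT is a typical mid-latitude value used purely for illustration; each observatory has its own limit, and nothing below reproduces the interactive quiet-day selection that the paper describes.

    K9_LIMIT_NT = 500.0
    # Standard relative lower limits of the Bartels scale for K = 1..9.
    RELATIVE_LIMITS = [0.01, 0.02, 0.04, 0.08, 0.14, 0.24, 0.40, 0.66, 1.00]

    def k_index(range_nt, k9_limit=K9_LIMIT_NT):
        for k, frac in enumerate(RELATIVE_LIMITS, start=1):
            if range_nt < frac * k9_limit:
                return k - 1
        return 9

    for r in (3.0, 45.0, 210.0, 600.0):        # nT ranges of four 3-hour intervals
        print(r, "nT -> K =", k_index(r))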

  5. Evolutionary programming-based methodology for economical output power from PEM fuel cell for micro-grid application

    Science.gov (United States)

    El-Sharkh, M. Y.; Rahman, A.; Alam, M. S.

    This paper presents a methodology for finding the optimal output power from a PEM fuel cell power plant (FCPP). The FCPP is used to supply power to a small micro-grid community. The technique used is based on evolutionary programming (EP) to find a near-optimal solution of the problem. The method incorporates the Hill-Climbing technique (HCT) to maintain feasibility during the solution process. An economic model of the FCPP is used. The model considers the production cost of energy and the possibility of selling and buying electrical energy from the local grid. In addition, the model takes into account the thermal energy output from the FCPP and the thermal energy requirement for the micro-grid community. The results obtained are compared against a solution based on genetic algorithms. Results are encouraging and indicate viability of the proposed technique.
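    A minimal sketch of the optimisation idea follows, assuming a deliberately simplified hourly cost model (fuel cost plus energy bought from, minus energy sold to, the local grid). It is not the paper's economic model and omits the thermal-energy terms and the hill-climbing feasibility repair; it only shows the evolutionary-programming loop itself.

    import random

    P_MIN, P_MAX = 5.0, 45.0               # kW output limits (illustrative)
    DEMAND = 60.0                          # kW electrical demand of the micro-grid
    TARIFF_BUY, TARIFF_SELL = 0.12, 0.07   # $/kWh, illustrative
    FUEL_COST = 0.04                       # $/kWh of electrical output, illustrative

    def cost(p):
        fuel = FUEL_COST * p
        shortfall = max(DEMAND - p, 0.0) * TARIFF_BUY    # bought from the grid
        surplus = max(p - DEMAND, 0.0) * TARIFF_SELL     # sold back to the grid
        return fuel + shortfall - surplus

    def evolve(pop_size=20, generations=200, sigma=2.0):
        pop = [random.uniform(P_MIN, P_MAX) for _ in range(pop_size)]
        for _ in range(generations):
            # Each parent yields one Gaussian-mutated child, clipped to the feasible range.
            children = [min(max(p + random.gauss(0.0, sigma), P_MIN), P_MAX) for p in pop]
            # (mu + lambda) selection: keep the cheapest individuals.
            pop = sorted(pop + children, key=cost)[:pop_size]
        return pop[0]

    best = evolve()
    print(f"best output {best:.1f} kW, hourly cost {cost(best):.3f} $")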

  6. Ecological modernization of socio-economic development of the region in the context of social transformations: theoretical and methodological bases

    Directory of Open Access Journals (Sweden)

    O.V. Shkarupa

    2015-09-01

    Full Text Available The aim of the article. The aim of the article is to study the theoretical and methodological bases of ecological modernization of the socio-economic development of a region. The results of the analysis. This paper studies the scientific basis of transformation processes for sustainable development, which is important at this time. Reviewing the history of the sustainable development concept, the author emphasizes that if the basic guidelines for upgrading social and economic systems towards a «green» economy are followed, the results of ecological modernization can be expected. This saves funds by forestalling economic damage from pollution and by avoiding the environmental costs of compensating for and «rehabilitating» resources and territories. Moreover, prevention of anthropogenic pressure increases the chances of social systems to improve the quality of life and the health of the nation. From an economic point of view, clean production is more competitive. This study considers the theoretical and methodological bases of ecological modernization of socio-economic systems in a region, which involves setting special reference points of development. Ecological modernization is a prerequisite for ecological transformation based on quality eco-oriented reforms in social and economic systems. Ecologically safe transformation of socio-economic development means certain progressive changes and transformations (intersystem and intersystem-synergistic) that are strategic in view of eco-focused goal-setting. Such an understanding provides for: 1) understanding transformation as a process that is already identified in the environmental trend of the socio-economic system; 2) spatial certainty of eco-oriented reforms in connection with the qualities of the future development of the system. Arguably, this can and should lead to structural changes in innovation for sustainable development. Conclusions and directions of further research. It is shown that

  7. A Geographical Information System (GIS) based methodology for determination of potential biomasses and sites for biogas plants in southern Finland

    International Nuclear Information System (INIS)

    Highlights: • The biomethane potential in southern Finland is over 3 TWh. • Agricultural biomass accounts >90% of the biomethane potential in study regions. • The GIS method can be used for detailed biogas plant planning. • The GIS provides tools for e.g. land locations, cost and emission calculations. - Abstract: Objective: The objective of this study was to analyse the spatial distribution and amount of potential biomass feedstock for biomethane production and optimal locations, sizes and number of biogas plants in southern Finland in the area of three regional waste management companies. Methods: A Geographical Information System (GIS) based methodology, which also included biomass transport optimisation considering the existing road network and spatially varied biomass sources, was used. Kernel Density (KD) maps were calculated to pinpoint areas with high biomass concentration. Results: The results show that the total amount of biomass corresponds to 2.8 TWh of energy of which agro materials account for more than 90%. It was found that a total of 49 biogas plants could be built in three case regions with feedstock available within maximum transportation radius of 10 or 40 km. With maximum of 10 km biomass transportation distance, the production capacity of the planned plants ranges from 2.1 to 8.4 MW. If transportation distance was increased to 40 km, the plant capacities could also increase from 2.3 to 16.8 MW. Conclusions: As demonstrated in this study, the studied GIS methodology can be used for identification of the most suitable locations for biogas plants by providing the tools for e.g. transportation routes and distances. Practice implications: The methodology can further be used in environmental impact assessment as well as in cost analysis
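    The kernel-density step mentioned in the abstract can be pictured with the small sketch below: a weighted 2-D Gaussian KDE over point sources of biomass, where each source is weighted by its annual energy content. Coordinates and energies are made up, and the real study's road-network transport optimisation is not represented.

    import numpy as np
    from scipy.stats import gaussian_kde

    # x, y in km; energy in GWh/yr (all illustrative)
    x = np.array([2.0, 3.5, 10.0, 11.2, 25.0])
    y = np.array([1.0, 2.2,  9.5, 10.8, 24.1])
    energy = np.array([5.0, 3.0, 12.0, 8.0, 2.0])

    kde = gaussian_kde(np.vstack([x, y]), weights=energy)

    # Evaluate the density on a coarse grid to locate biomass "hot spots".
    gx, gy = np.mgrid[0:30:31j, 0:30:31j]
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
    i, j = np.unravel_index(density.argmax(), density.shape)
    print("densest cell at ~", gx[i, j], "km,", gy[i, j], "km")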

  8. Estimating initial contaminant mass based on fitting mass-depletion functions to contaminant mass discharge data: Testing method efficacy with SVE operations data

    Science.gov (United States)

    Mainhagu, J.; Brusseau, M. L.

    2016-09-01

    The mass of contaminant present at a site, particularly in the source zones, is one of the key parameters for assessing the risk posed by contaminated sites, and for setting and evaluating remediation goals and objectives. This quantity is rarely known and is challenging to estimate accurately. This work investigated the efficacy of fitting mass-depletion functions to temporal contaminant mass discharge (CMD) data as a means of estimating initial mass. Two common mass-depletion functions, exponential and power functions, were applied to historic soil vapor extraction (SVE) CMD data collected from 11 contaminated sites for which the SVE operations are considered to be at or close to essentially complete mass removal. The functions were applied to the entire available data set for each site, as well as to the early-time data (the initial 1/3 of the data available). Additionally, a complete differential-time analysis was conducted. The latter two analyses were conducted to investigate the impact of limited data on method performance, given that the primary mode of application would be to use the method during the early stages of a remediation effort. The estimated initial masses were compared to the total masses removed for the SVE operations. The mass estimates obtained from application to the full data sets were reasonably similar to the measured masses removed for both functions (13 and 15% mean error). The use of the early-time data resulted in a minimally higher variation for the exponential function (17%) but a much higher error (51%) for the power function. These results suggest that the method can produce reasonable estimates of initial mass useful for planning and assessing remediation efforts.
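    One way to picture the fitting step is the short sketch below: an exponential mass-depletion function is fitted to a synthetic CMD record and the initial mass is taken as the closed-form integral of the fitted curve. The data, the choice of function and the integration-to-infinity shortcut are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.optimize import curve_fit

    def cmd_exponential(t, cmd0, k):
        return cmd0 * np.exp(-k * t)

    # Synthetic SVE record: elapsed time in days, CMD in kg/day.
    t = np.array([0, 30, 90, 180, 365, 540, 730], dtype=float)
    cmd = np.array([12.0, 10.1, 7.4, 4.9, 2.1, 1.0, 0.5])

    (cmd0, k), _ = curve_fit(cmd_exponential, t, cmd, p0=(10.0, 0.005))
    initial_mass = cmd0 / k   # integral of cmd0*exp(-k t) from 0 to infinity
    print(f"fitted CMD0 = {cmd0:.1f} kg/d, k = {k:.4f} 1/d, initial mass ~ {initial_mass:.0f} kg")

    Restricting the fit to the first third of the record, as the study does to mimic early-time application, only requires slicing t and cmd before the call to curve_fit.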

  9. Application of backtracking algorithm to depletion calculations

    International Nuclear Information System (INIS)

    Based on the theory of the linear chain method for analytical depletion calculations, the burn-up matrix is decoupled by the divide and conquer strategy and the linear chain with Markov characteristic is formed. The density, activity and decay heat of every nuclide in the chain can be calculated by analytical solutions. Every possible reaction path of the nuclide must be considered during the linear chain establishment process. To ensure the calculation precision and efficiency, an algorithm which can cover all the reaction paths of the nuclide and search the paths automatically according to the problem description and precision restrictions should be sought. Through analysis and comparison of several kinds of searching algorithms, the backtracking algorithm was selected to search and calculate the linear chains using the Depth First Search (DFS) method. The depletion program can solve the depletion problem adaptively and with high fidelity. The solution space and time complexity of the program were analyzed. The newly developed depletion program was coupled with the Monte Carlo program MCMG-II to calculate the benchmark burn-up problem of the first core of the China Experimental Fast Reactor (CEFR). The initial verification and validation of the program were performed by this calculation. (author)
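    The search itself can be pictured with the toy sketch below: a depth-first enumeration of linear transmutation chains from a parent nuclide, backtracking once a branch's cumulative weight falls below a cut-off. The miniature reaction/decay graph and branching ratios are invented for the example and have nothing to do with the MCMG-coupled program's data.

    # Illustrative only: graph topology and branching ratios are invented.
    GRAPH = {
        "U238":  [("U239", 1.0)],                 # (n,gamma)
        "U239":  [("Np239", 1.0)],                # beta decay
        "Np239": [("Pu239", 1.0)],
        "Pu239": [("Pu240", 0.7), ("FP", 0.3)],   # capture vs fission, illustrative split
        "Pu240": [],
        "FP":    [],
    }

    def enumerate_chains(start, cutoff=1e-3):
        chains = []

        def dfs(node, path, weight):
            successors = GRAPH.get(node, [])
            if not successors or weight < cutoff:
                chains.append((path, weight))     # terminate this linear chain
                return
            for child, branch in successors:
                dfs(child, path + [child], weight * branch)   # descend, then backtrack

        dfs(start, [start], 1.0)
        return chains

    for chain, w in enumerate_chains("U238"):
        print(" -> ".join(chain), f"(weight {w:.2f})")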

  10. Application of backtracking algorithm to depletion calculations

    International Nuclear Information System (INIS)

    Based on the theory of the linear chain method for analytical depletion calculations, the burnup matrix is decoupled by the divide and conquer strategy and the linear chain with Markov characteristic is formed. The density, activity and decay heat of every nuclide in the chain can then be calculated by analytical solutions. Every possible reaction path of the nuclide must be considered during the linear chain establishment process. To ensure the calculation precision and efficiency, an algorithm which can cover all the reaction paths and search the paths automatically according to the problem description and precision restrictions should be found. Through analysis and comparison of several kinds of searching algorithms, the backtracking algorithm was selected to establish and calculate the linear chains using the depth first search (DFS) method, forming an algorithm which can solve the depletion problem adaptively and with high fidelity. The complexity of the solution space and time was analyzed by taking into account the depletion process and the characteristics of the backtracking algorithm. The newly developed depletion program was coupled with the Monte Carlo program MCMG-II to calculate the benchmark burnup problem of the first core of the China Experimental Fast Reactor (CEFR), and the preliminary verification and validation of the program were performed. (authors)

  11. Novel 3-D Object Recognition Methodology Employing a Curvature-Based Histogram

    Directory of Open Access Journals (Sweden)

    Liang-Chia Chen

    2013-07-01

    Full Text Available In this paper, a new object recognition algorithm employing a curvature-based histogram is presented. Recognition of three-dimensional (3-D) objects using range images remains one of the most challenging problems in 3-D computer vision due to noisy and cluttered scene characteristics. The key breakthroughs for this problem mainly lie in defining unique features that distinguish among various 3-D objects. In our approach, an object detection scheme is developed to identify targets through an automated search in the range images, using an initial process of object segmentation to subdivide all possible objects in the scenes and then applying a process of object recognition based on geometric constraints and a curvature-based histogram. The developed method has been verified through experimental tests confirming its feasibility.
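    A heavily simplified sketch of the descriptor idea follows: a discrete Laplacian of a depth map serves as a crude curvature proxy, its values are binned into a normalised histogram, and two histograms are compared with a chi-square distance. This is not the published algorithm (which works on segmented range images with geometric constraints); the synthetic surfaces and parameters are assumptions for the example.

    import numpy as np

    def curvature_histogram(depth, bins=16, lim=1.0):
        # Discrete Laplacian of the depth map as a rough curvature measure.
        lap = (np.roll(depth, 1, 0) + np.roll(depth, -1, 0) +
               np.roll(depth, 1, 1) + np.roll(depth, -1, 1) - 4.0 * depth)
        hist, _ = np.histogram(lap, bins=bins, range=(-lim, lim))
        return hist / (hist.sum() + 1e-12)

    def chi_square(h1, h2):
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-12))

    # Two synthetic "objects" on a 64x64 grid: a sphere-like cap and a tilted plane.
    yy, xx = np.mgrid[-1:1:64j, -1:1:64j]
    cap = np.sqrt(np.clip(1.5 - xx**2 - yy**2, 0.0, None))
    plane = 0.3 * xx + 0.1 * yy

    print("cap vs plane:", chi_square(curvature_histogram(cap), curvature_histogram(plane)))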

  12. Isodose-based methodology for minimizing the morbidity and mortality of thoracic hypofractionated radiotherapy

    International Nuclear Information System (INIS)

    Background and purpose: Help identify and define potential normal tissue dose constraints to minimize the mortality and morbidity of hypofractionated lung radiotherapy. Materials and methods: A method to generate isodose-based constraints and visually evaluate treatment plans, based on the published peer reviewed literature and the linear quadratic model, is presented. The radiobiological analysis assumes that the linear quadratic model is valid up to 28 Gy per fraction, the α/β ratio is 2 for the spinal cord and brachial plexus, 4 for pneumonitis, 4 or 10 for acute skin reactions depending on treatment length, and 3 for late complications in other normal tissues. A review of the literature was necessary to identify possible endpoints and normal tissue constraints for thoracic hypofractionated lung radiotherapy. Results: Preliminary normal tissue constraints to reduce mortality and morbidity were defined for organs at risk based upon hypofractionated lung radiotherapy publications. A modified dose nomenclature was introduced to facilitate the comparison of hypofractionated doses. Potential side effects from hypofractionated lung radiotherapy such as aortic dissection, neuropathy, and fatal organ perforation rarely seen in conventional treatments were identified. The isodose-based method for treatment plan analysis and normal tissue dose constraint simplification was illustrated. Conclusions: The radiobiological analysis based on the LQ method, biologically equivalent dose nomenclature, and isodose-based method proposed in this study simplifies normal tissue dose constraints and treatment plan evaluation. This may also be applied to extrathoracic hypofractionated radiotherapy. Prospective validation of these preliminary thoracic normal tissue dose constraints for hypofractionated lung radiotherapy is necessary.
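    The linear-quadratic bookkeeping that the analysis relies on can be summarised in a few lines; the sketch below computes the biologically effective dose (BED) and its 2-Gy-per-fraction equivalent (EQD2) for a hypofractionated schedule. The example schedule is illustrative; the alpha/beta value of 3 for late-reacting normal tissue follows the abstract.

    def bed(n_fractions, dose_per_fraction, alpha_beta):
        """Biologically effective dose, BED = n*d*(1 + d/(alpha/beta))."""
        return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

    def eqd2(n_fractions, dose_per_fraction, alpha_beta):
        """Equivalent total dose delivered in 2-Gy fractions."""
        return bed(n_fractions, dose_per_fraction, alpha_beta) / (1.0 + 2.0 / alpha_beta)

    # Example: 3 fractions of 18 Gy (within the 28 Gy/fraction validity limit stated above).
    print("BED  =", bed(3, 18.0, 3.0), "Gy")           # 378.0 Gy(3)
    print("EQD2 =", round(eqd2(3, 18.0, 3.0), 1), "Gy")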

  13. DEVELOPMENT OF THE CONTROL METHODOLOGY OF THE GIANT MAGNETOSTRICTIVE ACTUATOR BASED ON MAGNETIC FLUX DENSITY

    Institute of Scientific and Technical Information of China (English)

    Jia Zhenyuan; Yang Xing; Shi Chun; Guo Dongming

    2003-01-01

    According to the principle of the magnetostriction generating mechanism, a control model of giant magnetostrictive material based on the magnetic field, and a control method based on magnetic flux density, are developed. Furthermore, this control method is used to develop a giant magnetostrictive micro-displacement actuator (GMA) and its driving system. Two control methods, whose control variables are current intensity and magnetic flux density, are compared with each other through experimental studies. Finally, effective methods for improving the linearity and control precision of the micro-displacement actuator and for reducing the hysteresis, based on controlling the magnetic flux density, are obtained.

  14. A Novel Multiscale Physics Based Progressive Failure Methodology for Laminated Composite Structures

    Science.gov (United States)

    Pineda, Evan J.; Waas, Anthony M.; Bednarcyk, Brett A.; Collier, Craig S.; Yarrington, Phillip W.

    2008-01-01

    A variable fidelity, multiscale, physics based finite element procedure for predicting progressive damage and failure of laminated continuous fiber reinforced composites is introduced. At every integration point in a finite element model, progressive damage is accounted for at the lamina-level using thermodynamically based Schapery Theory. Separate failure criteria are applied at either the global-scale or the microscale in two different FEM models. A micromechanics model, the Generalized Method of Cells, is used to evaluate failure criteria at the micro-level. The stress-strain behavior and observed failure mechanisms are compared with experimental results for both models.

  15. An Event-Based Methodology to Generate Class Diagrams and its Empirical Evaluation

    Directory of Open Access Journals (Sweden)

    Sandeep K. Singh

    2010-01-01

    Full Text Available Problem statement: Event-based systems have importance in many application domains ranging from real-time monitoring systems in production, logistics, medical devices and networking to complex event processing in finance and security. The increasing popularity of event-based systems has opened new challenging issues for them. One such issue is to carry out requirements analysis of event-based systems and build conceptual models. Currently, Object Oriented Analysis (OOA) using the Unified Modeling Language (UML) is the most popular requirement analysis approach, for which several OOA tools and techniques have been proposed. But none of the techniques and tools, to the best of our knowledge, have focused on event-based requirements analysis; rather, all are behavior-based approaches. Approach: This study described a requirement analysis approach specifically for event-based systems. The proposed approach starts from events occurring in the system and derives an importable class diagram specification in XML Metadata Interchange (XMI) format for the ArgoUML tool. Requirements of the problem domain are captured as events in restricted natural language using the proposed Event Templates in order to reduce ambiguity. Results: Rules were designed to extract a domain model specification (analysis-level class diagram) from Event Templates. A prototype tool 'EV-ClassGEN' was also developed to provide automation support to extract events from requirements, document the extracted events in Event Templates and implement rules to derive a specification for an analysis-level class diagram. The proposed approach was also validated through a controlled experiment by applying it to many cases from different application domains like real-time systems, business applications and gaming. Conclusion: Results of the controlled experiment showed that after studying and applying the event-based approach, students' perception about ease of use and usefulness of the OOA technique has

  16. The TNO individual monitoring service based on TLD concepts and methodology

    International Nuclear Information System (INIS)

    A general description is given of the dosimetric concepts in today's Individual Monitoring on which the TNO thermoluminescence dosimetry (TLD) system is based. Some technical details of the TLD system itself are given. (H.W.). 10 refs.; 6 figs.; 1 tab

  17. Comparing econometric and survey-based methodologies in measuring offshoring: The Danish experience

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of...

  18. How to create a methodology of conceptual visualization based on experiential cognitive science and diagrammatology

    DEFF Research Database (Denmark)

    Toft, Birthe

    2013-01-01

    Based on the insights of experiential cognitive science and of diagrammatology as defined by Charles S. Peirce and Frederik Stjernfelt, this article analyses the iconic links connecting visualizations of Stjernfelt diagrams with human perception and action and starts to lay the theoretical ground...

  19. Methodology for Design of Web-Based Laparoscopy e-Training System

    Science.gov (United States)

    Borissova, Daniela; Mustakerov, Ivan

    2011-01-01

    The Web-based e-learning can benefit from the modern multimedia tools combined with network capabilities to overcome traditional education. The objective of this paper is focused on e-training system development to improve performance of theoretical knowledge and providing ample opportunity for practical attainment of manual skills in virtual…

  20. A Museum in a Book: Teaching Culture through Decolonizing, Arts-Based Methodologies

    Science.gov (United States)

    Chappell, Sharon Verner; Chappell, Drew

    2011-01-01

    This paper explores the positivist, museum-based, and touristic constructions of indigenous cultures in the Americas, as represented in the DK "Eyewitness" series, and then overturns these constructions using an artist book created by the authors. In our analysis of the nonfiction series, we identified three trajectories: cataloguing, consignment…

  1. Life History Methodologies: An Investigation into Work-Based Learning Experiences of Community Education Workers

    Science.gov (United States)

    Issler, Sally; Nixon, David

    2007-01-01

    This article focuses on an investigation into the learning journeys undertaken by managers of a community education project in an area of urban deprivation. A constructivist interpretation of life history narrative revealed the positive effects of community workers' heavy dependence on experiential work-based learning, which resulted in the…

  2. Partition-based Low Power DFT Methodology for System-on-chips

    Institute of Scientific and Technical Information of China (English)

    LI Yu-fei; CHEN Jian; FU Yu-zhuo

    2007-01-01

    This paper presents a partition-based Design-for-Test (DFT) technique to reduce the power consumption during scan-based testing. The method is based on partitioning the chip into several independent scan domains. By enabling the scan domains alternately, only a fraction of the entire chip is active at any one time, leading to low power consumption during test. Therefore, it significantly reduces the possibility of electromigration and overheating. In order to prevent a drop in fault coverage, wrappers on the boundaries between scan domains are employed. This paper also presents a detailed design flow based on Electronic Design Automation (EDA) tools from Synopsys to implement the proposed test structure. The proposed DFT method is experimented on a state-of-the-art System-on-Chip (SOC). The simulation results show a significant reduction in both average and peak power dissipation without sacrificing fault coverage and test time. This SOC has been taped out at TSMC and has finished the final test on ADVANTEST.

  3. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur;

    2011-01-01

    At the present pre-commercial phase of the wave energy sector, device developers are called to provide reliable estimates on power performance and production at possible deployment locations. The EU EquiMar project has proposed a novel approach, where the performance assessment is based mainly on...

  4. Comparison of Different Ground-Based NDVI Measurement Methodologies to Evaluate Crop Biophysical Properties

    Directory of Open Access Journals (Sweden)

    Rossana Monica Ferrara

    2010-06-01

    Full Text Available The usage of vegetation indices such as the Normalized Difference Vegetation Index (NDVI), calculated by means of remote sensing data, is widespread for describing vegetation status on a large spatial scale. However, a big limitation of these indices is their inadequate time resolution for agricultural purposes. This limitation could be overcome by ground-based vegetation indices, which could provide an interesting tool for integrating satellite-based values. In this work, three techniques to calculate the ground-based NDVI have been evaluated for sugar beet cultivated in southern Italy in all its phenological phases: NDVI1, based on hand-made reflectance measurements; NDVI2, calculated from automatic reflectance measurements; and the broadband NDVIb, based on Photosynthetically Active Radiation (PAR) and global radiation measurements. The best performance was obtained by the NDVIb. Moreover, crop-microclimate-NDVI relations were investigated. In particular, the relationship between NDVI and the Leaf Area Index (LAI) was found to be logarithmic, with a saturation of NDVI at LAI around 1.5 m2 m-2. A clear relation was found between NDVI and the crop coefficient Kc, experimentally determined by the ratio between actual and reference (measured or modelled) evapotranspiration, while the relation between NDVI and crop actual evapotranspiration was very weak and not usable for practical purposes. Lastly, no relationship between the microclimate and the NDVI was found.
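    The index definitions involved are compact enough to state directly. The sketch below gives the narrowband NDVI from red/NIR reflectances and one common way (assumed here, not necessarily the paper's exact processing chain) to form a broadband proxy from upward and downward PAR and global-radiation measurements; all numbers are illustrative.

    def ndvi(red_reflectance, nir_reflectance):
        return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

    def ndvi_broadband(par_in, par_out, glob_in, glob_out):
        rho_vis = par_out / par_in                             # visible-band albedo
        rho_nir = (glob_out - par_out) / (glob_in - par_in)    # residual taken as NIR albedo
        return (rho_nir - rho_vis) / (rho_nir + rho_vis)

    print(ndvi(0.05, 0.45))                              # dense canopy, ~0.8
    print(ndvi_broadband(400.0, 20.0, 900.0, 110.0))     # fluxes in W m-2, illustrative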

  5. Depletable resources and the economy.

    OpenAIRE

    Heijman, W. J. M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts in relation to resource depletion are discussed: steady state, time preference and efficiency. For the steady state, three variants are distinguished: the stationary state, the physical steady state and the state of steady growth. It is concluded that the so-call...

  6. Information Technology Service Management (ITSM) Implementation Methodology Based on Information Technology Infrastructure Library Ver.3 (ITIL V3)

    Directory of Open Access Journals (Sweden)

    Mostafa Mohamed AlShamy

    2012-06-01

    Full Text Available This paper covers the concept of the IT Infrastructure Library (ITIL v3) and how to implement it in order to increase the efficiency of an Egyptian IT corporation, to help its employees do their work more easily, and to let its clients feel the quality of the services provided to them. ITIL is now considered the de facto standard for IT Service Management (ITSM) in organizations that operate their business based on IT infrastructure and services. ITIL v3 has been implemented in Western organizations, but it is still a new framework for the Egyptian and Arab environment. The best proof of the lack of ITSM in the Arab region, and not in Egypt alone, is that companies holding ISO/IEC 20000 certification account for less than 2% of certified companies worldwide, and no company in Egypt holds it to date, as stated on the APMG ISO/IEC 20000 website [1]. Accordingly, this paper investigates an implementation methodology for ITIL in an Egyptian organization, taking into consideration the cultural factors and how they affect the success of the implementation. We have already applied this methodology in three Egyptian companies, where it succeeded in raising the level of process maturity from level one to level four according to the PMF.

  7. Final Report, Nuclear Energy Research Initiative (NERI) Project: An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model

    International Nuclear Information System (INIS)

    Final Report, NERI Project: ''An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model''. The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called ''group constants'') in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation, which can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations.
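
    The parameterization of group constants described above is, in practice, a multi-dimensional table lookup. The sketch below (illustrative numbers only, not the project's models) tabulates a two-group absorption cross section against burnup and boron concentration and interpolates it at local core conditions.

```python
import numpy as np

# Minimal sketch of group-constant parameterization: a lattice code tabulates a
# two-group absorption cross section on a grid of state variables; the core
# simulator interpolates at the local conditions. All values are hypothetical.

burnup_grid = np.array([0.0, 10.0, 20.0, 40.0])    # GWd/tU
boron_grid  = np.array([0.0, 600.0, 1200.0])        # ppm
# sigma_a[group, i_burnup, i_boron], illustrative numbers (1/cm)
sigma_a = np.array([
    [[0.0090, 0.0095, 0.0100],
     [0.0092, 0.0097, 0.0102],
     [0.0094, 0.0099, 0.0104],
     [0.0096, 0.0101, 0.0106]],
    [[0.0800, 0.0830, 0.0860],
     [0.0780, 0.0810, 0.0840],
     [0.0760, 0.0790, 0.0820],
     [0.0730, 0.0760, 0.0790]],
])

def lookup(group, burnup, boron):
    """Bilinear interpolation of the tabulated group constant."""
    col = np.array([np.interp(boron, boron_grid, sigma_a[group, i, :])
                    for i in range(len(burnup_grid))])
    return np.interp(burnup, burnup_grid, col)

print(f"thermal absorption at 15 GWd/tU, 900 ppm: {lookup(1, 15.0, 900.0):.5f} 1/cm")
```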

  8. Dielectric Resonator-Based Flow and Stopped-Flow EPR with Rapid Field Scanning: A Methodology for Increasing Kinetic Information

    Science.gov (United States)

    Sienkiewicz, Andrzej; Ferreira, Ana Maria da Costa; Danner, Birgit; Scholes, Charles P.

    1999-02-01

    We report methodology which combines recently developed dielectric resonator-based, rapid-mix, stopped-flow EPR (appropriate for small, aqueous, lossy samples) with rapid scanning of the external (Zeeman) magnetic field where the scanning is preprogrammed to occur at selected times after the start of flow. This methodology gave spectroscopic information complementary to that obtained by stopped-flow EPR at single fields, and with low reactant usage, it yielded more graphic insight into the time evolution of radical and spin-labeled species. We first used the ascorbyl radical as a test system where rapid scans triggered after flow was stopped provided "snapshots" of simultaneously evolving and interacting radical species. We monitored ascorbyl radical populations either as brought on by biologically damaging peroxynitrite oxidant or as chemically and kinetically interacting with a spectroscopically overlapping nitroxide radical. In a different biophysical application, where a spin-label lineshape reflected rapidly changing molecular dynamics of folding spin-labeled protein, rapid scan spectra were taken during flow with different flow rates and correspondingly different times after the mixing-induced inception of protein folding. This flow/rapid scan method is a means for monitoring early immobilization of the spin probe in the course of the folding process.

  9. Towards a complete propagation of uncertainties in depletion calculations

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.S. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering; Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Zwermann, W.; Gallner, L.; Puente-Espel, Federico; Velkov, K.; Hannstein, V. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Cabellos, O. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering

    2013-07-01

    Propagation of nuclear data uncertainties to calculated values is of interest for design purposes and for library evaluation. XSUSA, developed at GRS, propagates cross-section uncertainties through nuclear calculations. In depletion simulations, fission yields and decay data are also involved and are a possible source of uncertainty that should be taken into account. We have developed tools to generate varied fission-yield and decay libraries and to propagate uncertainties through depletion in order to complete the XSUSA uncertainty assessment capabilities. A generic test to probe the methodology is defined and discussed. (orig.)
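
    The sampling-based propagation described above can be pictured with a toy example: vary one fission yield within an assumed uncertainty, push each sample through a trivial one-nuclide depletion step, and look at the spread of the resulting inventory. The sketch below is illustrative only; it is not XSUSA, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def deplete_fission_product(yield_fraction, fission_rate, decay_const, days):
    """Analytic build-up of a single fission product with first-order decay."""
    t = days * 86400.0
    production = yield_fraction * fission_rate
    return production / decay_const * (1.0 - np.exp(-decay_const * t))

nominal_yield = 0.061        # hypothetical cumulative fission yield
rel_unc = 0.05               # assumed 5 % (1 sigma) relative uncertainty
samples = rng.normal(nominal_yield, rel_unc * nominal_yield, size=1000)

inventories = deplete_fission_product(samples, fission_rate=1.0e15,
                                      decay_const=2.9e-7, days=30.0)
print(f"mean inventory : {inventories.mean():.3e} atoms")
print(f"rel. std. dev. : {100 * inventories.std() / inventories.mean():.1f} %")
```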

  10. An optimization-based integrated controls-structures design methodology for flexible space structures

    Science.gov (United States)

    Maghami, Peiman G.; Joshi, Suresh M.; Armstrong, Ernest S.

    1993-01-01

    An approach for an optimization-based integrated controls-structures design is presented for a class of flexible spacecraft that require fine attitude pointing and vibration suppression. The integrated design problem is posed in the form of simultaneous optimization of both structural and control design variables. The approach is demonstrated by application to the integrated design of a generic space platform and to a model of a ground-based flexible structure. The numerical results obtained indicate that the integrated design approach can yield spacecraft designs that have substantially superior performance over a conventional design wherein the structural and control designs are performed sequentially. For example, a 40-percent reduction in the pointing error is observed along with a slight reduction in mass, or an almost twofold increase in the controlled performance is indicated with more than a 5-percent reduction in the overall mass of the spacecraft (a reduction of hundreds of kilograms).
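
    A toy numerical illustration of the integrated idea (not the paper's spacecraft model): sweep a structural stiffness parameter and a control gain together, keep only designs that meet a pointing requirement, and select the cheapest combined design. All functions and numbers below are hypothetical.

```python
import numpy as np

# Why integrated design can beat a sequential approach: structure (k) and
# controller (g) are traded off against each other in one search.

k_vals = np.linspace(1.0, 5.0, 41)        # structural stiffness (arbitrary units)
g_vals = np.linspace(0.1, 5.0, 50)        # control gain

def mass(k):              return 100.0 + 20.0 * k       # stiffer -> heavier
def pointing_error(k, g): return 1.0 / (k + 2.0 * g)    # both help pointing
def control_power(g):     return 3.0 * g ** 2            # gain costs power

best = None
for k in k_vals:
    for g in g_vals:
        if pointing_error(k, g) <= 0.15:                 # pointing requirement
            cost = control_power(g) + 0.05 * mass(k)     # combined objective
            if best is None or cost < best[0]:
                best = (cost, k, g)

cost, k, g = best
print(f"integrated design: k={k:.2f}, g={g:.2f}, "
      f"mass={mass(k):.1f}, control power={control_power(g):.2f}")
```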

  11. Experimental validation of optimization-based integrated controls-structures design methodology for flexible space structures

    Science.gov (United States)

    Maghami, Peiman G.; Gupta, Sandeep; Joshi, Suresh M.; Walz, Joseph E.

    1993-01-01

    An optimization-based integrated design approach for flexible space structures is experimentally validated using three types of dissipative controllers: static, dynamic, and LQG. The nominal phase-0 controls-structures interaction evolutionary model (CEM) structure is redesigned to minimize the average control power required to maintain a specified root-mean-square line-of-sight pointing error under persistent disturbances. The redesigned structure, the phase-1 CEM, was assembled and tested against the phase-0 CEM. It is demonstrated, both analytically and experimentally, that integrated controls-structures design is substantially superior to designs obtained through the traditional sequential approach. The capability of a software design tool based on an automated design procedure in a unified environment for structural and control design is also demonstrated.

  12. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    The composite industry is looking toward new material systems (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available on the market with a variety of properties suitable for different engineering applications, and a few of those are available in a not-yet-polymerised form suitable for resin infusion. The proper selection of a new resin system among these thermoplastic polymers is a concern for manufacturers in the current scenario, and a special mathematical tool would be beneficial. In this paper, the authors introduce a new decision-making tool for resin selection based on significant attributes, sketched in outline below. The article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today’s market. An illustrative example—resin selection...
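
    A minimal sketch of what such an attribute-based selection tool might look like, using a simple weighted-scoring model; the attributes, weights, and candidate resins below are hypothetical, and the paper's actual decision-making method may differ.

```python
# Simple additive weighted scoring for resin selection (illustrative only).
# Attribute values are assumed to be normalized to [0, 1], higher = better.

candidates = {
    "resin A": {"viscosity": 0.9, "toughness": 0.6, "process_temp": 0.7, "cost": 0.5},
    "resin B": {"viscosity": 0.6, "toughness": 0.9, "process_temp": 0.5, "cost": 0.7},
    "resin C": {"viscosity": 0.8, "toughness": 0.7, "process_temp": 0.8, "cost": 0.6},
}
weights = {"viscosity": 0.35, "toughness": 0.30, "process_temp": 0.20, "cost": 0.15}

def score(attrs):
    """Weighted sum of normalized attribute scores."""
    return sum(weights[a] * v for a, v in attrs.items())

ranking = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, attrs in ranking:
    print(f"{name}: {score(attrs):.3f}")
```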

  13. A Trajectory-Oriented, Carriageway-Based Road Network Data Model, Part 2: Methodology

    Institute of Scientific and Technical Information of China (English)

    LI Xiang; LIN Hui

    2006-01-01

    This is the second in a three-part series of papers presenting the principle and architecture of the CRNM, a trajectory-oriented, carriageway-based road network data model. The first part of the series introduced the general background of building trajectory-oriented road network data models, including motivation, related work, and basic concepts. Building on it, this paper describes the CRNM in detail. First, the notion of a basic roadway entity is proposed and discussed. Second, the carriageway is selected as the basic roadway entity after comparison with other kinds of roadways, and approaches to representing other roadways with carriageways are introduced. Finally, an overall architecture of the CRNM is proposed.

  14. Selective Chemical Labeling of Proteins with Small Fluorescent Molecules Based on Metal-Chelation Methodology

    OpenAIRE

    Nobuaki Soh

    2008-01-01

    Site-specific chemical labeling utilizing small fluorescent molecules is a powerful and attractive technique for in vivo and in vitro analysis of cellular proteins, which can circumvent some problems of genetically encoded labeling with large fluorescent proteins. In particular, affinity labeling based on metal chelation, advantageous due to its high selectivity/simplicity and small tag size, is promising, as is enzymatic covalent labeling, and thereby a variety of novel methods have been stu...

  15. A Methodological Framework for Project-based Collaborative Learning in a Networked Environment

    OpenAIRE

    Daradoumis, Thanasis; Xhafa, Fatos; Marquès, Joan Manuel

    2002-01-01

    In this work we present a new way to use existing technology in a real learning context. We investigate and propose a scenario based on collaborative work of virtual groups of students developing a software project. Our intent is to promote learning through collaborative implementation of a project in shared workspaces. This raises several important issues that concern virtual group formation techniques, group regulation and role-playing by students. Other important aspects are related to the...

  16. Methodology and theory evaluation of overall equipment effectiveness based on market

    OpenAIRE

    Anvari, Farhad; EDWARDS, Rodger; Starr, Andrew G.

    2010-01-01

    Purpose - Continuous manufacturing systems used within the steel industry involve different machines and processes that are arranged in a sequence of operations in order to manufacture the products. The steel industry is generally a capital-intensive industry and, because of the high capital investment, utilising equipment as effectively as possible is a high priority. This paper seeks to illustrate a new method, market-based overall equipment effectiveness (OEE-MB), for the precise calc...
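
    For context, the sketch below computes the conventional OEE factors (availability, performance, quality) from hypothetical plant figures; the market-based variant (OEE-MB) proposed in the paper adds market considerations not reproduced here.

```python
# Standard OEE building blocks with made-up figures for a continuous line.

planned_time_h  = 480.0
downtime_h      = 60.0
ideal_rate_tph  = 100.0      # tonnes per hour at ideal speed
actual_output_t = 35_000.0
defective_t     = 700.0

availability = (planned_time_h - downtime_h) / planned_time_h
performance  = actual_output_t / ((planned_time_h - downtime_h) * ideal_rate_tph)
quality      = (actual_output_t - defective_t) / actual_output_t

oee = availability * performance * quality
print(f"availability={availability:.2%} performance={performance:.2%} "
      f"quality={quality:.2%} OEE={oee:.2%}")
```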

  17. Rapid Stereology Based Quantitative Immunohistochemistry of Dendritic Cells in Lymph Nodes: A Methodological Study

    OpenAIRE

    van Hensbergen, Yvette; Luykx‐de Bakker, Sylvia A.; Heideman, Daniëlle A.M.; Meijer, Gerrit A.; Pinedo, Herbert M.; Paul J. van Diest

    2001-01-01

    This study was done to arrive at a fast and reliable protocol for assessment of fractional volumes of immunohistochemically stained dendritic cells in lymph nodes. Twenty axillary lymph nodes of patients with locally advanced breast cancer were immuno‐histochemically stained with an S100 antibody. Fractional volumes of dendritic cells were assessed by stereology based quantitative immunohistochemistry using an interactive video overlay system including an automated microscope. The gold standa...

  18. CONTRIBUTIONS REFERRING TO A TERRITORIAL MODEL FOR DEVELOPMENT OF THE ECONOMY BASED ON KNOWLEDGE (METHODOLOGICAL FRAMEWORK)

    OpenAIRE

    Mihail Dumitrescu; Lavinia Ţoţan

    2007-01-01

    This paper presents a short evolution of the concepts leading up to the knowledge economy. The philosophy of the research is outlined, comprising the objectives of the project, the changes in the preparation levels of human resources, and the evolution of informational processes. The model also contains the determination of indicators characterizing the knowledge economy at the territorial level, together with the computing relations, as well as their evaluation on statistical bases.

  19. System Error Compensation Methodology Based on a Neural Network for a Micromachined Inertial Measurement Unit

    OpenAIRE

    Shi Qiang Liu; Rong Zhu

    2016-01-01

    Error compensation of micromachined inertial measurement units (MIMU) is essential in practical applications. This paper presents a new compensation method using neural-network-based identification for MIMU, which effectively solves the universal problems of cross-coupling, misalignment, eccentricity, and other deterministic errors existing in a three-dimensional integrated system. By using a neural network to model a complex multivariate and nonlinear coupling system, the errors could be readily...
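
    A toy version of the idea (not the paper's network or data): a small neural network is trained to map raw tri-axial readings, corrupted by cross-coupling, misalignment and bias, back to reference values. Everything below, including the error model, is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data: reference angular rates and "raw" sensor outputs
# corrupted by a cross-coupling matrix, a constant bias, and noise.
n = 2000
true_rate = rng.uniform(-1.0, 1.0, size=(n, 3))               # reference (rad/s)
coupling = np.array([[1.00, 0.03, -0.02],
                     [0.02, 0.98,  0.04],
                     [-0.01, 0.05, 1.02]])
raw = true_rate @ coupling.T + 0.01 + 0.002 * rng.standard_normal((n, 3))

# One hidden layer, trained with plain gradient descent on mean-squared error.
W1 = 0.1 * rng.standard_normal((3, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, 3)); b2 = np.zeros(3)

for _ in range(5000):
    h = np.tanh(raw @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - true_rate
    gW2 = h.T @ err / n;   gb2 = err.mean(axis=0)
    gh = err @ W2.T * (1 - h ** 2)
    gW1 = raw.T @ gh / n;  gb1 = gh.mean(axis=0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= 0.2 * g                                   # in-place update

rmse = np.sqrt(np.mean((np.tanh(raw @ W1 + b1) @ W2 + b2 - true_rate) ** 2))
print(f"compensated RMSE: {rmse:.4f} rad/s")
```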

  20. ELABORATION OF METHODOLOGICAL TOOLS FOR AGRICULTURAL RISK MANAGEMENT BASED ON INNOVATION

    OpenAIRE

    Voroshilova I. V.; Piterskaya L. Y.; Babkina E. N.

    2015-01-01

    The article deals with the possibility of expanding the set of agricultural risk management tools based on commodity financial instruments and weather derivatives. By summarizing the research results of domestic and foreign scholars and creatively interpreting them, the authors supplement and refine the definitions of the categories "risk" and "risk of agricultural production". The article also supplements the classification of risk in agricultural production and circulatio...

  1. Knowledge search for new product development: a multi-agent based methodology

    OpenAIRE

    Jian, Guo

    2011-01-01

    Manufacturers are the leaders in developing new products to drive productivity. Higher productivity means more products based on the same materials, energy, labour, and capital. New product development plays a critical role in the success of manufacturing firms. Activities in the product development process depend on the knowledge of new product development team members. Increasingly, many enterprises consider effective knowledge search to be a source of competitive advantage. Th...

  2. The Narrative-Based Enhancement Methodology for the Cultural Property and the Cultural Institution

    OpenAIRE

    Kastytis Rudokas

    2013-01-01

    The paper addresses the contemporary conception of place marketing, image, and the impact of multiple identities on the cultural institution of a city. The first part of the paper is based on Clare A. Gunn's well-known theory of two possible perceptions of the postmodern place image. The author points out that the cultural value of an object is conditional and depends on communicational strategies and community needs. As an example of identity introduction to a place, the...

  3. A GIS-based methodology for safe site selection of a building in a hilly region

    OpenAIRE

    Satish Kumar; Bansal, V. K.

    2016-01-01

    The importance of worker safety during construction is widely accepted, but the selection of safe sites for a building is generally not considered. Safe site selection (SSS) largely depends upon compiling, analyzing, and refining the information about the area where a building is likely to be located. The locational and topographical aspects of areas located in hilly regions play a major role in SSS, but are generally neglected in the traditional and CAD-based systems used for site selection. Architects and engineer...

  4. Combining Axiomatic Design and Case-Based Reasoning in a Design Methodology of Mechatronics Products

    OpenAIRE

    Janthong, Nattawut; BRISSAUD Daniel; Butdee, Suthep

    2009-01-01

    Current market environments are volatile and unpredictable. The ability of designed products to meet customers' requirements has become critical to success. The key element in developing such products is identifying functional requirements and utilizing knowledge, based on a scientific approach, so as to provide both designers of new products and redesigners of existing products with a suitable solution that meets customers' needs. This paper presents a method to (re)design mechatronic products by c...

  5. A Methodology Integrating Petri Nets and Knowledge-based Systems to Support Process Family Planning

    OpenAIRE

    Zhang, Linda; Xu, Qianli; Helo, Petri

    2011-01-01

    Planning production processes for product families has been well recognized as an effective means of achieving successful product family development. However, most existing approaches do not lend themselves to planning production processes with a focus on the optimality of the cohort of a product family. This paper addresses process family planning for product families. In view of the advantages of Petri nets (PNs) for modeling large systems, the potential of knowledge-based...

  6. A systematic literature review of methodology used to measure effectiveness in digital game-based learning

    OpenAIRE

    All, Anissa; Nunez Castellar, Elena Patricia; Van Looy, Jan

    2013-01-01

    In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, however, it remains difficult to draw general conclusions due to disparities in methods and reporting. Guidelines or a standardized procedure for conducting DGBL effectiveness research would make it possible to compare results across studies and provide well-founded and more generalizable evidence for the impact of DGBL. This study presents a fi...

  7. Low-complexity performance evaluation methodologies for OFDMA-based packet-switched wireless networks

    OpenAIRE

    Fernekeß, Andreas

    2010-01-01

    Cellular wireless networks have to be accurately planned to provide Quality of Service (QoS) to the user and to generate revenue for the operator. Therefore, estimates of key performance indicators (KPIs), depending on parameters such as the user scheduling, the data traffic, and the data traffic load, are necessary for planning cellular wireless networks. Today’s cellular wireless networks are based on orthogonal frequency division multiple access (OFDMA) and a packet-switched network architectu...

  8. A Methodological Review of Piezoelectric Based Acoustic Wave Generation and Detection Techniques for Structural Health Monitoring

    OpenAIRE

    Zhigang Sun; Bruno Rocha; Kuo-Ting Wu; Nezih Mrad

    2013-01-01

    Piezoelectric transducers have a long history of application in nondestructive evaluation of material and structural integrity owing to their ability to transform mechanical energy into electrical energy and vice versa. As condition-based maintenance has emerged as a valuable approach to enhancing continued aircraft airworthiness while reducing the life cycle cost, its enabling structural health monitoring (SHM) technologies capable of providing on-demand diagnosis of the structure without i...

  9. Tracking Environmental Compliance and Remediation Trajectories Using Image-Based Anomaly Detection Methodologies

    Directory of Open Access Journals (Sweden)

    James K. Lein

    2011-11-01

    Full Text Available Recent interest in the use of satellite remote sensing for environmental compliance and remediation assessment has been heightened by growing policy requirements and the need to provide more rapid and efficient monitoring and enforcement mechanisms. However, remote sensing solutions are attractive only to the extent that they can deliver environmentally relevant information in a meaningful and time-sensitive manner. Unfortunately, the extent to which satellite-based remote sensing satisfies the demands of compliance and remediation assessment under the conditions of an actual environmental accident or calamity has not been well documented. In this study, a remote sensing solution to the problem of site remediation and environmental compliance assessment was introduced, based on the use of the RDX anomaly detection algorithm and vegetation indices derived from the Tasseled Cap Transform. The results of this analysis illustrate how the use of standard vegetation transforms, integrated into an anomaly detection strategy, enables the time-sequenced tracking of site remediation progress. Based on these results, credible evidence can be produced to support compliance evaluation and remediation assessment following major environmental disasters.
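
    As a generic illustration of image-based anomaly detection on vegetation-index features, the sketch below scores pixels by Mahalanobis distance from the scene background (an RX-style detector). It is shown for context only and is not necessarily the RDX algorithm used in the study; the features and data are synthetic.

```python
import numpy as np

def anomaly_scores(pixels):
    """pixels: (n_pixels, n_features) array of index values for one scene.
    Returns the squared Mahalanobis distance of each pixel from the scene mean."""
    mu = pixels.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(pixels, rowvar=False))
    diff = pixels - mu
    return np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

rng = np.random.default_rng(2)
# Synthetic background and a small disturbed (e.g. cleared or contaminated) patch.
background = rng.multivariate_normal([0.4, 0.2, 0.1], np.diag([0.010, 0.005, 0.004]), 500)
disturbed  = rng.multivariate_normal([0.1, 0.4, 0.0], np.diag([0.010, 0.005, 0.004]), 10)
scene = np.vstack([background, disturbed])

scores = anomaly_scores(scene)
threshold = np.percentile(scores, 97.5)
print(f"flagged pixels: {np.sum(scores > threshold)} of {len(scene)}")
```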

  10. A multi-attribute based methodology for vehicle detection and identification

    Science.gov (United States)

    Elangovan, Vinayak; Alsaidi, Bashir; Shirkhodaie, Amir

    2013-05-01

    Robust vehicle detection and identification is required for intelligent persistent surveillance systems. In this paper, we present a Multi-attribute Vehicle Detection and Identification (MVDI) technique for the detection and classification of stationary vehicles. The proposed model uses a supervised Hamming Neural Network (HNN) for taxonomy of vehicle shape. Vehicle silhouette features are employed for training the HNN from a large array of training vehicle samples of different types, scales, and color variations. Invariant vehicle silhouette attributes are used as features for training the HNN, which uses an internal Hamming distance and shape features to determine the degree of similarity of a test vehicle to those it was selectively trained with. Upon detection of the class of the vehicle, other vehicle attributes such as color and orientation are determined. For vehicle color detection, provincial regions of the vehicle body are used for matching the color of the vehicle. For vehicle orientation detection, the key structural features of the vehicle are extracted and subjected to classification based on color tone, geometrical shape, and tire-region detection. The experimental results show that the technique is promising and robust for the detection and identification of vehicles based on their multi-attribute features. Furthermore, this paper demonstrates the importance of vehicle attribute detection for the identification of Human-Vehicle Interaction events.
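
    The core of the classification step, nearest-template matching by Hamming distance, can be sketched as below with made-up 12-bit silhouette feature vectors; the paper's Hamming Neural Network adds training and thresholding not reproduced here.

```python
import numpy as np

# Minimal Hamming-distance classifier on binary silhouette features.
# Templates and the test vector are hypothetical.

templates = {
    "sedan":  np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]),
    "pickup": np.array([1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0]),
    "van":    np.array([0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]),
}

def classify(features):
    """Return the class whose template has the smallest Hamming distance."""
    distances = {name: int(np.sum(features != t)) for name, t in templates.items()}
    return min(distances, key=distances.get), distances

test = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0])   # hypothetical silhouette
label, dists = classify(test)
print(label, dists)
```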

  11. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  12. Scenario development methodologies

    International Nuclear Information System (INIS)

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  13. Putting Order into Our Universe: The Concept of Blended Learning—A Methodology within the Concept-based Terminology Framework

    Directory of Open Access Journals (Sweden)

    Joana Fernandes

    2016-06-01

    Full Text Available This paper aims at discussing the advantages of a methodology design grounded in a concept-based approach to Terminology, applied to the most prominent scenario of current Higher Education: blended learning. Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting order into our universe (Nuopponen, 2011). Concepts, as elements of the structure of knowledge (Sager, 1990), emerge as a complex research object. Can they be found in language? A concept-based approach to Terminology implies a clear-cut view of the role of language in terminological work: though language is postulated as a fundamental tool to grasp, describe and organize knowledge, an isomorphic relationship between language and knowledge cannot be taken for granted. In other words, the foundational premise of a concept-based approach is that there is no one-to-one correspondence between atomic elements of knowledge and atomic elements of linguistic expression. This is why a methodological approach to Terminology based merely upon specialized text research is regarded as biased (Costa, 2013). As a consequence, we argue that interactional strategies between terminologist and domain expert deserve particular research attention. To our mind, the key to concept-based terminological work is to carry out a concept analysis of data gathered from specialised text corpora, combined with an elicitation process of tacit knowledge and concept-oriented discursive negotiation. Following this view, we put forward a methodology to answer the question: how is blended learning defined in the post-Bologna scenario? Even though there are numerous high-quality models and practical descriptions of its implementation (as for other concepts related to distance learning), the need to understand, demarcate and harmonize the concept of blended learning against the current Higher Education background results from the premise that

  14. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP workforce was used. Knowledge of the composition of this reference population is based on sex, age, educational level, marital status and occupation, with the aim of understanding the relation between health-aggravating factors and these variables. The methodology is that of non-experimental research grounded in theoretical-methodological practice. The work has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  15. Strain-Based Design Methodology of Large Diameter Grade X80 Linepipe

    Energy Technology Data Exchange (ETDEWEB)

    Lower, Mark D. [ORNL

    2014-04-01

    Continuous growth in energy demand is driving oil and natural gas production to areas that are often located far from major markets and where the terrain is prone to earthquakes, landslides, and other types of ground motion. Transmission pipelines that cross this type of terrain can experience large longitudinal strains and plastic circumferential elongation as the pipeline undergoes alignment changes resulting from differential ground movement. Such displacements can potentially impact pipeline safety by adversely affecting the structural capacity and leak-tight integrity of the linepipe steel. Planning for new long-distance transmission pipelines usually involves consideration of higher-strength linepipe steels because their use allows pipeline operators to reduce the overall cost of pipeline construction and to increase pipeline throughput by raising the operating pressure. The design trend for new pipelines in areas prone to ground movement has evolved over the last 10 years from a stress-based design approach to a strain-based design (SBD) approach, to further realize the cost benefits of using higher-strength linepipe steels. This report presents an overview of SBD for pipelines subjected to large longitudinal strain and high internal pressure, with emphasis on the tensile strain capacity of high-strength microalloyed linepipe steel. The technical basis for this report involved engineering analysis and examination of the mechanical behavior of Grade X80 linepipe steel in both the longitudinal and circumferential directions. Testing was conducted to assess the effects of material processing, including as-rolled, expanded, and heat-treated conditions intended to simulate coating application. Elastic-plastic and low-cycle fatigue analyses were also performed at varying internal pressures. The proposed SBD models discussed in this report are based on classical plasticity theory and account for material anisotropy, triaxial strain, and microstructural damage effects.

  16. Adaptive Real Time Data Mining Methodology for Wireless Body Area Network Based Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Dipti Durgesh Patil

    2012-08-01

    Full Text Available Since the population is growing, the need for high-quality and efficient healthcare, both at home and in hospital, is becoming more important. This paper presents the innovative wireless sensor network based Mobile Real-time Healthcare Monitoring (WMRHM) framework, which has the capacity to give health predictions online based on continuously monitored real-time vital body signals. Developments in sensors, the miniaturization of low-power microelectronics, and wireless networks are becoming a significant opportunity for improving the quality of healthcare services. Physiological signals like ECG, EEG, SpO2, and BP can be monitored through wireless sensor networks and analyzed with the help of data mining techniques. These real-time signals are continuous in nature and change abruptly, hence there is a need to apply efficient, concept-adapting, real-time data stream mining techniques for taking intelligent healthcare decisions online. Because of the high speed and huge volume of data in data streams, traditional classification technologies are no longer applicable. The most important criterion is to solve the real-time data stream mining problem with 'concept drift' efficiently. This paper presents the state of the art in this field with growing vitality, introduces methods for detecting concept drift in data streams, and then gives a significant summary of existing approaches to the problem of concept drift. The work is focused on applying these real-time stream mining algorithms to vital signals of the human body in a Wireless Body Area Network (WBAN) based healthcare environment.
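
    One simple way to flag concept drift in a stream, monitoring the recent error rate against the best rate seen so far, is sketched below. It is a generic illustration with made-up parameters, not one of the specific detection methods surveyed in the paper.

```python
from collections import deque

# Sliding-window drift signal: flag drift when the recent error rate rises well
# above the best (lowest) rate observed so far. Window size and factor are assumed.

class DriftMonitor:
    def __init__(self, window=200, factor=2.0):
        self.errors = deque(maxlen=window)
        self.best_rate = None
        self.factor = factor

    def update(self, correct):
        """Feed one prediction outcome; return True if drift is suspected."""
        self.errors.append(0 if correct else 1)
        if len(self.errors) < self.errors.maxlen:
            return False                     # wait until the window is full
        rate = sum(self.errors) / len(self.errors)
        if self.best_rate is None or rate < self.best_rate:
            self.best_rate = rate
        return rate > self.factor * max(self.best_rate, 0.01)

# Usage: feed the monitor after each streamed vital-sign classification.
monitor = DriftMonitor()
for correct in [True] * 400 + [False, True] * 300:   # error rate jumps to ~50 %
    if monitor.update(correct):
        print("concept drift suspected; retrain or adapt the model")
        break
```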

  17. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    OpenAIRE

    Zaretsky V. E.; Hendricks C. R.; Soditus S.

    2003-01-01

    The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The...
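
    The role of the Weibull slope can be illustrated with a small series-system calculation: the engine survives only if every component survives, so component Weibull reliabilities multiply. The parameters below are hypothetical rotating-part values, not E3-Engine data.

```python
import numpy as np

def weibull_reliability(t, beta, eta):
    """R(t) = exp[-(t/eta)^beta] for shape (slope) beta and characteristic life eta."""
    return np.exp(-(t / eta) ** beta)

components = [  # (beta, eta in hours) - hypothetical rotating components
    (2.5, 30_000.0),   # fan disk
    (3.0, 25_000.0),   # compressor disk
    (1.8, 40_000.0),   # turbine disk
]

t = 10_000.0
system_R = np.prod([weibull_reliability(t, b, e) for b, e in components])
print(f"system reliability at {t:.0f} h: {system_R:.3f}")

# Approximate life at 90 % system reliability (an 'L10'-type life), by simple search.
times = np.linspace(100.0, 30_000.0, 3000)
Rs = np.prod([weibull_reliability(times, b, e) for b, e in components], axis=0)
print(f"approx. L10 life: {times[np.argmax(Rs < 0.9)]:.0f} h")
```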

  18. Values and attitudes of entrepreneurs in Sao Tome and Principe: A study based on Q Methodology

    OpenAIRE

    Gil, Fabiola; Dentinho, Tomaz

    2010-01-01

    São Tomé e Príncipe is a small insular country located in the Gulf of Guinea, populated by 160,000 persons living in 1,000 square kilometers of a rainforest environment. GDP per capita is 1,200 US$ per year, 50% of the economy is based on auto-consumption, and the main engine of the remaining economic system is external public debt, rents from promising oil, and donations (50%), complemented by tourism, cacao exports, private transfers, and other exports. To understand the role ...

  19. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    OpenAIRE

    José R. Casar; Josué Iglesias; Bernardos, Ana M.; Iván Corredor

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diver...

  20. Development of a testing methodology for computerized procedure system based on JUnit framework and MFM

    International Nuclear Information System (INIS)

    Paper-Based Procedures (PBP) and Computerized Procedure Systems (CPS) are studied to demonstrate that it is necessary to develop a CPS for Nuclear Power Plant (NPP) Instrumentation and Control (I and C) systems. A computerized procedure system is in fact a software system. All the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality. In order to achieve product quality, the process quality of a software system should also be considered and achieved. The characteristics of a CPS are described in order to analyse the product and process of an example CPS, ImPRO. At the same time, several main product and process issues are analysed from a Verification and Validation (V and V) point of view. It is concluded and suggested that V and V activities can also be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to its testing. To support and realize this approach, suitable testing technologies and testing strategies are suggested based on the JUnit framework and Multi-level Flow Modeling (MFM).