WorldWideScience

Sample records for based depletion methodology

  1. VERA Core Simulator Methodology for PWR Cycle Depletion

    Energy Technology Data Exchange (ETDEWEB)

    Kochunas, Brendan [University of Michigan]; Collins, Benjamin S [ORNL]; Jabaay, Daniel [University of Michigan]; Kim, Kang Seog [ORNL]; Graham, Aaron [University of Michigan]; Stimpson, Shane [University of Michigan]; Wieselquist, William A [ORNL]; Clarno, Kevin T [ORNL]; Palmtag, Scott [Core Physics, Inc.]; Downar, Thomas [University of Michigan]; Gehin, Jess C [ORNL]

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  2. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kiwhan [Los Alamos National Laboratory]; Beddingfield, David H. [Los Alamos National Laboratory]; Geist, William H. [Los Alamos National Laboratory]; Lee, Sang-Yoon [unaffiliated]

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. Conclusions of this presentation are: (1) A fuel depletion methodology was established and its safeguards application demonstrated; (2) The fuel is proliferation resistant at high discharge burnup (~80 GWd/MtHM): unfavorable isotopics, a high number of pebbles needed, and pebbles that are harder to reprocess; (3) Spent fuel should remain under safeguards comparable to those for LWR fuel; and (4) Diversion scenarios were not considered, but such analysis can be performed.

  3. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

    Full Text Available Outsourcing is investigated from the standpoint of finding stable and unique competitive advantages for a public corporation by attracting carriers of unique intellectual and social capital from specialized companies, within the framework of institutional theory. Key researchers and events in the history of outsourcing are identified, and the existing approaches to defining the concept of outsourcing, as well as the advantages and risks of applying outsourcing technology, are considered. It is established that the differences among outsourcing, subcontracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is treated as part of the methodology of cooperation among enterprise innovative structures of the emerging knowledge-economy sector.

  4. Adding trend data to Depletion-Based Stock Reduction Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A Bayesian model of Depletion-Based Stock Reduction Analysis (DB-SRA), informed by a time series of abundance indexes, was developed, using the Sampling Importance...

  5. Depletion methodology in the 3-D whole core transport code DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, Jin Young; Zee, Sung Quun

    2005-02-01

    The three-dimensional whole-core transport code DeCART has been developed to include the characteristics of a numerical reactor that can partly replace experiments. The code adopts a deterministic method to simulate neutron behavior with minimal assumptions and approximations. The neutronics code is also coupled with a thermal-hydraulic (CFD) code and a thermo-mechanical code to simulate combined effects. A depletion module has been implemented in DeCART to predict the depleted composition of the fuel. The exponential matrix method of ORIGEN-2 is used for the depletion calculation. The associated library, including decay constants, yield matrices and other data, is used in a greatly simplified form for computational efficiency. This report summarizes the theoretical background and includes the verification of the depletion module in DeCART by performing benchmark calculations.
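
    The exponential matrix method mentioned above solves the Bateman equations dN/dt = A N as N(t) = exp(A t) N(0). The sketch below only illustrates that idea; the three-nuclide chain, decay constants, cross sections and flux are invented for illustration and are not DeCART/ORIGEN-2 data.

```python
# Minimal sketch of depletion via the exponential matrix method.
# Illustrative values only, not DeCART/ORIGEN-2 data.
import numpy as np
from scipy.linalg import expm

SECONDS_PER_DAY = 86400.0
phi = 3.0e14                                   # assumed neutron flux [n/cm^2/s]
sigma_c = np.array([1.0e-24, 5.0e-24, 0.0])    # capture cross sections [cm^2]
lam = np.array([0.0, 1.0e-6, 2.0e-7])          # decay constants [1/s]

# Burnup matrix: nuclide 0 -> 1 by capture, nuclide 1 -> 2 by decay,
# nuclide 2 only decays (losses on the diagonal, gains off-diagonal).
A = np.zeros((3, 3))
A[0, 0] = -(lam[0] + sigma_c[0] * phi)
A[1, 0] = sigma_c[0] * phi
A[1, 1] = -(lam[1] + sigma_c[1] * phi)
A[2, 1] = lam[1]
A[2, 2] = -lam[2]

N0 = np.array([1.0e22, 0.0, 0.0])              # initial number densities [1/cm^3]
N_30d = expm(A * 30 * SECONDS_PER_DAY) @ N0    # composition after 30 days
print(N_30d)
```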

  6. A robust TEC depletion detector algorithm for satellite based navigation in Indian zone and depletion analysis for GAGAN

    Science.gov (United States)

    Dashora, Nirvikar

    2012-07-01

    Equatorial plasma bubbles (EPBs) and the associated plasma irregularities are known to cause severe scintillation of satellite signals and to produce range errors, which eventually result either in loss of lock of the signal or in random fluctuations in TEC, affecting precise positioning and navigation solutions. EPBs manifest as sudden reductions in line-of-sight TEC, more often called TEC depletions, which spread over thousands of km in the meridional direction and a few hundred km in the zonal direction. They change shape and size while drifting from one longitude to another in the nighttime ionosphere. For a satellite based navigation system like GAGAN in India, which depends upon (i) multiple satellites (i.e. GPS), (ii) multiple ground reference stations and (iii) near real-time data processing, such EPBs are of grave concern. A TEC model generally provides near real-time grid ionospheric vertical errors (GIVEs) over hypothetical 5x5 degree latitude-longitude grid points. But on a night when a TEC depletion occurs in a given longitude sector, it is almost impossible for any system to forecast the GIVEs. If loss-of-lock events occur due to scintillation, there is no way to improve the situation. But when large and random depletions in TEC occur with scintillation and without loss of lock, low-latitude TEC is affected in two ways: (a) multiple satellites show depleted TEC that may differ greatly from model-TEC values, so the GIVE would be incorrect over various grid points; (b) the user may be affected by depletions that are not sampled by the reference stations, so the interpolated GIVE within one grid square would be grossly erroneous. The most general solution (and by far the most difficult) is advance knowledge of the spatio-temporal occurrence and precise magnitude of such depletions. While forecasting TEC depletions in the spatio-temporal domain is a scientific challenge (as we show below), operational systems
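
    A minimal sketch of how a depletion event might be flagged in a slant-TEC time series is given below; the window length and threshold are arbitrary illustrative choices, not the detector settings developed for GAGAN.

```python
# Illustrative TEC depletion flagging: mark epochs where slant TEC drops more
# than a threshold below a running median of the preceding window.
# Window length and threshold are placeholder values, not GAGAN parameters.
import numpy as np

def flag_depletions(tec, window=30, threshold=5.0):
    """Return a boolean array marking suspected depletion epochs (TEC units)."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(tec.size, dtype=bool)
    for i in range(window, tec.size):
        baseline = np.median(tec[i - window:i])      # recent background TEC
        flags[i] = (baseline - tec[i]) > threshold   # sudden drop => depletion
    return flags
```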

  7. Evaluation of acute tryptophan depletion and sham depletion with a gelatin-based collagen peptide protein mixture

    DEFF Research Database (Denmark)

    Stenbæk, D S; Einarsdottir, H S; Goregliad-Fjaellingsdal, T

    2016-01-01

    Acute Tryptophan Depletion (ATD) is a dietary method used to modulate central 5-HT to study the effects of temporarily reduced 5-HT synthesis. The aim of this study is to evaluate a novel method of ATD using a gelatin-based collagen peptide (CP) mixture. We administered CP-Trp or CP+Trp mixtures...

  8. Electrofishing mark-recapture and depletion methodologies evoke behavioral and physiological changes in cutthroat trout

    Science.gov (United States)

    Mesa, M. G.; Schreck, C.B.

    1989-01-01

    We examined the behavioral and physiological responses of wild and hatchery-reared cutthroat trout Oncorhynchus clarki subjected to a single electroshock, electroshock plus marking, and multiple electroshocks in natural and artificial streams. In a natural stream, cutthroat trout released after capture by electrofishing and marking showed distinct behavioral changes: fish immediately sought cover, remained relatively inactive, did not feed, and were easily approached by a diver. An average of 3–4 h was required for 50% of the fish to return to a seemingly normal mode of behavior, although responses varied widely among collection sites. Using the depletion method, we observed little change in normal behavior of fish remaining in the stream section (i.e., uncaptured fish) after successive passes with electrofishing gear. In an artificial stream, hatchery-reared and wild cutthroat trout immediately decreased their rates of feeding and aggression after they were electroshocked and marked. Hatchery fish generally recovered in 2–3 h; wild fish required at least 24 h to recover. Analysis of feeding and aggression data by hierarchical rank revealed no distinct recovery trends among hatchery fish of different ranks; among wild cutthroat trout, however, socially dominant fish seemed to recover faster than intermediate and subordinate fish. Physiological indicators of stress (plasma cortisol and blood lactic acid) increased significantly in cutthroat trout subjected to electroshock plus marking or single or multiple electroshocks. As judged by the magnitude of the greatest change in cortisol and lactate, multiple electroshocks elicited the most severe stress response; however, plasma concentrations of both substances had returned to unstressed control levels by 6 h after treatment. It was evident that electrofishing and the procedures involved with estimating fish population size elicited a general stress response that was manifested not only physiologically but also

  9. Reliability based design optimization: Formulations and methodologies

    Science.gov (United States)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in the unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lower computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed.
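
    For reference, the standard nested (double-loop) RBDO formulation that the unilevel and decoupled methods reformulate can be written as below; the notation is generic and not necessarily that of the dissertation.

```latex
\begin{aligned}
\min_{\mathbf{d}} \quad & f(\mathbf{d}) \\
\text{s.t.} \quad & P\!\left[g_i(\mathbf{d}, \mathbf{X}) \le 0\right] \;\le\; P_{f_i}^{\mathrm{target}},
\qquad i = 1,\dots,m,
\end{aligned}
```

    where d are the design variables, X the random variables, and g_i the limit-state functions; each failure probability is typically evaluated by an inner reliability analysis (e.g., FORM), which is the inner loop that the unilevel formulation replaces with its first-order optimality conditions.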

  10. Risk-based methodology for USNRC inspections

    Energy Technology Data Exchange (ETDEWEB)

    Wong, S.M. [Brookhaven National Lab., Upton, NY (United States); Holahan, G.M.; Chung, J.W.; Johnson, M.R. [Nuclear Regulatory Commission, Washington, DC (United States). Office of Nuclear Reactor Regulation

    1995-12-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed.

  11. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Almeida, João Paolo A.; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  12. Ontology-Based Classification System Development Methodology

    Directory of Open Access Journals (Sweden)

    Grabusts Peter

    2015-12-01

    Full Text Available The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes have been observed. Thus, domain ontology can be extracted from the data sets and can be used for data classification with the help of a decision tree. The use of ontology methods in decision tree-based classification systems has been researched. Using such methodologies, the classification accuracy in some cases can be improved.

  13. DKBLM—Deep Knowledge Based Learning Methodology

    Institute of Scientific and Technical Information of China (English)

    马志方

    1993-01-01

    To solve the Imperfect Theory Problem (ITP) faced by Explanation-Based Generalization (EBG), this paper proposes a methodology named the Deep Knowledge Based Learning Methodology (DKBLM) and gives an implementation of DKBLM called the Hierarchically Distributed Learning System (HDLS). As an example of HDLS's application, this paper shows a learning system (MLS) in the meteorology domain and its run on a simplified example. DKBLM can acquire experiential knowledge with causality in it. It is applicable to domains in which experiments are relatively difficult to carry out, and in which there exist many available knowledge systems at different levels for the same domain (such as weather forecasting).

  14. Ontology-Based Classification System Development Methodology

    OpenAIRE

    2015-01-01

    The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes have been observed. Thus, domain ontology can be extracted from the data sets and can be used for data classification with the help of a decision tree. The use of ontology methods in decision ...

  15. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    Science.gov (United States)

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little is known about the extent to which ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that initial acts of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, the results showed that after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of the ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggest that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component.

  16. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    Thesis: Agent-Based Modeling Methodology for Analyzing Weapons Systems, by Casey D. Connors, Major, USA; presented to the Faculty, Department of Operational Sciences. (The record excerpt contains only front matter and a figure caption: Figure 14, Simulation Study Methodology for the Weapon System Analysis, covering metrics definition and data collection; the analysis plan builds on this methodology.)

  17. Case Based Reasoning: Case Representation Methodologies

    Directory of Open Access Journals (Sweden)

    Shaker H. El-Sappagh

    2015-11-01

    Full Text Available Case Based Reasoning (CBR) is an important technique in artificial intelligence, which has been applied to various kinds of problems in a wide range of domains. Selecting a case representation formalism is critical for the proper operation of the overall CBR system. In this paper, we survey and evaluate the existing case representation methodologies. Moreover, case retrieval and future challenges for effective CBR are explained. Case representation methods are grouped into knowledge-intensive approaches and traditional approaches, with the first group outweighing the second. The knowledge-intensive methods depend on ontology and enhance all CBR processes including case representation, retrieval, storage, and adaptation. Using a proposed set of qualitative metrics, the existing ontology-based methods for case representation are studied and evaluated in detail. All these systems have limitations; no approach exceeds 53% of the specified metrics. The results of the survey explain the current limitations of CBR systems and show that ontology usage in case representation needs improvement to achieve semantic representation and semantic retrieval in CBR systems.

  18. An Empirically Based Methodology for the Nineties.

    Science.gov (United States)

    Nunan, David

    A review of research bearing on second language teaching methodology looks at what it tells about language processing and production, classroom interaction and second language learning, and learning strategy preferences. The perspective taken is that methodology consists of classroom tasks and activities. Implications of the research for the…

  19. New methodology for Ozone Depletion Potentials of short-lived compounds: n-Propyl bromide as an example

    Science.gov (United States)

    Wuebbles, Donald J.; Patten, Kenneth O.; Johnson, Matthew T.; Kotamarthi, Rao

    2001-07-01

    A number of the compounds proposed as replacements for substances controlled under the Montreal Protocol have extremely short atmospheric lifetimes, on the order of days to a few months. An important example is n-propyl bromide (also referred to as 1-bromopropane, CH2BrCH2CH3 or simplified as 1-C3H7Br or nPB). This compound, useful as a solvent, has an atmospheric lifetime of less than 20 days due to its reaction with hydroxyl. Because nPB contains bromine, any amount reaching the stratosphere has the potential to affect concentrations of stratospheric ozone. The definition of Ozone Depletion Potentials (ODP) needs to be modified for such short-lived compounds to account for the location and timing of emissions. It is not adequate to treat these chemicals as if they were uniformly emitted at all latitudes and longitudes as normally done for longer-lived gases. Thus, for short-lived compounds, policymakers will need a table of ODP values instead of the single value generally provided in past studies. This study uses the MOZART2 three-dimensional chemical-transport model in combination with studies with our less computationally expensive two-dimensional model to examine potential effects of nPB on stratospheric ozone. Multiple facets of this study examine key questions regarding the amount of bromine reaching the stratosphere following emission of nPB. Our most significant findings from this study for the purposes of short-lived replacement compound ozone effects are summarized as follows. The degradation of nPB produces a significant quantity of bromoacetone which increases the amount of bromine transported to the stratosphere due to nPB. However, much of that effect is not due to bromoacetone itself, but instead to inorganic bromine which is produced from tropospheric oxidation of nPB, bromoacetone, and other degradation products and is transported above the dry and wet deposition processes of the model. The MOZART2 nPB results indicate a minimal correction of the

  20. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  1. An object-based methodology for knowledge representation in SGML

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, R.L. [Los Alamos National Lab., NM (United States); New Mexico State Univ., Las Cruces, NM (United States)]; Hartley, R.T. [New Mexico State Univ., Las Cruces, NM (United States)]; Webster, R.B. [Los Alamos National Lab., NM (United States)]

    1997-11-01

    An object-based methodology for knowledge representation and its Standard Generalized Markup Language (SGML) implementation is presented. The methodology includes class, perspective domain, and event constructs for representing knowledge within an object paradigm. The perspective construct allows for representation of knowledge from multiple and varying viewpoints. The event construct allows actual use of knowledge to be represented. The SGML implementation of the methodology facilitates usability, structured, yet flexible knowledge design, and sharing and reuse of knowledge class libraries.

  2. An object-based methodology for knowledge representation

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, R.L. [Los Alamos National Lab., NM (United States); New Mexico State Univ., Las Cruces, NM (United States)]; Hartley, R.T. [New Mexico State Univ., Las Cruces, NM (United States)]; Webster, R.B. [Los Alamos National Lab., NM (United States)]

    1997-11-01

    An object based methodology for knowledge representation is presented. The constructs and notation of the methodology are described and illustrated with examples. The "blocks world," a classic artificial intelligence problem, is used to illustrate some of the features of the methodology, including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.

  3. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

    at processes at the lowest level of aggregation which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well...

  4. A performance-based methodology for rating remediation systems

    Energy Technology Data Exchange (ETDEWEB)

    Rudin, M.J.; O'Brien, M.C.; Richardson, J.G.; Morrison, J.L.; Morneau, R.A. (Idaho National Engineering Lab., Idaho Falls, ID (United States))

    1993-10-01

    A methodology for evaluating and rating candidate remediation systems has been developed within the Buried Waste Integrated Demonstration (BWID) Systems Analysis Project at the Idaho National Engineering Laboratory (INEL). Called the performance-based technology selection filter (PBTSF), the methodology provides a formalized process to score systems based upon performance measures, and regulatory and technical requirements. The results are auditable and can be validated with field data.

  5. A Systematic Methodology for Design of Emulsion Based Chemical Products

    DEFF Research Database (Denmark)

    Mattei, Michele; Kontogeorgis, Georgios; Gani, Rafiqul

    2012-01-01

    A systematic methodology for emulsion based chemical product design is presented. The methodology employs a model-based product synthesis/design stage and a model-experiment based further refinement and/or validation stage. In this paper only the first stage is presented. The methodology employs a hierarchical approach, starting with the identification of the needs to be satisfied by the emulsified product and then building up the formulation by adding, one by one, the different classes of chemicals. A structured database together with dedicated property prediction models and evaluation criteria are employed to obtain a list of formulations that satisfy constraints representing the desired needs (target properties). Through a conceptual case study dealing with the design of a sunscreen lotion, the application of this new methodology is illustrated.

  6. The Impact of an Ego Depletion Manipulation on Performance-Based and Self-Report Assessment Measures.

    Science.gov (United States)

    Charek, Daniel B; Meyer, Gregory J; Mihura, Joni L

    2016-10-01

    We investigated the impact of ego depletion on selected Rorschach cognitive processing variables and self-reported affect states. Research indicates acts of effortful self-regulation transiently deplete a finite pool of cognitive resources, impairing performance on subsequent tasks requiring self-regulation. We predicted that relative to controls, ego-depleted participants' Rorschach protocols would have more spontaneous reactivity to color, less cognitive sophistication, and more frequent logical lapses in visualization, whereas self-reports would reflect greater fatigue and less attentiveness. The hypotheses were partially supported; despite a surprising absence of self-reported differences, ego-depleted participants had Rorschach protocols with lower scores on two variables indicative of sophisticated combinatory thinking, as well as higher levels of color receptivity; they also had lower scores on a composite variable computed across all hypothesized markers of complexity. In addition, self-reported achievement striving moderated the effect of the experimental manipulation on color receptivity, and in the Depletion condition it was associated with greater attentiveness to the tasks, more color reactivity, and less global synthetic processing. Results are discussed with an emphasis on the response process, methodological limitations and strengths, implications for calculating refined Rorschach scores, and the value of using multiple methods in research and experimental paradigms to validate assessment measures.

  7. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  8. Design Intelligent Model base Online Tuning Methodology for Nonlinear System

    Directory of Open Access Journals (Sweden)

    Ali Roshanzamir

    2014-04-01

    Full Text Available In systems with varying dynamic parameters that require on-line training, adaptive control methodology is used. In this paper a fuzzy model-based adaptive methodology is used to tune a linear Proportional Integral Derivative (PID) controller. The main objectives in any system are stability, robustness and reliability. The PID controller is used in many applications, but it faces many challenges in the control of a continuum robot. To solve these problems, a nonlinear adaptive methodology based on model-based fuzzy logic is used. This research aims to reduce or eliminate the PID controller problems, based on model-reference fuzzy logic theory, in order to control a flexible robot manipulator system and to test the quality of the process control in the MATLAB/SIMULINK simulation environment.

  9. Regression-Based Artificial Neural Network Methodology in Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    何桢; 肖粤翔

    2004-01-01

    Response surface methodology (RSM) is an important tool for process parameter optimization, robust design and other quality improvement efforts. When the relationship between influential input variables and the output response is very complex, it is hard to find the real response surface using RSM. In recent years artificial neural networks (ANN) have been used in RSM, but the classical ANN does not work well under the constraints of real applications. An algorithm of regression-based ANN (R-ANN) is proposed in this paper as a supplement to the classical ANN methodology. It brings the network closer to the response surface, so that training time is reduced and robustness is strengthened. The procedure for improving the ANN by regressions is described, and comparisons among R-ANN, RSM and the classical ANN are shown graphically in three examples. Our research shows that the R-ANN methodology is a good supplement to the RSM and classical ANN methodologies, which can yield lower standard error of prediction under conditions where the scope of the experiment is rigidly restricted.
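
    As context for the comparison, a conventional second-order response surface can be fitted by least squares as sketched below; this is the classical RSM baseline that R-ANN aims to improve upon, not the R-ANN algorithm itself, and the data are synthetic.

```python
# Baseline second-order (quadratic) response surface fit by least squares,
# the classical RSM model. Data here are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))                    # two coded factors
y = 5 + 2*X[:, 0] - X[:, 1] + 1.5*X[:, 0]*X[:, 1] + rng.normal(0, 0.1, 30)

# Design matrix: intercept, linear, interaction and pure quadratic terms.
Z = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]*X[:, 1], X[:, 0]**2, X[:, 1]**2])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(beta)   # fitted response-surface coefficients
```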

  10. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  11. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for elec- trolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  12. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  13. Design Methodology for Self-organized Mobile Networks Based

    Directory of Open Access Journals (Sweden)

    John Petearson Anzola

    2016-06-01

    Full Text Available The methodology proposed in this article enables the systematic design of routing algorithms based on biclustering schemes, which makes it possible to respond with timely techniques, clustering heuristics proposed by a researcher, and a focused approach to routing in the choice of clusterhead nodes. The process uses heuristics aimed at improving the different communication costs within surface groups called biclusters. The methodology encompasses a variety of clustering techniques and heuristics that have been addressed in routing algorithms, although not all possible alternatives and their different assessments have yet been explored. Therefore, design research on routing algorithms based on biclustering schemes, guided by this methodology, will allow new concepts of evolutionary routing, along with the ability to adapt to the topological changes that occur in self-organized data networks.

  14. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is able to overcome most of the difficulties associated with the solution of mixture design problems. The new methodology has been illustrated with the help of a case study involving the design of solvent-anti-solvent binary mixtures for the crystallization of Ibuprofen.

  15. Methodological Approaches for Estimating Gross Regional Product after Taking into Account Depletion of Natural Resources, Environmental Pollution and Human Capital Aspects

    Directory of Open Access Journals (Sweden)

    Boris Alengordovich Korobitsyn

    2015-09-01

    Full Text Available A key indicator of the System of National Accounts of Russia at the regional scale is Gross Regional Product (GRP), characterizing the value of goods and services produced in all sectors of the economy and intended for final consumption, capital formation and net exports (excluding imports). From a sustainability perspective, the main weakness of GRP is that it ignores depreciation of man-made assets, natural resource depletion, environmental pollution and degradation, and potential social costs such as poorer health due to exposure to occupational hazards. Several types of alternative approaches to measuring socio-economic progress are considered for six administrative units of the Ural Federal District for the period 2006–2014. The proposed alternatives to GRP as a measure of social progress focus on natural resource depletion, environmental externalities and some human development aspects. The most promising is the use of corrected macroeconomic indicators similar to the "genuine savings" compiled by the World Bank. Genuine savings are defined in this paper as net savings (gross savings minus consumption of fixed capital) minus the consumption of natural non-renewable resources and the monetary valuation of damages resulting from air pollution, water pollution and waste disposal. Two main groups of non-renewable resources are considered: energy resources (uranium ore, oil and natural gas) and mineral resources (iron ore, copper, and aluminum). In spite of various shortcomings, this indicator represents a considerable improvement over GRP information. For example, while GRP demonstrates steady growth between 2006 and 2014 for the main Russian oil- and gas-producing regions, the Hanty-Mansi and Yamalo-Nenets Autonomous Okrugs, genuine savings for these regions decreased over the whole period. This means that their resource-based economy cannot be considered to be on a sustainable path even in the framework of
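
    The definition quoted above can be restated compactly; the symbols below are introduced here only for illustration and are not the paper's notation.

```latex
GS \;=\; \underbrace{\left(S_{\mathrm{gross}} - D_{\mathrm{fixed}}\right)}_{\text{net savings}}
\;-\; R_{\mathrm{energy}} \;-\; R_{\mathrm{mineral}}
\;-\; \left(E_{\mathrm{air}} + E_{\mathrm{water}} + E_{\mathrm{waste}}\right),
```

    where the R terms are monetary valuations of depleted energy and mineral resources and the E terms are the monetary damages from air pollution, water pollution and waste disposal.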

  16. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    Science.gov (United States)

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  17. Methodological Innovation in Practice-Based Design Doctorates

    Directory of Open Access Journals (Sweden)

    Joyce S. R. Yee

    2010-01-01

    Full Text Available This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD students in design. Four characteristics were found in design PhD methodology: innovations in the format and structure of the thesis, a pick-and-mix approach to research design, situating practice in the inquiry, and the validation of visual analysis. The article concludes by offering suggestions on how research training can be improved. By being aware of recent methodological innovations in the field, design educators will be better informed when developing resources for future design doctoral candidates and assisting supervision teams in developing a more informed and flexible approach to practice-based research.

  18. Evaluation methodology for seismic base isolation of nuclear equipments

    Energy Technology Data Exchange (ETDEWEB)

    Ebisawa, K. (Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)); Uga, T. (Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan))

    1993-08-01

    An evaluation methodology for seismically base-isolated nuclear equipment is proposed. The evaluation can be classified into two steps. In the first step, the seismic functional failure probability during the lifetime of the equipment without base isolation devices is quantified in order to decide the applicability of the base isolated structure. The second step is comparative and calculates the ratio of the seismic failure frequency of the equipment without base isolation devices to that with them, in order to evaluate the effectiveness of the base isolated structure. The sample evaluation considers the case of a high-voltage emergency transformer with ceramic tubes. (orig.)

  19. Food composition and acid-base balance: alimentary alkali depletion and acid load in herbivores.

    Science.gov (United States)

    Kiwull-Schöne, Heidrun; Kiwull, Peter; Manz, Friedrich; Kalhoff, Hermann

    2008-02-01

    Alkali-enriched diets are recommended for humans to diminish the net acid load of their usual diet. In contrast, herbivores have to deal with a high dietary alkali impact on acid-base balance. Here we explore the role of nutritional alkali in experimentally induced chronic metabolic acidosis. Data were collected from healthy male adult rabbits kept in metabolism cages to obtain 24-h urine and arterial blood samples. Randomized groups consumed rabbit diets ad libitum, providing sufficient energy but variable alkali load. One subgroup (n = 10) received high-alkali food and approximately 15 mEq/kg ammonium chloride (NH4Cl) with its drinking water for 5 d. Another group (n = 14) was fed low-alkali food for 5 d and given approximately 4 mEq/kg NH4Cl daily for the last 2 d. The wide range of alimentary acid-base load was significantly reflected by renal base excretion, but normal acid-base conditions were maintained in the arterial blood. In rabbits fed a high-alkali diet, the excreted alkaline urine (pH(u) > 8.0) typically contained a large amount of precipitated carbonate, whereas in rabbits fed a low-alkali diet, both pH(u) and precipitate decreased considerably. During high-alkali feeding, application of NH4Cl likewise decreased pH(u), but arterial pH was still maintained with no indication of metabolic acidosis. During low-alkali feeding, a comparably small amount of added NH4Cl further lowered pH(u) and was accompanied by a significant systemic metabolic acidosis. We conclude that exhausted renal base-saving function by dietary alkali depletion is a prerequisite for growing susceptibility to NH4Cl-induced chronic metabolic acidosis in the herbivore rabbit.

  20. A Library-Based Synthesis Methodology for Reversible Logic

    CERN Document Server

    Saeedi, Mehdi; Zamani, Morteza Saheb; 10.1016/j.mejo.2010.02.002

    2010-01-01

    In this paper, a library-based synthesis methodology for reversible circuits is proposed where a reversible specification is considered as a permutation comprising a set of cycles. To this end, a pre-synthesis optimization step is introduced to construct a reversible specification from an irreversible function. In addition, a cycle-based representation model is presented to be used as an intermediate format in the proposed synthesis methodology. The selected intermediate format serves as a focal point for all potential representation models. In order to synthesize a given function, a library containing seven building blocks is used where each building block is a cycle of length less than 6. To synthesize large cycles, we also propose a decomposition algorithm which produces all possible minimal and inequivalent factorizations for a given cycle of length greater than 5. All decompositions contain the maximum number of disjoint cycles. The generated decompositions are used in conjunction with a novel cycle assi...
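
    A reversible specification viewed as a permutation can be split into disjoint cycles as in the sketch below; this only illustrates the cycle-based intermediate representation described above, not the paper's library mapping or factorization algorithm.

```python
# Decompose a permutation (a reversible truth table given as output indices)
# into disjoint cycles, the intermediate representation described above.
def disjoint_cycles(perm):
    """perm[i] is the output index assigned to input index i."""
    seen, cycles = set(), []
    for start in range(len(perm)):
        if start in seen:
            continue
        cycle, i = [], start
        while i not in seen:
            seen.add(i)
            cycle.append(i)
            i = perm[i]
        if len(cycle) > 1:          # fixed points are omitted
            cycles.append(tuple(cycle))
    return cycles

# Example: a 3-bit reversible function given as a permutation of 0..7.
print(disjoint_cycles([0, 1, 3, 2, 7, 4, 5, 6]))   # [(2, 3), (4, 7, 6, 5)]
```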

  1. An InP-Based Dual-Depletion-Region Electroabsorption Modulator with Low Capacitance and Predicted High Bandwidth

    Institute of Scientific and Technical Information of China (English)

    SHAO Yong-Bo; ZHAO Ling-Juan; YU Hong-Yan; QIU Ji-Fang; QIU Ying-Ping; PAN Jiao-Qing; WANG Bao-Jun; ZHU Hong-Liang; WANG Wei

    2011-01-01

    A novel dual-depletion-region electroabsorption modulator (DDR-EAM) based on InP at 1550 nm is fabricated. The measured capacitance and extinction ratio of the DDR-EAM reveal that the dual depletion region structure can reduce the device capacitance significantly without any degradation of the extinction ratio. Moreover, the bandwidth of the DDR-EAM predicted by using an equivalent circuit model is larger than twice the bandwidth of the conventional lumped-electrode EAM (L-EAM). The electroabsorption modulator (EAM) is highly desirable as an external electro-optical modulator due to its high speed, low cost and capability of integration with other optical components such as DFB lasers, DBR lasers or semiconductor optical amplifiers. [1-4] So far, EAMs have typically been fabricated using lumped electrodes [1-4] and travelling-wave electrodes. [5-15]

  2. Incremental Placement-Based Clock Network Minimization Methodology

    Institute of Scientific and Technical Information of China (English)

    ZHOU Qiang; CAI Yici; HUANG Liang; HONG Xianlong

    2008-01-01

    Power is the major challenge threatening the progress of very large scale integration (VLSI) technology development. In ultra-deep submicron VLSI designs, clock network size must be minimized to reduce power consumption, power supply noise, and the number of clock buffers, which are vulnerable to process variations. Traditional design methodologies usually let the clock router independently undertake the clock network minimization. Since clock routing is based on register locations, register placement actually strongly influences the clock network size. This paper describes a clock network design methodology that optimizes register placement. For a given cell placement result, incremental modifications are performed based on the clock skew specifications by moving registers toward preferred locations that may reduce the clock network size. At the same time, the side-effects on logic cell placement, such as signal net wirelength and critical path delay, are controlled. Test results on benchmark circuits show that the methodology can considerably reduce clock network size with limited impact on signal net wirelength and critical path delay.

  3. An Ontology Based Methodology for Satellite Data Semantic Interoperability

    Directory of Open Access Journals (Sweden)

    ABBURU, S.

    2015-08-01

    Full Text Available Satellite and ocean based observing systems consist of various sensors and configurations. These observing systems transmit data in heterogeneous file formats and heterogeneous vocabulary from various data centers. The data centers maintain a centralized data management system that disseminates the observations to various research communities. Currently, different data naming conventions are used by the existing observing systems, leading to semantic heterogeneity. In this work, sensor data interoperability and the semantics of the data are addressed through ontologies. The present work provides an effective technical solution to semantic heterogeneity through semantic technologies, which provide interoperability, the capability to build a knowledge base, and a framework for semantic information retrieval by developing an effective concept vocabulary through domain ontologies. The paper aims at a new methodology to interlink multidisciplinary and heterogeneous sensor data products. A four-phase methodology has been implemented to address satellite data semantic interoperability. The paper concludes with an evaluation of the methodology by linking and interfacing multiple ontologies to arrive at an ontology vocabulary for sensor observations. Data from the Indian meteorological satellite INSAT-3D have been used as a typical example to illustrate the concepts. Along similar lines, this work can also be extended to other sensor observations.

  4. ONTOLOGY BASED DATA MINING METHODOLOGY FOR DISCRIMINATION PREVENTION

    Directory of Open Access Journals (Sweden)

    Nandana Nagabhushana

    2014-09-01

    Full Text Available Data mining is increasingly used to automate decision making processes, which involve the extraction and discovery of information hidden in large volumes of collected data. Nonetheless, negative perceptions such as privacy invasion and potential discrimination hinder the use of data mining methodologies in software systems employing automated decision making. Loan granting, employment, insurance premium calculation, admissions in educational institutions, etc., can make use of data mining to effectively prevent human biases pertaining to certain attributes such as gender, nationality or race in critical decision making. The proposed methodology prevents discriminatory rules that ensue from the presence of certain information regarding sensitive discriminatory attributes in the data itself. The two novel aspects of the proposal are, first, the rule mining technique based on ontologies and, second, the generalization and transformation of the mined rules that are classified as discriminatory into non-discriminatory ones.

  5. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at identifying the sources in processes of knowledge transfer at the inter-organizational level. The importance of this methodology is that it states a unified model to reveal knowledge sharing patterns and to compare results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSDs) provide documents and tools at their websites. Rather, the proposal provides a guide for modelling the inferences gathered from data processing, revealing links between the sources and recipients of the knowledge being transferred that the recipient identifies as the main source of new knowledge creation. Some national statistics departments set as the objectives of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments; from this characterization, scholars conduct different studies. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiring into the structure that emerges from taking ideas from other organizations to incept innovations. These two sets of data are the actors of a two-mode network; a link between two actors (network nodes) runs from the one acting as the source of the idea to the one acting as the destination, the sources being organizations, or events organized by organizations, that "provide" ideas to the other group of firms. The resulting, demonstrated design satisfies the objective of being a methodological model to identify sources in the transfer of knowledge effectively used in innovation.

  6. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems, increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
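
    A minimal sketch of how a delay-time distribution for one adversary path might be produced by sampling uncertain task times is shown below; the lognormal parameters are placeholders, and this simple Monte Carlo sampler stands in for, rather than reproduces, the Bayesian treatment developed in the report.

```python
# Monte Carlo sketch: total path delay as the sum of uncertain task times.
# Lognormal parameters per task are placeholder values, not calibrated data.
import numpy as np

rng = np.random.default_rng(42)
# (median seconds, dispersion) for each sequential delay task, assumed values
tasks = [(120, 0.4), (300, 0.6), (90, 0.3), (240, 0.5)]

samples = sum(rng.lognormal(mean=np.log(med), sigma=disp, size=100_000)
              for med, disp in tasks)

print("median total delay [s]:", np.percentile(samples, 50).round(1))
print("10th percentile [s]:   ", np.percentile(samples, 10).round(1))
```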

  7. MS-based analytical methodologies to characterize genetically modified crops.

    Science.gov (United States)

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart.

  8. A PROPOSED METHODOLOGY FOR STRAIN BASED FAILURE CRITERIA

    Energy Technology Data Exchange (ETDEWEB)

    Wu, T

    2008-05-01

    This paper proposes an alternative methodology to determine the failure criteria for use in dynamic simulations of radioactive material shipping packages in the events of hypothetical accident conditions. The current stress failure criteria defined in the Nuclear Regulatory Guide 7.6 [1] and the ASME Code, Section III, Appendix F [2] for Level D Service Loads are based on the ultimate strength of uniaxial tensile test specimen rather than on the material fracture in the state of multi-axial stresses. On the other hand, the proposed strain-based failure criteria are directly related to the material failure mechanisms in multi-axial stresses. In addition, unlike the stress-based criteria, the strain-based failure criteria are applicable to the evaluation of cumulative damages caused by the sequential loads in the hypothetical accident events as required by the Nuclear Regulatory Guide 7.8 [4].

  9. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.
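
    The SAX representation itself is simple enough to sketch; the segment count and Gaussian-quantile breakpoints below follow the standard SAX construction, while the GA rule-search layer of the book is not shown.

```python
# Minimal SAX sketch: z-normalize a price series, reduce it with Piecewise
# Aggregate Approximation (PAA), then map each segment to a symbol using
# Gaussian breakpoints (here the standard quartiles for a 4-letter alphabet).
import numpy as np

def sax(series, n_segments=8, breakpoints=(-0.6745, 0.0, 0.6745)):
    """Return a SAX word with alphabet size len(breakpoints) + 1 (here 'abcd')."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                                  # z-normalization
    paa = [seg.mean() for seg in np.array_split(x, n_segments)]   # PAA reduction
    alphabet = "abcd"
    return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in paa)

prices = np.cumsum(np.random.default_rng(1).normal(size=200)) + 100
print(sax(prices))    # SAX word for this synthetic price walk
```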

  10. Stratospheric ozone depletion over Antarctica - Role of aerosols based on SAGE II satellite observations

    Science.gov (United States)

    Lin, N.-H.; Saxena, V. K.

    1992-01-01

    The physical characteristics of the Antarctic stratospheric aerosol are investigated via a comprehensive analysis of the SAGE II data during the most severe ozone depletion episode of October 1987. The aerosol size distribution is found to be bimodal in several instances using the randomized minimization search technique, which suggests that the distribution of a single mode may be used to fit the data in the retrieved size range only at the expense of resolution for the larger particles. On average, in the region below 18 km, a wavelike perturbation with the upstream tilting for the parameters of mass loading, total number, and surface area concentration is found to be located just above the region of the most severe ozone depletion.

  11. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    Science.gov (United States)

    Stripling, Hayes Franklin

    Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable, they are the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.
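
    The adjoint idea can be illustrated on a toy depletion problem far simpler than the transport-coupled systems treated in the thesis: for a linear system dN/dt = A(p)N with response J = c·N(T), one adjoint solve yields the sensitivity dJ/dp. The sketch below (illustrative rates, not reactor data) checks the adjoint estimate against a central finite difference.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-nuclide chain dN/dt = A(p) N, with p the transmutation rate of
# nuclide 1 into nuclide 2 (illustrative values, not reactor data).
p0, lam2, T = 0.30, 0.10, 5.0
A = lambda p: np.array([[-p, 0.0], [p, -lam2]])
dA = np.array([[-1.0, 0.0], [1.0, 0.0]])              # dA/dp
N0 = np.array([1.0, 0.0])
c = np.array([0.0, 1.0])                              # response J = N2(T)

ts = np.linspace(0.0, T, 2001)
fwd = solve_ivp(lambda t, N: A(p0) @ N, (0.0, T), N0, t_eval=ts, rtol=1e-10, atol=1e-12)
# The adjoint runs backward in time: da/dt = -A^T a, with a(T) = c.
adj = solve_ivp(lambda t, a: -A(p0).T @ a, (T, 0.0), c, t_eval=ts[::-1], rtol=1e-10, atol=1e-12)

N = fwd.y                                             # shape (2, nt), increasing time
a = adj.y[:, ::-1]                                    # reorder to increasing time
integrand = np.einsum("it,ij,jt->t", a, dA, N)        # a^T (dA/dp) N at each time
dJdp_adjoint = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(ts))

eps = 1e-6                                            # central finite-difference check
Jp = solve_ivp(lambda t, N: A(p0 + eps) @ N, (0.0, T), N0, rtol=1e-10, atol=1e-12).y[1, -1]
Jm = solve_ivp(lambda t, N: A(p0 - eps) @ N, (0.0, T), N0, rtol=1e-10, atol=1e-12).y[1, -1]
print(dJdp_adjoint, (Jp - Jm) / (2 * eps))            # the two estimates should agree closely
```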

  12. Reversible Logic Based Concurrent Error Detection Methodology For Emerging Nanocircuits

    CERN Document Server

    Thapliyal, Himanshu

    2011-01-01

    Reversible logic has promising applications in emerging nanotechnologies, such as quantum computing, quantum dot cellular automata and optical computing, etc. Faults in reversible logic circuits that result in a multi-bit error at the outputs are very tough to detect, and thus, in the literature, researchers have only addressed the problem of online testing of faults that result in a single-bit error at the outputs, based on parity-preserving logic. In this work, we propose a methodology for the concurrent error detection in reversible logic circuits to detect faults that can result in a multi-bit error at the outputs. The methodology is based on the inverse property of reversible logic and is termed the 'inverse and compare' method. By using the inverse property of reversible logic, all the inputs can be regenerated at the outputs. Thus, by comparing the original inputs with the regenerated inputs, the faults in reversible circuits can be detected. Minimizing the garbage outputs is one of the main goals in reversible logic ...
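
    A minimal sketch of the 'inverse and compare' idea, on an assumed toy circuit rather than any benchmark from the book: apply the circuit, apply its inverse to the outputs, and flag a fault when the regenerated inputs disagree with the originals.

```python
from typing import List, Tuple

# Toy reversible circuit built from self-inverse gates (CNOT and Toffoli).
Gate = Tuple[str, Tuple[int, ...]]   # ("cnot", (ctrl, tgt)) or ("toffoli", (c1, c2, tgt))

def apply(circuit: List[Gate], bits: List[int]) -> List[int]:
    state = bits[:]
    for name, wires in circuit:
        if name == "cnot":
            c, t = wires
            state[t] ^= state[c]
        elif name == "toffoli":
            c1, c2, t = wires
            state[t] ^= state[c1] & state[c2]
    return state

def inverse(circuit: List[Gate]) -> List[Gate]:
    # CNOT and Toffoli are their own inverses, so reversing the gate order suffices.
    return list(reversed(circuit))

circuit = [("toffoli", (0, 1, 2)), ("cnot", (0, 1))]
inputs = [1, 1, 0]
outputs = apply(circuit, inputs)

# Concurrent error detection: regenerate the inputs from the outputs and compare.
regenerated = apply(inverse(circuit), outputs)
print("fault detected" if regenerated != inputs else "outputs consistent with inputs")

# A multi-bit fault (here, two flipped output bits) is caught by the comparison.
faulty = outputs[:]
faulty[0] ^= 1
faulty[2] ^= 1
print("fault detected" if apply(inverse(circuit), faulty) != inputs else "outputs consistent with inputs")
```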

  13. A strategy for selective detection based on interferent depleting and redox cycling using the plane-recessed microdisk array electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Feng [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Yan Jiawei, E-mail: jwyan@xmu.edu.cn [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China); Lu Miao [Pen-Tung Sah Micro-Nano Technology Research Center, Xiamen University, Xiamen, Fujian 361005 (China); Zhou Yongliang; Yang Yang; Mao Bingwei [State Key Laboratory for Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen, Fujian 361005 (China)

    2011-10-01

    Highlights: > A novel strategy based on a combination of interferent depleting and redox cycling is proposed for the plane-recessed microdisk array electrodes. > The strategy breaks up the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction. > The electrodes enhance the current signal by redox cycling. > The electrodes can work regardless of the reversibility of interfering species. - Abstract: The fabrication, characterization and application of the plane-recessed microdisk array electrodes for selective detection are demonstrated. The electrodes, fabricated by lithographic microfabrication technology, are composed of a planar film electrode and a 32 x 32 recessed microdisk array electrode. Different from the commonly used redox cycling operating mode for array configurations such as interdigitated array electrodes, a novel strategy based on a combination of interferent depleting and redox cycling is proposed for the electrodes with an appropriate configuration. The planar film electrode (the plane electrode) is used to deplete the interferent in the diffusion layer. The recessed microdisk array electrode (the microdisk array), located within the diffusion layer of the plane electrode, detects the target analyte in the interferent-depleted diffusion layer. In addition, the microdisk array overcomes the disadvantage of low current signal for a single microelectrode. Moreover, the current signal of the target analyte that undergoes reversible electron transfer can be enhanced due to the redox cycling between the plane electrode and the microdisk array. Based on the above working principle, the plane-recessed microdisk array electrodes break up the restriction of selectively detecting a species that exhibits a reversible reaction in a mixture with one that exhibits an irreversible reaction, which is a limitation of the single redox cycling operating mode. The advantages of the

  14. Methodology of citrate-based biomaterial development and application

    Science.gov (United States)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  15. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. The statistical multipath parameters such as path arrival time, magnitude and interval for each category are analyzed to build the model. Each generated channel based on the proposed statistical model represents a different realisation of a PLC network. Simulation results in the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, reproduces the PLC channel features of the underlying transmission line theory model. Furthermore, two potential application scenarios are d...
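
    A minimal sketch of the multipath-profile idea, with assumed (not fitted) distributions for path arrival times and magnitudes: a channel realisation is a set of paths, and its frequency response follows from summing the delayed, attenuated contributions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative statistical multipath channel: the distributions below are
# assumptions, not the fitted parameters of the paper's channel categories.
n_paths = 8
arrival = np.cumsum(rng.exponential(0.2e-6, n_paths))             # path arrival times [s]
gains = rng.uniform(0.2, 1.0, n_paths) * np.exp(-arrival / 1e-6)  # decaying path magnitudes
phases = rng.uniform(0, 2 * np.pi, n_paths)

freqs = np.linspace(1e6, 30e6, 1000)                               # 1-30 MHz PLC band
H = np.array([np.sum(gains * np.exp(1j * (phases - 2 * np.pi * f * arrival)))
              for f in freqs])                                     # channel frequency response

print("mean channel attenuation: %.1f dB" % (20 * np.log10(np.abs(H)).mean()))
```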

  16. Performance tradeoff between lateral and interdigitated doping patterns for high speed carrier-depletion based silicon modulators.

    Science.gov (United States)

    Yu, Hui; Pantouvaki, Marianna; Van Campenhout, Joris; Korn, Dietmar; Komorowska, Katarzyna; Dumon, Pieter; Li, Yanlu; Verheyen, Peter; Absil, Philippe; Alloatti, Luca; Hillerkuss, David; Leuthold, Juerg; Baets, Roel; Bogaerts, Wim

    2012-06-04

    Carrier-depletion based silicon modulators with lateral and interdigitated PN junctions are compared systematically on the same fabrication platform. The interdigitated diode is shown to outperform the lateral diode in achieving a low VπLπ of 0.62 V∙cm with comparable propagation loss at the expense of a higher depletion capacitance. The low VπLπ of the interdigitated PN junction is employed to demonstrate 10 Gbit/s modulation with 7.5 dB extinction ratio from a 500 µm long device whose static insertion loss is 2.8 dB. In addition, up to 40 Gbit/s modulation is demonstrated for a 3 mm long device comprising a lateral diode and a co-designed traveling wave electrode.

  17. Electroencephalogram-based methodology for determining unconsciousness during depopulation.

    Science.gov (United States)

    Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L

    2012-12-01

    When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining
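
    As a hedged sketch of the frequency-domain step (synthetic signals, not the study's bird EEG data), the alpha/delta ratio can be estimated from Welch band powers; the ratio drops when delta activity dominates, which is the marker of unconsciousness described above.

```python
import numpy as np
from scipy.signal import welch

def alpha_delta_ratio(eeg, fs):
    """Alpha (8-12 Hz) to delta (0.5-4 Hz) band-power ratio of one EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=4 * fs)        # 4-second Welch segments
    df = f[1] - f[0]
    alpha = psd[(f >= 8) & (f <= 12)].sum() * df
    delta = psd[(f >= 0.5) & (f <= 4)].sum() * df
    return alpha / delta

# Synthetic traces (not bird EEG): an "awake" trace dominated by 10 Hz activity
# and an "unconscious" trace dominated by 2 Hz activity; the ratio drops sharply.
fs = 250
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
awake = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 2 * t) + 0.2 * rng.normal(size=t.size)
unconscious = 0.3 * np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 2 * t) + 0.2 * rng.normal(size=t.size)
print("awake A/D ratio:       %.2f" % alpha_delta_ratio(awake, fs))
print("unconscious A/D ratio: %.2f" % alpha_delta_ratio(unconscious, fs))
```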

  18. Engineering Design Optimization Based on Intelligent Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    SONG Guo-hui; WU Yu; LI Cong-xin

    2008-01-01

    An intelligent response surface methodology (IRSM) was proposed to achieve the most competitive metal forming products, in which artificial intelligence technologies are introduced into the optimization process. It is used as a simple and inexpensive replacement for a computationally expensive simulation model. In IRSM, the optimal design space can be reduced greatly without any prior information about the function distribution. Also, by identifying the approximation error region, new design points can be supplemented correspondingly to improve the response surface model effectively. The procedure is iterated until the accuracy reaches the desired threshold value. Thus, the global optimization can be performed based on this substitute model. Finally, we present an optimization design example on the roll forming of a "U" channel product.

  19. New Approach For Prediction Groundwater Depletion

    Science.gov (United States)

    Moustafa, Mahmoud

    2017-01-01

    Current approaches to quantify groundwater depletion involve water balance and satellite gravity. However, the water balance technique includes uncertain estimation of parameters such as evapotranspiration and runoff. The satellite method consumes time and effort. The work reported in this paper proposes using failure theory in a novel way to predict groundwater saturated thickness depletion. An important issue in the failure theory proposed is to determine the failure point (depletion case). The proposed technique uses depth of water, as the net result of recharge/discharge processes in the aquifer, to calculate the remaining saturated thickness resulting from the applied pumping rates in an area, and thereby to evaluate the groundwater depletion. Two tools, the Weibull function and Bayes analysis, were used to model and analyze data collected from 1962 to 2009. The proposed methodology was tested in a nonrenewable aquifer, with no recharge. Consequently, the continuous decline in water depth has been the main criterion used to estimate the depletion. The value of the proposed approach is to predict the probable effect of the currently applied pumping rates on the saturated thickness, based on the remaining saturated thickness data. The limitation of the suggested approach is that it assumes the applied management practices are constant during the prediction period. The study predicted that after 300 years there would be an 80% probability that the saturated aquifer would be depleted. Lifetime or failure theory can give a simple alternative way to predict remaining saturated thickness depletion without time-consuming processes or sophisticated software.
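
    A hedged illustration of the lifetime-theory idea: with a Weibull model for the time to depletion, the CDF gives the probability of depletion by a given horizon. The shape and scale below are assumptions chosen only so that the 300-year probability lands near the 80% figure quoted above; they are not the study's fitted parameters.

```python
from scipy.stats import weibull_min

# Assumed Weibull shape and scale (in years) for the aquifer "failure" time,
# chosen to reproduce roughly the ~80% depletion probability at 300 years.
shape, scale = 2.0, 236.0
for years in (100, 200, 300):
    print("%d years -> P(depleted) = %.2f" % (years, weibull_min.cdf(years, shape, scale=scale)))
```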

  20. A methodology for physically based rockfall hazard assessment

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Full Text Available Rockfall hazard assessment is not simple to achieve in practice and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at a regional and local scale, both along linear features and within exposed areas. An objective approach based on three-dimensional matrixes providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrixes has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.

  1. Nanoscale field effect optical modulators based on depletion of epsilon-near-zero films

    Science.gov (United States)

    Lu, Zhaolin; Shi, Kaifeng; Yin, Peichuan

    2016-12-01

    The field effect in metal-oxide-semiconductor (MOS) capacitors plays a key role in field-effect transistors (FETs), which are the fundamental building blocks of modern digital integrated circuits. Recent works show that the field effect can also be used to make optical/plasmonic modulators. In this paper, we report the numerical investigation of field effect electro-absorption modulators each made of an ultrathin epsilon-near-zero (ENZ) film, as the active material, sandwiched in a silicon or plasmonic waveguide. Without a bias, the ENZ films maximize the attenuation of the waveguides and the modulators work at the OFF state; on the other hand, depletion of the carriers in the ENZ films greatly reduces the attenuation and the modulators work at the ON state. The double capacitor gating scheme with two 10-nm HfO2 films as the insulator is used to enhance the modulation by the field effect. The depletion requires about 10 V across the HfO2 layers. According to our simulation, extinction ratio up to 3.44 dB can be achieved in a 500-nm long Si waveguide with insertion loss only 0.71 dB (85.0% pass); extinction ratio up to 7.86 dB can be achieved in a 200-nm long plasmonic waveguide with insertion loss 1.11 dB (77.5% pass). The proposed modulators may find important applications in future on-chip or chip-to-chip optical interconnection.
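
    A quick arithmetic check of the quoted figures (standard dB relations, not the paper's electromagnetic model): the insertion losses of 0.71 dB and 1.11 dB correspond to roughly 85% and 77.5% transmission, and the 3.44 dB extinction ratio over 500 nm implies an ON/OFF attenuation contrast of about 6.9 dB/µm.

```python
def transmitted_fraction(loss_db):
    """Fraction of optical power transmitted for a given loss in dB."""
    return 10 ** (-loss_db / 10)

# Quoted insertion losses: 0.71 dB (Si waveguide) and 1.11 dB (plasmonic waveguide).
print("Si waveguide:        %.1f%% transmitted" % (100 * transmitted_fraction(0.71)))   # ~85%
print("plasmonic waveguide: %.1f%% transmitted" % (100 * transmitted_fraction(1.11)))   # ~77.5%

# Extinction ratio scales with device length for a fixed ON/OFF attenuation
# contrast: 3.44 dB over a 0.5 um long device implies roughly 6.9 dB/um.
print("attenuation contrast: %.2f dB/um" % (3.44 / 0.5))
```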

  2. Simulation environment based on the Universal Verification Methodology

    Science.gov (United States)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  3. Nanoscale Field Effect Optical Modulators Based on Depletion of Epsilon-Near-Zero Films

    CERN Document Server

    Lu, Zhaolin; Shi, Kaifeng

    2015-01-01

    The field effect in metal-oxide-semiconductor (MOS) capacitors plays a key role in field-effect transistors (FETs), which are the fundamental building blocks of modern digital integrated circuits. Recent works show that the field effect can also be used to make optical/plasmonic modulators. In this paper, we report field effect electro-absorption modulators (FEOMs) each made of an ultrathin epsilon-near-zero (ENZ) film, as the active material, sandwiched in a silicon or plasmonic waveguide. Without a bias, the ENZ film maximizes the attenuation of the waveguides and the modulators work at the OFF state; contrariwise, depletion of the carriers in the ENZ film greatly reduces the attenuation and the modulators work at the ON state. The double capacitor gating scheme is used to enhance the modulation by the field effect. According to our simulation, extinction ratio up to 3.44 dB can be achieved in a 500-nm long Si waveguide with insertion loss only 0.71 dB (85.0%); extinction ratio up to 7.86 dB can be achieved...

  4. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results about shipments of ammonia are discussed and finally further research developments are proposed.
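
    A minimal sketch of the 'minimum cost flow' formulation on an assumed four-node network (illustrative costs and capacities, not the OPTIPATH data), using the networkx solver:

```python
import networkx as nx

# Arc 'weight' stands in for out-of-pocket plus risk-related cost per shipment,
# and 'capacity' for the risk-derived limit on shipments per arc.
G = nx.DiGraph()
G.add_node("origin", demand=-10)        # 10 shipments leave the origin
G.add_node("destination", demand=10)    # and must reach the destination
G.add_node("a", demand=0)
G.add_node("b", demand=0)

G.add_edge("origin", "a", capacity=6, weight=4)
G.add_edge("origin", "b", capacity=8, weight=6)
G.add_edge("a", "destination", capacity=6, weight=3)
G.add_edge("b", "destination", capacity=8, weight=2)
G.add_edge("a", "b", capacity=4, weight=1)

flow = nx.min_cost_flow(G)              # dict of dicts: flow assigned to each arc
print(flow)
print("total cost:", nx.cost_of_flow(G, flow))
```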

  5. Materialized View Selection Approach Using Tree Based Methodology

    Directory of Open Access Journals (Sweden)

    MR. P. P. KARDE

    2010-10-01

    Full Text Available In large databases, particularly distributed databases, query response time plays an important role: timely access to information is a basic requirement of successful business applications. A data warehouse uses multiple materialized views to efficiently process a given set of queries. Quick response time and accuracy are important factors in the success of any database. The materialization of all views is not possible because of space and maintenance cost constraints. Selection of materialized views is one of the most important decisions in designing a data warehouse for optimal efficiency, and selecting a suitable set of views that minimizes the total cost associated with the materialized views is the key component in data warehousing. Materialized views are found to be very useful for fast query processing. This paper gives the results of the proposed tree-based materialized view selection algorithm for query processing. In a distributed environment, the database is distributed over the nodes on which queries are executed, so node selection also plays an important role. This paper therefore also proposes a node selection algorithm for fast materialized view selection in a distributed environment. Finally, it is found that the proposed methodology performs better for query processing as compared to other materialized view selection strategies.

  6. Lean methodology: an evidence-based practice approach for healthcare improvement.

    Science.gov (United States)

    Johnson, Pauline M; Patterson, Claire J; OʼConnell, Mary P

    2013-12-10

    Lean methodology, an evidence-based practice approach adopted from Toyota, is grounded on the pillars of respect for people and continuous improvement. This article describes the use of Lean methodology to improve healthcare outcomes for patients with community-acquired pneumonia. Nurse practitioners and other clinicians should be knowledgeable about this methodology and become leaders in Lean transformation.

  7. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    OpenAIRE

    Alexandre Tadeu Simon; Luiz Carlos Di Serio; Silvio Roberto Ignacio Pires; Guilherme Silveira Martins

    2015-01-01

    Despite the increasing interest in supply chain management (SCM) by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert an...

  8. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren;

    The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400 Å) of polystyrene as cushions between a crystalline... with an AFM (2). The intuitive explanation for the depletion is based on a "hydrophobic mismatch" between the obviously hydrophilic bulk phase of water and the hydrophobic polymer. It would thus be an intrinsic property of all interfaces between non-matching materials. The detailed physical interaction path...

  9. Analytical base-collector depletion capacitance in vertical SiGe heterojunction bipolar transistors fabricated on CMOS-compatible silicon on insulator

    Institute of Scientific and Technical Information of China (English)

    Xu Xiao-Bo; Zhang He-Ming; Hu Hui-Yong; Ma Jian-Li; Xu Li-Jun

    2011-01-01

    The base-collector depletion capacitance for vertical SiGe npn heterojunction bipolar transistors (HBTs) on silicon on insulator (SOI) is split into vertical and lateral parts. This paper proposes a novel analytical depletion capacitance model of this structure for the first time. A large discrepancy is predicted when the present model is compared with the conventional depletion model, and it is shown that the capacitance decreases with increasing reverse collector-base bias and shows a kink as the reverse collector-base bias reaches the effective vertical punch-through voltage, a voltage which varies with the collector doping concentration; this is consistent with measurement results. The model can be employed for a fast evaluation of the depletion capacitance of an SOI SiGe HBT and has useful applications in the design and simulation of high performance SiGe circuits and devices.
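
    The qualitative bias dependence described above can be illustrated with the textbook abrupt-junction depletion capacitance per unit area. This is not the paper's split vertical/lateral SOI model, but it shows the same decrease with reverse bias; the doping levels below are assumptions.

```python
import math

Q = 1.602e-19                 # elementary charge [C]
EPS_SI = 11.7 * 8.854e-12     # silicon permittivity [F/m]

def cj_abrupt(Na, Nd, Vbi, Vr):
    """Textbook abrupt-junction depletion capacitance per unit area [F/m^2]."""
    return math.sqrt(Q * EPS_SI * Na * Nd / (2.0 * (Na + Nd) * (Vbi + Vr)))

# Illustrative doping levels; the capacitance falls as the reverse bias grows.
Na, Nd, Vbi = 1e24, 1e23, 0.8      # [m^-3], [m^-3], [V]
for Vr in (0.0, 1.0, 2.0, 4.0):
    print("Vr = %.1f V -> Cj = %.2e F/m^2" % (Vr, cj_abrupt(Na, Nd, Vbi, Vr)))
```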

  10. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

    Full Text Available The information system (IS) change management and governance, according to the best practices, are defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT, etc.). These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control by integrating the following significant resource management areas – information technology (IT) governance, change management and enterprise architecture (EA) change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on re-use and fusion of principles used by related methodologies as well as on empirical observations about typical IS change management mistakes in enterprises.

  11. Methodological bases of innovative training of specialists in nanotechnology field

    Directory of Open Access Journals (Sweden)

    FIGOVSKY Oleg Lvovich

    2016-10-01

    Full Text Available The performance of an innovative training system aimed at highly intellectual specialists in the area of nanotechnologies for Kazakhstan’s economy demands the establishment and development of a nanotechnology market in the country, the teaching of innovative engineering combined with consistent research, and the integration of trained specialists with the latest technologies and sciences at the international level. Methodological aspects of training competitive specialists for the nanotechnology field are specific. The paper presents methodological principles of innovative training of specialists for science-intensive industry that were realized under a grant from the Ministry of Education and Science of the Republic of Kazakhstan.

  12. Methodological Innovation in Practice-Based Design Doctorates

    Science.gov (United States)

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  13. Methodological bases of innovative training of specialists in nanotechnology field

    OpenAIRE

    FIGOVSKY Oleg Lvovich; SHAMELKHANOVA Nelya A.; AIDAROVA Saule B.

    2016-01-01

    The performance of an innovative training system aimed at highly intellectual specialists in the area of nanotechnologies for Kazakhstan’s economy demands the establishment and development of a nanotechnology market in the country, the teaching of innovative engineering combined with consistent research, and the integration of trained specialists with the latest technologies and sciences at the international level. Methodological aspects of training competitive specialists for the nanotechnology field are spe...

  14. A Methodology for Simulation-based Job Performance Assessment

    Science.gov (United States)

    2008-01-01

    Job performance measurement is of critical importance to any organization's health. It is important not only to recognize and reward good performance...methodology for developing simulations for job performance assessment. We then describe a performance assessment simulation for Light-Wheeled Vehicle

  15. Eutrophication of mangroves linked to depletion of foliar and soil base cations.

    Science.gov (United States)

    Fauzi, Anas; Skidmore, Andrew K; Heitkönig, Ignas M A; van Gils, Hein; Schlerf, Martin

    2014-12-01

    There is growing concern that increasing eutrophication causes degradation of coastal ecosystems. Studies in terrestrial ecosystems have shown that increasing the concentration of nitrogen in soils contributes to the acidification process, which leads to leaching of base cations. To test the effects of eutrophication on the availability of base cations in mangroves, we compared paired leaf and soil nutrient levels sampled in Nypa fruticans and Rhizophora spp. on a severely disturbed, i.e. nutrient loaded, site (Mahakam delta) with samples from an undisturbed, near-pristine site (Berau delta) in East Kalimantan, Indonesia. The findings indicate that under pristine conditions, the availability of base cations in mangrove soils is determined largely by salinity. Anthropogenic disturbances on the Mahakam site have resulted in eutrophication, which is related to lower levels of foliar and soil base cations. Path analysis suggests that increasing soil nitrogen reduces soil pH, which in turn reduces the levels of foliar and soil base cations in mangroves.

  16. An infrared image based methodology for breast lesions screening

    Science.gov (United States)

    Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.

    2016-05-01

    The objective of this paper is to evaluate the potential of utilizing a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the infrared camera direct temperature measurement imprecision, by calculating the temperature difference between two points to cancel out the error. The method takes into account the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values (Δθ̄) between two symmetric regions of the patient's breasts; the value takes into account the breast region, the surrounding ambient and the individual core temperatures, so that the interpretation of results for different individuals becomes simple and non-subjective. The range of normal whole breast average dimensionless temperature differences for 101 healthy individuals was determined, and assuming that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was considered to be the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements, Δθ̄ ± 2σ(Δθ̄), in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective
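
    A hedged sketch of the screening decision (made-up numbers, and an assumed normalisation θ = (T − T_ambient)/(T_core − T_ambient), since the paper's exact definition is not reproduced here): compute the mean dimensionless temperature difference between symmetric regions and flag it when it falls outside the control group's mean ± 2σ range.

```python
import numpy as np

def dimensionless_theta(T, T_ambient, T_core):
    # Assumed normalisation; the paper's exact definition may differ.
    return (np.asarray(T) - T_ambient) / (T_core - T_ambient)

def breast_delta_theta(T_left, T_right, T_ambient, T_core):
    """Mean dimensionless temperature difference between symmetric regions."""
    theta_l = dimensionless_theta(T_left, T_ambient, T_core)
    theta_r = dimensionless_theta(T_right, T_ambient, T_core)
    return float(np.mean(theta_l - theta_r))

# Illustrative screening decision against a normality range mean +/- 2*sigma
# estimated from a healthy control group (all numbers below are made up).
normal_mean, normal_sigma = 0.00, 0.02
patient = breast_delta_theta(T_left=[33.9, 34.1, 34.0], T_right=[33.1, 33.2, 33.0],
                             T_ambient=23.0, T_core=36.8)
flagged = abs(patient - normal_mean) > 2 * normal_sigma
print("delta_theta = %.3f -> %s" % (patient, "outside normal range, refer" if flagged else "within normal range"))
```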

  17. Bone drilling methodology and tool based on position measurements.

    Science.gov (United States)

    Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos

    2013-11-01

    Bone drilling, despite being a very common procedure in hospitals around the world, becomes very challenging when performed close to organs such as the cochlea or when depth control is critical for avoiding damage to surrounding tissue. To date, several mechatronic prototypes have been proposed to assist surgeons by automatically detecting bone layer transitions and breakthroughs. However, none of them is currently accurate enough to be part of the surgeon's standard equipment. The present paper shows a test bench specially designed to evaluate prior methodologies and analyze their drawbacks. Afterward, a new layer detection methodology with improved performance is described and tested. Finally, the prototype of a portable mechatronic bone drill that takes advantage of the proposed detection algorithm is presented.

  18. AONBench: A Methodology for Benchmarking XML Based Service Oriented Applications

    Directory of Open Access Journals (Sweden)

    Abdul Waheed

    2007-09-01

    Full Text Available Service Oriented Architectures (SOA) and applications increasingly rely on network infrastructure instead of back-end servers. Cisco Systems' Application Oriented Networking (AON) initiative exemplifies this trend. Benchmarking such infrastructure and its services is expected to play an important role in the networking industry. We present the AONBench specification and methodology to benchmark networked XML application servers and appliances. AONBench is not a benchmarking tool. It is a specification and methodology for performance measurements, which leverages existing XML microbenchmarks and uses HTTP for end-to-end communication. We implement the AONBench specifications for end-to-end performance measurements through the public-domain HTTP load generation tool ApacheBench and the Apache web server. We present three case studies of using AONBench for architecting real application oriented networking products.

  19. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  20. Evaluating an Inquiry-based Bioinformatics Course Using Q Methodology

    Science.gov (United States)

    Ramlo, Susan E.; McConnell, David; Duan, Zhong-Hui; Moore, Francisco B.

    2008-06-01

    Faculty at a Midwestern metropolitan public university recently developed a course on bioinformatics that emphasized collaboration and inquiry. Bioinformatics, essentially the application of computational tools to biological data, is inherently interdisciplinary. Thus part of the challenge of creating this course was serving the needs and backgrounds of a diverse set of students, predominantly computer science and biology undergraduate and graduate students. Although the researchers desired to investigate student views of the course, they were interested in the potentially different perspectives. Q methodology, a measure of subjectivity, allowed the researchers to determine the various student perspectives in the bioinformatics course.

  1. A Nuclear Reactor Transient Methodology Based on Discrete Ordinates Method

    Directory of Open Access Journals (Sweden)

    Shun Zhang

    2014-01-01

    Full Text Available With the rapid development of the nuclear power industry, simulating and analyzing reactor transients are of great significance for nuclear safety. The traditional diffusion theory is not suitable for small-volume or strong-absorption problems. In this paper, we have studied the application of the discrete ordinates method to the numerical solution of the space-time kinetics equations. Fully implicit time integration was applied and the precursor equations were solved analytically. In order to improve the efficiency of the transport calculation, we also adopted some advanced acceleration methods. Numerical results presented for the TWIGL benchmark problem demonstrate the accuracy and efficiency of this methodology.
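
    The paper's solver is a discrete ordinates space-time kinetics code; as a zero-dimensional stand-in, the sketch below applies the same fully implicit (backward Euler) time integration to point kinetics with one delayed-neutron precursor group (illustrative parameters, not the TWIGL benchmark).

```python
import numpy as np

# Point-kinetics analogue of the fully implicit scheme: the coupled
# flux/precursor system dx/dt = A x is advanced with backward Euler,
# (I - dt*A) x_{k+1} = x_k, which stays stable for stiff reactivity transients.
beta, lam, Lambda = 0.0065, 0.08, 1e-4     # illustrative kinetics parameters
rho = 0.0020                               # step reactivity insertion [dk/k]

A = np.array([[(rho - beta) / Lambda, lam],
              [beta / Lambda,        -lam]])
x = np.array([1.0, beta / (lam * Lambda)]) # equilibrium initial condition

dt, n_steps = 1e-3, 2000
M = np.eye(2) - dt * A                     # constant here, so factor once
for _ in range(n_steps):
    x = np.linalg.solve(M, x)

print("relative power after %.1f s: %.2f" % (dt * n_steps, x[0]))
```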

  2. A CWT-based methodology for piston slap experimental characterization

    Science.gov (United States)

    Buzzoni, M.; Mucchi, E.; Dalpiaz, G.

    2017-03-01

    Noise and vibration control in mechanical systems has become ever more significant for the automotive industry, where the comfort of the passenger compartment represents a challenging issue for car manufacturers. The reduction of piston slap noise is pivotal for a good design of IC engines. In this scenario, a methodology has been developed for the vibro-acoustic assessment of IC diesel engines by means of design changes in piston to cylinder bore clearance. Vibration signals have been analysed by means of advanced signal processing techniques taking advantage of cyclostationarity theory. The procedure starts from the analysis of the Continuous Wavelet Transform (CWT) in order to identify a frequency band representative of the piston slap phenomenon. This frequency band is then used as the input for further signal processing, which involves envelope analysis of the second-order cyclostationary component of the signal. The second-order harmonic component has been used as the benchmark parameter of piston slap noise. An experimental procedure of vibrational benchmarking is proposed and verified at different operational conditions in real IC engines actually fitted to cars. This study clearly underlines the crucial role of transducer positioning when differences among real piston-to-cylinder clearances are considered. In particular, the proposed methodology is effective for the sensors placed on the outer cylinder wall in all the tested conditions.
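
    A hedged sketch of the two signal-processing steps named above (CWT band identification, then envelope analysis), run on a synthetic impact-like signal rather than engine data; the wavelet choice and scale range are assumptions.

```python
import numpy as np
import pywt
from scipy.signal import hilbert

fs = 20000.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

# Toy vibration signal (not engine data): broadband noise plus short bursts
# around 3 kHz repeating once per "cycle", mimicking a slap-like transient.
signal = 0.3 * rng.normal(size=t.size)
for t0 in np.arange(0.05, 1.0, 0.1):
    idx = (t > t0) & (t < t0 + 0.005)
    signal[idx] += np.sin(2 * np.pi * 3000 * t[idx]) * np.hanning(idx.sum())

# CWT with a Morlet wavelet to locate the band where transient energy
# concentrates, as a stand-in for the band-identification step.
scales = np.arange(2, 64)
coef, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
band = np.argmax(np.abs(coef).max(axis=1))
print("dominant band around %.0f Hz" % freqs[band])

# Envelope of the selected band (via the analytic signal) as a simple proxy
# for the envelope-analysis step applied to that band.
envelope = np.abs(hilbert(np.real(coef[band])))
print("envelope peak / mean ratio: %.1f" % (envelope.max() / envelope.mean()))
```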

  3. The effect of acute tyrosine phenylalanine depletion on emotion-based decision-making in healthy adults.

    Science.gov (United States)

    Vrshek-Schallhorn, Suzanne; Wahlstrom, Dustin; White, Tonya; Luciana, Monica

    2013-04-01

    Despite interest in dopamine's role in emotion-based decision-making, few reports of the effects of dopamine manipulations are available in this area in humans. This study investigates dopamine's role in emotion-based decision-making through a common measure of this construct, the Iowa Gambling Task (IGT), using Acute Tyrosine Phenylalanine Depletion (ATPD). In a between-subjects design, 40 healthy adults were randomized to receive either an ATPD beverage or a balanced amino acid beverage (a control) prior to completing the IGT, as well as pre- and post-manipulation blood draws for the neurohormone prolactin. Together with conventional IGT performance metrics, choice selections and response latencies were examined separately for good and bad choices before and after several key punishment events. Changes in response latencies were also used to predict total task performance. Prolactin levels increased significantly in the ATPD group but not in the control group. However, no significant group differences in performance metrics were detected, nor were there sex differences in outcome measures. However, the balanced group's bad deck latencies speeded up across the task, while the ATPD group's latencies remained adaptively hesitant. Additionally, modulation of latencies to the bad decks predicted total score for the ATPD group only. One interpretation is that ATPD subtly attenuated reward salience and altered the approach by which individuals achieved successful performance, without resulting in frank group differences in task performance.

  4. D2.1 - An EA Active, Problem Based Learning Methodology - EAtrain2

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne; Buus, Lillian

    This deliverable reports on the work undertaken in work package 2, with the key objective of developing a learning methodology for web 2.0-mediated Enterprise Architecture (EA) learning, building on a problem-based learning (PBL) approach. The deliverable reports not only on the methodology but also

  5. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    A systematic methodology for the screening of ionic liquids (ILs) as entrainers and for the design of ILs-based separation processes in various homogeneous binary azeotropic mixtures has been developed. The methodology focuses on the homogeneous binary aqueous azeotropic systems (for example, wat...

  6. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    Science.gov (United States)

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  7. Legal and methodological bases of comprehensive forensic enquiry of pornography

    Directory of Open Access Journals (Sweden)

    Berdnikov D.V.

    2016-03-01

    Full Text Available The article gives an analysis of the legal definition of pornography. The author identifies the groups of descriptive and target criteria required for the analysis, and examines the content of the descriptive criteria of pornography and the way they should be documented. Fixing attention on the anatomical and physiological characteristics of sexual relations is determined to be the necessary target criterion. It is noted that the term "pornography" is a legal one and cannot itself be the subject of expert examination. That is why the author outlines some methodological bases of a complex psycho-linguistic and psycho-art expert enquiry. The article presents the general issues on which the expert conclusion depends, studies cases where it is necessary to involve doctors in the research, and sets out criteria for the expert's opinion. In addition, the author defines the subject, object and main tasks of psychological studies of pornographic information.

  8. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    Fiergolski, Adrian

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  9. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios;

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one...

  10. Drag &Drop, Mixed-Methodology-based Lab-on-Chip Design Optimization Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective is to develop a "mixed-methodology", drag-and-drop, component-library (fluidic-lego)-based system design and optimization tool for complex...

  11. Efficacy of a rituximab regimen based on B cell depletion in thrombotic thrombocytopenic purpura with suboptimal response to standard treatment: Results of a phase II, multicenter noncomparative study.

    Science.gov (United States)

    Benhamou, Ygal; Paintaud, Gilles; Azoulay, Elie; Poullin, Pascale; Galicier, Lionel; Desvignes, Céline; Baudel, Jean-Luc; Peltier, Julie; Mira, Jean-Paul; Pène, Frédéric; Presne, Claire; Saheb, Samir; Deligny, Christophe; Rousseau, Alexandra; Féger, Frédéric; Veyradier, Agnès; Coppo, Paul

    2016-12-01

    The standard four-infusion rituximab treatment in acquired thrombotic thrombocytopenic purpura (TTP) remains empirical. Peripheral B cell depletion is correlated with the decrease in serum concentrations of anti-ADAMTS13 and associated with clinical response. To assess the efficacy of a rituximab regimen based on B cell depletion, 24 TTP patients were enrolled in this prospective multicentre single arm phase II study and then compared to patients from a previous study. Patients with a suboptimal response to a plasma exchange-based regimen received two infusions of rituximab 375 mg m(-2) within 4 days, and a third dose at day +15 of the first infusion if peripheral B cells were still detectable. The primary endpoint was the time required for platelet count recovery from the first plasma exchange. Three patients died after the first rituximab administration. In the remaining patients, the B cell-driven treatment hastened remission and ADAMTS13 activity recovery as a result of rapid anti-ADAMTS13 depletion, in a similar manner to the standard four-infusion schedule. The 1-year relapse-free survival was also comparable between both groups. A rituximab regimen based on B cell depletion is feasible and provides results comparable to the four-infusion schedule. This regimen could represent a new standard in TTP. This trial was registered at www.clinicaltrials.gov (NCT00907751). Am. J. Hematol. 91:1246-1251, 2016. © 2016 Wiley Periodicals, Inc.

  12. Generic Methodology for Field Calibration of Nacelle-Based Wind Lidars

    OpenAIRE

    Antoine Borraccino; Michael Courtney; Rozenn Wagner

    2016-01-01

    Nacelle-based Doppler wind lidars have shown promising capabilities to assess power performance, detect yaw misalignment or perform feed-forward control. The power curve application requires uncertainty assessment. Traceable measurements and uncertainties of nacelle-based wind lidars can be obtained through a methodology applicable to any type of existing and upcoming nacelle lidar technology. The generic methodology consists in calibrating all the inputs of the wind field reconstruction algo...

  13. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    Science.gov (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  14. A methodology to assess the contribution of biorefineries to a sustainable bio-based economy

    Energy Technology Data Exchange (ETDEWEB)

    Maga, Daniel

    2015-07-01

    Within this thesis, an integrative methodology to assess the sustainability of biorefineries and bio-based products has been developed for the first time, based on a fundamental understanding of sustainability as presented in the Brundtland report. The applied integrative concept of sustainability, as developed by the Institute for Technology Assessment and Systems Analysis (ITAS), overcomes the widespread thinking in three pillars of sustainability and opens up new perspectives. The methodology developed addresses innovative life cycle assessment evaluation methods at the midpoint level as well as at the area-of-protection level, and adopts state-of-the-art assessment procedures, e.g. to determine water deprivation. It goes far beyond the scope of conventional LCA studies and examines effects on human health, on the environment, on the development of knowledge and physical capital, and on regional development and acceptance. In order to validate the developed method, it was applied to an algae biorefinery currently under development and construction in the south of Spain. For this assessment, extensive process data was collected for the first time from a real algae biorefinery which uses municipal waste water as a culture medium for microalgae. The use of waste water reduces the demand for fresh water and avoids additional fertilisation of the microalgae. Moreover, the analysed algae biorefinery replaces conventional waste water treatment by a biological purification and produces biogas by an anaerobic pretreatment of the waste water as well as by anaerobic digestion of the algae. After several purification steps the biogas can be used as automotive fuel and thus contributes to further development and increased use of biofuels. On the one hand, the sustainability assessment shows that this way of waste water treatment contributes to climate protection and to the conservation of fossil energy carriers. On the other hand, approximately ten times more land is needed and twenty times

  15. ICT-Based, Cross-Cultural Communication: A Methodological Perspective

    Science.gov (United States)

    Larsen, Niels; Bruselius-Jensen, Maria; Danielsen, Dina; Nyamai, Rachael; Otiende, James; Aagaard-Hansen, Jens

    2014-01-01

    The article discusses how cross-cultural communication based on information and communication technologies (ICT) may be used in participatory health promotion as well as in education in general. The analysis draws on experiences from a health education research project with grade 6 (approx. 12 years) pupils in Nairobi (Kenya) and Copenhagen…

  16. Knowledge-based methodology in pattern recognition and understanding

    OpenAIRE

    Haton, Jean-Paul

    1987-01-01

    The interpretation and understanding of complex patterns (e.g. speech, images or other kinds of mono- or multi-dimensional signals) is related both to pattern recognition and artificial intelligence since it necessitates numerical processing as well as symbolic knowledge-based reasoning techniques. This paper presents a state-of-the-art in the field, including basic concepts and practical applications.

  17. Simulation Based Performance Assessment : Methodology and case study

    NARCIS (Netherlands)

    Kraker, J.K. de; Noordkamp, H.W.; Fitski, H.J.; Barros, A.I

    2009-01-01

    During the various phases of the Defense acquisition process, and in the early design phases, many decisions must be made concerning the performance and cost of the new equipment. Often many of these decisions are made while having only a limited view of their consequences or based on subjective information.

  18. Design Based Research Methodology for Teaching with Technology in English

    Science.gov (United States)

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  19. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

    Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain with changing data and user views. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.

  20. Simulation-Based Methodologies for Global Optimization and Planning

    Science.gov (United States)

    2013-10-11

    Simulation optimization problems arising in supply chain management, path planning for unmanned aerial vehicles, and finance are considered, including the estimation of sensitivities such as the Greeks on a basket of stocks when Monte Carlo simulation is employed, with a focus on a class of derivatives called mountain range options. Keywords: simulation-based optimization, global optimization, Markov Decision Processes, risk.

  1. Transcriptome-based identification of pro- and antioxidative gene expression in kidney cortex of nitric oxide-depleted rats

    NARCIS (Netherlands)

    Wesseling, Sebastiaan; Joles, Jaap A.; van Goor, Harry; Bluyssen, Hans A.; Kemmeren, Patrick; Holstege, Frank C.; Koomans, Hein A.; Braam, Branko

    2007-01-01

    Nitric oxide (NO) depletion in rats induces severe endothelial dysfunction within 4 days. Subsequently, hypertension and renal injury develop, which are ameliorated by alpha-tocopherol (VitE) cotreatment. The hypothesis of the present study was that NO synthase (NOS) inhibition induces a renal corti

  2. CRDIAC: Coupled Reactor Depletion Instrument with Automated Control

    Energy Technology Data Exchange (ETDEWEB)

    Steven K. Logan

    2012-08-01

    When modeling the behavior of a nuclear reactor over time, it is important to understand how the isotopes in the reactor will change, or transmute, over that time. This is especially important in the reactor fuel itself. Many nuclear physics modeling codes model how particles interact in the system, but do not model this over time. Thus, another code is used in conjunction with the nuclear physics code to accomplish this. In our code, Monte Carlo N-Particle (MCNP) codes and the Multi Reactor Transmutation Analysis Utility (MRTAU) were chosen as the codes to use. In this way, MCNP would produce the reaction rates in the different isotopes present and MRTAU would use cross sections generated from these reaction rates to determine how the mass of each isotope is lost or gained. Between these two codes, the information must be altered and edited for use. For this, a Python 2.7 script was developed to aid the user in getting the information in the correct forms. This newly developed methodology was called the Coupled Reactor Depletion Instrument with Automated Controls (CRDIAC). As is the case in any newly developed methodology for modeling of physical phenomena, CRDIAC needed to be verified against similar methodology and validated against data taken from an experiment, in our case AFIP-3. AFIP-3 was a reduced enrichment plate type fuel tested in the ATR. We verified our methodology against the MCNP Coupled with ORIGEN2 (MCWO) method and validated our work against the Post Irradiation Examination (PIE) data. When compared to MCWO, the difference in concentration of U-235 throughout Cycle 144A was about 1%. When compared to the PIE data, the average bias for end of life U-235 concentration was about 2%. These results from CRDIAC therefore agree with the MCWO and PIE data, validating and verifying CRDIAC. CRDIAC provides an alternative to using ORIGEN-based methodology, which is useful because CRDIAC's depletion code, MRTAU, uses every available isotope in its
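    The coupling loop described in this record — a transport code supplying reaction rates that a depletion solver uses to update isotope inventories, with a script passing data between the two — can be illustrated with a small, self-contained sketch. The stub transport step, the one-group cross sections, and the absence of transmutation chains below are simplifying assumptions made for illustration; they are not the actual CRDIAC/MCNP/MRTAU implementation.

```python
# Minimal, self-contained sketch of the transport/depletion coupling idea behind CRDIAC.
# The "transport" step is a stub returning synthetic one-group reaction rates; a real
# workflow would run MCNP, parse its output, and hand the rates to a depletion code.
import math

def transport_stub(number_densities):
    """Stand-in for a transport run: return absorption reaction rates (1/s) per isotope."""
    flux = 1.0e14                       # assumed constant one-group flux, n/cm^2/s
    sigma_a = {"U235": 600e-24, "U238": 2.7e-24, "Pu239": 1000e-24}  # cm^2, illustrative
    return {iso: flux * sigma_a.get(iso, 0.0) for iso in number_densities}

def deplete(number_densities, reaction_rates, dt):
    """Exponential burn-out of each isotope over one step (no build-up chains modelled)."""
    return {iso: n * math.exp(-reaction_rates[iso] * dt)
            for iso, n in number_densities.items()}

densities = {"U235": 1.0e21, "U238": 2.2e22, "Pu239": 0.0}   # atoms/cm^3, illustrative
for step in range(10):                  # ten depletion steps of roughly 30 days each
    rates = transport_stub(densities)
    densities = deplete(densities, rates, dt=30 * 86400.0)
print(densities)
```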

  3. Publishing FAIR Data: an exemplar methodology utilizing PHI-base

    Directory of Open Access Journals (Sweden)

    Alejandro eRodríguez Iglesias

    2016-05-01

    Full Text Available Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species versus the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be FAIR - Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences - the Pathogen-Host Interaction Database (PHI-base - to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  4. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G.; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E.; Wilkinson, Mark D.

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be “FAIR”—Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences—the Pathogen-Host Interaction Database (PHI-base)—to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings. PMID:27433158

  5. Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.

    Science.gov (United States)

    Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D

    2016-01-01

    Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.

  6. JOB SHOP METHODOLOGY BASED ON AN ANT COLONY

    Directory of Open Access Journals (Sweden)

    OMAR CASTRILLON

    2009-01-01

    Full Text Available The purpose of this study is to reduce the total process time (Makespan) and to increase the machines' working time, in a job shop environment, using a heuristic based on ant colony optimization. This work is developed in two phases: the first stage describes the identification and definition of heuristics for the sequential processes in the job shop. The second stage shows the effectiveness of the system in the traditional programming of production. A good solution, with 99% efficiency, is found using this technique.
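    The record above describes an ant-colony heuristic for job-shop scheduling. The sketch below applies the same pheromone-guided construction idea to a small permutation flow-shop instance, used here as a simplified stand-in for the full job-shop problem; the instance data, parameters, and update rule are illustrative assumptions, not the authors' algorithm.

```python
# Tiny ant-colony search for a permutation flow shop: ants build job orders guided by
# pheromone, and the best order found minimises the makespan. Data are invented.
import random

PROC = [[5, 3, 4], [2, 6, 3], [4, 2, 5], [3, 4, 2]]   # processing times: job x machine
N_JOBS, N_MACH = len(PROC), len(PROC[0])

def makespan(order):
    finish = [0.0] * N_MACH
    for j in order:
        for m in range(N_MACH):
            start = max(finish[m], finish[m - 1] if m else 0.0)
            finish[m] = start + PROC[j][m]
    return finish[-1]

def build_order(tau, alpha=1.0):
    remaining, order, prev = list(range(N_JOBS)), [], N_JOBS   # N_JOBS = virtual start row
    while remaining:
        weights = [tau[prev][j] ** alpha for j in remaining]
        j = random.choices(remaining, weights=weights)[0]
        order.append(j); remaining.remove(j); prev = j
    return order

def aco(n_ants=10, n_iter=50, rho=0.1, q=10.0):
    tau = [[1.0] * N_JOBS for _ in range(N_JOBS + 1)]   # pheromone incl. start row
    best_order, best_cost = None, float("inf")
    for _ in range(n_iter):
        for order in (build_order(tau) for _ in range(n_ants)):
            cost = makespan(order)
            if cost < best_cost:
                best_order, best_cost = order, cost
        for row in tau:                                  # evaporation
            for j in range(N_JOBS):
                row[j] *= (1.0 - rho)
        prev = N_JOBS
        for j in best_order:                             # deposit on the best order
            tau[prev][j] += q / best_cost
            prev = j
    return best_order, best_cost

print(aco())
```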

  7. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. … Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions…

  8. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. … Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions…

  9. Evaluation methodology for query-based scene understanding systems

    Science.gov (United States)

    Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.

    2015-05-01

    In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.
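    The Bayesian experiment design idea mentioned above — selecting the next query so as to learn the most about a performance model of the system under test — can be sketched with a simple Beta-Bernoulli model, choosing the query category whose outcome is expected to shrink posterior uncertainty the most. The categories, pseudo-counts, and the use of variance as the uncertainty measure are illustrative assumptions, not the MSEE evaluation design.

```python
# Pick the next test query category by expected reduction in posterior variance of a
# Beta-Bernoulli success-probability model. Categories and counts are invented examples.

def beta_var(a, b):
    return a * b / ((a + b) ** 2 * (a + b + 1))

def expected_posterior_var(a, b):
    p = a / (a + b)                       # predictive probability of a correct answer
    return p * beta_var(a + 1, b) + (1 - p) * beta_var(a, b + 1)

# (successes + 1, failures + 1) pseudo-counts per query category
posteriors = {"detect": (9, 3), "track": (2, 2), "count": (1, 6)}

def next_query(posteriors):
    gains = {c: beta_var(a, b) - expected_posterior_var(a, b)
             for c, (a, b) in posteriors.items()}
    return max(gains, key=gains.get)

print(next_query(posteriors))             # picks the most informative category to test next
```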

  10. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    Offshoring, understood as the geographical relocation of companies’ activities to another country, is a key feature of contemporary globalization and has growing social, political and economic implications. However, the phenomenon remains poorly examined at aggregated macro levels such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  11. Action orientation overcomes the ego depletion effect.

    Science.gov (United States)

    Dang, Junhua; Xiao, Shanshan; Shi, Yucai; Mao, Lihua

    2015-04-01

    It has been consistently demonstrated that initial exertion of self-control has a negative influence on people's performance on subsequent self-control tasks. This phenomenon is referred to as the ego depletion effect. Based on action control theory, the current research investigated whether the ego depletion effect could be moderated by individuals' action versus state orientation. Our results showed that only state-oriented individuals exhibited ego depletion. For individuals with action orientation, however, their performance was not influenced by initial exertion of self-control. The beneficial effect of action orientation against ego depletion in our experiment results from its facilitation of adaptation to the depleting task.

  12. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  13. Measure of Landscape Heterogeneity by Agent-Based Methodology

    Science.gov (United States)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. This is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered as the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
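    A toy version of the scout-agent idea described above is sketched below: random walkers sample land-cover classes on a synthetic grid and the Shannon diversity of their pooled observations is reported as a simple "landscape potential". The grid, number of agents, and walk length are invented for illustration and do not reproduce the authors' model.

```python
# Scout agents walk a synthetic land-cover grid; the Shannon entropy of the observed
# cover classes is used as a crude diversity ("landscape potential") measure.
import math, random

random.seed(1)
GRID = [[random.choice("ABCD") for _ in range(50)] for _ in range(50)]  # 4 cover types

def scout(steps=200):
    """One agent walks randomly and records the land-cover classes it visits."""
    r, c = random.randrange(50), random.randrange(50)
    seen = []
    for _ in range(steps):
        seen.append(GRID[r][c])
        dr, dc = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        r, c = (r + dr) % 50, (c + dc) % 50
    return seen

def shannon(observations):
    total = len(observations)
    return -sum((observations.count(k) / total) * math.log(observations.count(k) / total)
                for k in set(observations))

samples = [obs for _ in range(25) for obs in scout()]   # pool 25 agents' observations
print(f"landscape potential (Shannon diversity): {shannon(samples):.3f}")
```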

  14. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    Science.gov (United States)

    2013-03-01

    [Abstract not available: only front-matter residue was captured. The report's table of contents indicates coverage of memristor synapse circuits (including a ten-synapse circuit) and a case study of synapse-based ALU design, with a 1-bit adder, an adder-subtractor block, a 4-bit binary counter, and 4-bit and 8-bit ALUs operating as adder, subtractor, and decade counter.]

  15. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

    Full Text Available The authors have solved topical scientific problems in the article: 1) the research base for constructing the theoretical and methodological foundations of sports training based on taekwondo has been analysed; 2) the organizational and methodological requirements for taekwondo training sessions have been researched; 3) the necessity of linking the processes of natural development and adaptation to physical activity of young taekwondo sportsmen has been grounded; 4) the necessity of a scientific basis for building young fighters' training loads in microcycles, based on their individualization, has been proved.

  16. A study on development of methodology and guidelines of risk based regulation for optimization of regulation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Hoon; Chang, Soon Hong; Kang, Kee Sik; Paek, Won Phil; Kim, Han Gon; Chang, Hyeon Seop [Korea Association for Nuclear Technology, Taejon (Korea, Republic of)

    1995-12-15

    This project consists of three phases, as follows: define the RBR (Risk Based Regulation) concept and analyse the state of the art of RBR at the NRC, EPRI, etc.; develop the application areas and guidelines of RBR for the selected areas; and develop the regulatory guideline considering plant-specific conditions in detail. For the first year of this study, elementary work on the risk based regulation concept was analysed and performed as follows: review of the state of the art of RBR research, methodology establishment for the usage of PSA (Probabilistic Safety Assessment), establishment of the methodology of risk based regulation, and identification of application areas for RBR.

  17. The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.

    Science.gov (United States)

    Schubert, Christian R; Stultz, Collin M

    2009-08-01

    Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.

  18. Towards a self-adaptive service-oriented methodology based on extended SOMA

    Institute of Scientific and Technical Information of China (English)

    Alireza PARVIZI-MOSAED‡; Shahrouz MOAVEN; Jafar HABIBI; Ghazaleh BEIGI; Mahdieh NASER-SHARIAT

    2015-01-01

    We propose a self-adaptive process (SAP) that maintains the software architecture quality using the MAPE-K standard model. The proposed process can be plugged into various software development processes and service-oriented methodologies due to its explicitly defined inputs and outputs. To this aim, the proposed SAP is integrated with the service-oriented modeling and application (SOMA) methodology in a two-layered structure to create a novel methodology, named self-adaptive service-oriented architecture methodology (SASOAM), which provides a semi-automatic self-aware method by the composition of architectural tactics. Moreover, the maintenance activity of SOMA is improved using architectural and adaptive patterns, which results in controlling the software architecture quality. The improvement in the maintainability of SOMA is demonstrated by an analytic hierarchy process (AHP)-based evaluation method. Furthermore, the proposed method is applied to a case study to demonstrate the feasibility and practicality of SASOAM.

  19. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC problem, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic-process knowledge to obtain candidate solutions that satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights to the solution of IPDC problems in chemical engineering practice.

  20. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    Directory of Open Access Journals (Sweden)

    Alexandre Tadeu Simon

    2015-01-01

    Full Text Available Despite the increasing interest in supply chain management (SCM by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert and Pagh’s original contribution and involves analysis of eleven referential axes established from key business processes, horizontal structures, and initiatives & practices. We analyze the applicability of the proposed model based on findings from interviews with experts - academics and practitioners - as well as from case studies of three focal firms and their supply chains. In general terms, the methodology can be considered a diagnostic instrument that allows companies to evaluate their maturity regarding SCM practices. From this diagnosis, firms can identify and implement activities to improve degree of adherence to the reference model and achieve SCM benefits. The methodology aims to contribute to SCM theory development. It is an initial, but structured, reference for translating a theoretical approach into practical aspects.

  1. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  2. Methodology for Web Services Adoption Based on Technology Adoption Theory and Business Process Analyses

    Institute of Scientific and Technical Information of China (English)

    AN Liping; YAN Jianyuan; TONG Lingyun

    2008-01-01

    Web services use an emerging service-oriented architecture for distributed computing. Many organizations are either in the process of adopting web services technology or evaluating this option for incorporation into their enterprise information architectures. Implementation of this new technology requires careful assessment of the needs and capabilities of an organization to formulate adoption strategies. This paper presents a methodology for web services adoption based on technology adoption theory and business process analyses. The methodology suggests that strategies, business areas, and functions within an organization should be considered based on the existing organizational information technology status during the process of adopting web services to support the business needs and requirements.

  3. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhen; LIU JingFang; ZENG DaXing

    2009-01-01

    It is well known that the traditional Grubler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grubler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.

  4. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for validity of our methodology.
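    For reference, the modified Grübler-Kutzbach criterion discussed in these two records is commonly written in the screw-theory literature in the following form (the exact symbols vary between publications):

```latex
% Modified Gruebler-Kutzbach criterion (symbols follow common screw-theory usage)
M = d\,(n - g - 1) + \sum_{i=1}^{g} f_i + v - \zeta
```

    Here M is the mobility, d is the order of the mechanism's motion-screw system (d = 6 − λ, with λ the number of common constraints), n the number of links, g the number of joints, f_i the freedom of joint i, v the number of redundant constraints, and ζ the number of local (passive) degrees of freedom.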

  5. Hybrid pn-junction solar cells based on layers of inorganic nanocrystals and organic semiconductors: optimization of layer thickness by considering the width of the depletion region.

    Science.gov (United States)

    Saha, Sudip K; Guchhait, Asim; Pal, Amlan J

    2014-03-07

    We report the formation and characterization of hybrid pn-junction solar cells based on a layer of copper diffused silver indium disulfide (AgInS2@Cu) nanoparticles and another layer of copper phthalocyanine (CuPc) molecules. With copper diffusion in the nanocrystals, their optical absorption and hence the activity of the hybrid pn-junction solar cells was extended towards the near-IR region. To decrease the particle-to-particle separation for improved carrier transport through the inorganic layer, we replaced the long-chain ligands of copper-diffused nanocrystals in each monolayer with short-ones. Under illumination, the hybrid pn-junctions yielded a higher short-circuit current as compared to the combined contribution of the Schottky junctions based on the components. A wider depletion region at the interface between the two active layers in the pn-junction device as compared to that of the Schottky junctions has been considered to analyze the results. Capacitance-voltage characteristics under a dark condition supported such a hypothesis. We also determined the width of the depletion region in the two layers separately so that a pn-junction could be formed with a tailored thickness of the two materials. Such a "fully-depleted" device resulted in an improved photovoltaic performance, primarily due to lessening of the internal resistance of the hybrid pn-junction solar cells.
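    As background to the "fully-depleted" design argument above, the depletion width of an abrupt pn-junction is usually estimated from the standard textbook expression implemented below; the permittivity, doping densities, and built-in potential in the snippet are placeholder values, not parameters of the reported AgInS2@Cu/CuPc device.

```python
# Textbook estimate of the depletion width of an abrupt pn-junction,
# W = sqrt(2*eps*(Vbi - V)*(Na + Nd)/(q*Na*Nd)); all numbers are placeholders,
# not measured parameters of the device in the paper.
import math

q = 1.602e-19            # elementary charge, C
eps0 = 8.854e-12         # vacuum permittivity, F/m
eps_r = 10.0             # assumed relative permittivity
Na, Nd = 1e23, 1e22      # acceptor/donor densities, m^-3 (placeholders)
Vbi, V = 0.8, 0.0        # built-in potential and applied bias, volts (placeholders)

W = math.sqrt(2 * eps_r * eps0 * (Vbi - V) * (Na + Nd) / (q * Na * Nd))
xn = W * Na / (Na + Nd)  # extent of the depletion region on the n side
xp = W * Nd / (Na + Nd)  # extent on the p side
print(f"W = {W*1e9:.1f} nm (n side {xn*1e9:.1f} nm, p side {xp*1e9:.1f} nm)")
```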

  6. A novel method to isolate protein N-terminal peptides from proteome samples using sulfydryl tagging and gold-nanoparticle-based depletion.

    Science.gov (United States)

    Li, Lanting; Wu, Runqing; Yan, Guoquan; Gao, Mingxia; Deng, Chunhui; Zhang, Xiangmin

    2016-01-01

    A novel method to isolate global N-termini using sulfydryl tagging and gold-nanoparticle-based depletion (STagAu method) is presented. The N-terminal and lysine amino groups were first completely dimethylated at the protein level, after which the proteins were digested. The newly generated internal peptides were tagged with sulfydryl by Traut's reagent through digested N-terminal amines in yields of 96%. The resulting sulfydryl peptides were depleted through binding onto nano gold composite materials. The Au-S bond is stable and widely used in materials science. Nano gold composite materials showed nearly complete depletion of sulfydryl peptides. A set of the acetylated and dimethylated N-terminal peptides were analyzed by liquid chromatography-tandem mass spectrometry. This method was demonstrated to be an efficient N-terminus enrichment method because of the use of an effective derivatization reaction, in combination with robust and relative easy to implement Au-S coupling. We identified 632 N-terminal peptides from 386 proteins in a mouse liver sample. The STagAu approach presented is therefore a facile and efficient method for mass-spectrometry-based analysis of proteome N-termini or protease-generated cleavage products.

  7. Synthesis of Schiff Bases via Environmentally Benign and Energy-Efficient Greener Methodologies

    Directory of Open Access Journals (Sweden)

    Arshi Naqvi

    2009-01-01

    Full Text Available Non-classical methods (water-based reaction, microwave and grindstone chemistry) were used for the preparation of Schiff bases from 3-chloro-4-fluoro aniline and several benzaldehydes. The key raw materials were allowed to react in water, under microwave irradiation and by grinding. These methodologies constitute an energy-efficient and environmentally benign greener chemistry version of the classical condensation reactions for Schiff base formation.

  8. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  9. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  10. Future Directions in Adventure-based Therapy Research: Methodological Considerations and Design Suggestions.

    Science.gov (United States)

    Newes, Sandra L.

    2001-01-01

    More methodologically sound research in adventure therapy is needed if the field is to claim empirically-based efficacy as a treatment modality. Some considerations for conducting outcome studies in adventure therapy relate to standardization, multiple domain assessment, regression techniques, objective assessment of participant change, client and…

  11. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur;

    2011-01-01

    on experimental data deriving from sea trials rather than solely on numerical predictions. The study applies this methodology to evaluate the performance of Wave Dragon at two locations in the North Sea, based on the data acquired during the sea trials of a 1:4.5 scale prototype. Indications about power...

  12. Project-based learning in organizations : towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; van der Krogt, F.J.

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article emphasi

  13. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  14. EA Training 2.0 Newsletter #3 - EA Active, Problem Based Learning Methodology

    DEFF Research Database (Denmark)

    Buus, Lillian; Ryberg, Thomas; Sroga, Magdalena

    2010-01-01

    The main products of the project are innovative, active problem-based learning methodology for EA education and training, EA courses for university students and private and public sector employees, and an Enterprise Architecture competence ontology including a complete specification of skills...

  15. The Change towards a Teaching Methodology Based on Competences: A Case Study in a Spanish University

    Science.gov (United States)

    Gonzalez, Jose Maria G.; Arquero Montaño, Jose Luis; Hassall, Trevor

    2014-01-01

    The European Higher Education Area (EHEA) has promoted the implementation of a teaching methodology based on competences. Drawing on New Institutional Sociology, the present work aims to identify and improve knowledge concerning the factors which are hindering that change in the Spanish university system. This is investigated using a case study…

  16. ACTIVITY-BASED PRINCIPLE IMPLEMENTATION IN THE METHODOLOGICAL CONCEPT OF N. E. JERGANOVA

    Directory of Open Access Journals (Sweden)

    N. K. Chapaev

    2016-01-01

    Full Text Available The aim of the publication is a philosophical-pedagogical conceptualization and the discovery of the mechanism of application of the activity-based principle while solving problems of program-methodological support of vocational education, on the basis of summarizing the scientific heritage of N. E. Jerganova. Methods. The research is based on hermeneutic methodology. The leading methods of analysis are the following: interpretation, conceptualization, understanding, as well as analysis, comparison, contrasting, synthesis, and generalization. Results. Based on the analysis of the texts of the works of N. E. Jerganova, an interpretational model of actualizing the activity-based principle in developing methods of professional education is constructed. According to this model, the personal and professional development of a person is attained primarily through the actualization of his or her intellectual and creative potential, turning the learner into a relatively self-sufficient entity of educational activity. Scientific novelty. The philosophical-methodological foundations and technological tools for applying the activity-based principle in the process of solving problems of program-methodological support of professional education are disclosed; the expediency of developing an activity-based strategy of professional education in modern conditions is proved. Practical significance. The theses and conclusions of the article can be used in the development of the modern concept of methodological activities and in the process of program-methodological support of educational activity in professional education organizations.

  17. A Compensation-Based Optimization Methodology for Gain-Boosted OPAMP

    Science.gov (United States)

    2004-05-14

    A gain-boosted OPAMP design methodology is presented. The methodology provides a systematic way of gain-boosted OPAMP optimization in terms of AC response and settling performance. The evolution of the major poles and zeros of the gain-boosted OPAMP is

  18. A Novel HW/SW Based NoC Router Self-Testing Methodology

    OpenAIRE

    Nazari, Masoom; Lighvan, Mina Zolfy; Koozekonani, Ziaeddin Daie; Sadeghi, Ali

    2016-01-01

    Network-on-Chip (NoC) architecture has been proposed to solve the global communication problem of complex Systems-on-Chips (SoCs). However, NoC testing remains a challenging problem. In this article, we propose a novel test architecture for NoC router testing. The proposed test architecture combines the advantages of Software-Based Self-Testing (SBST) and Built-In Self-Testing (BIST) methodologies. In this methodology, we propose custom test instructions with regard to the ISA of the NoC processor...

  19. Brain-Based MRI Lie Detection Experiment Methodology

    Institute of Scientific and Technical Information of China (English)

    李文石; 张好; 胡清泉; 苏香; 郭亮

    2006-01-01

    The brain-based MRI lie detection experiment methodology is reviewed for the first time, including the magnetic resonance imaging paradigm, the double-block design, the equidistance hit-ball, and the test mechanics. This paper illustrates the research results of 3D MRI lie detection and the contrastive experiment of otopoint-mapping brain signature lie detection, and restates the Lie-Truth Law (PT/PL ≤ 0.618) obtained from statistics of MRI reports worldwide. The conclusion points out the essence of this technology, its advantages and disadvantages, and the evolution of this methodology.

  20. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
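    The identification approach summarised above — exciting the excitation system with a pseudo random binary signal and estimating model parameters from the recorded signals — can be illustrated with a minimal sketch. The first-order discrete-time model, noiseless data, and plain least-squares estimator below are illustrative assumptions, not the standard excitation-system model and the algorithms compared in the paper.

```python
# Excite a toy first-order model y[k+1] = a*y[k] + b*u[k] with a PRBS-like input and
# recover (a, b) by least squares; parameters and signal length are illustrative.
import random
random.seed(0)

a_true, b_true = 0.9, 0.5
u = [random.choice([-1.0, 1.0]) for _ in range(200)]   # PRBS-like binary input
y = [0.0]
for k in range(len(u) - 1):
    y.append(a_true * y[k] + b_true * u[k])

# normal equations for the regression y[k+1] = a*y[k] + b*u[k]
Syy = sum(y[k] ** 2 for k in range(len(u) - 1))
Suu = sum(u[k] ** 2 for k in range(len(u) - 1))
Syu = sum(y[k] * u[k] for k in range(len(u) - 1))
Sy1y = sum(y[k + 1] * y[k] for k in range(len(u) - 1))
Sy1u = sum(y[k + 1] * u[k] for k in range(len(u) - 1))
det = Syy * Suu - Syu ** 2
a_hat = (Sy1y * Suu - Sy1u * Syu) / det
b_hat = (Sy1u * Syy - Sy1y * Syu) / det
print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
```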

  1. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS) based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation of the ERPS, and the control and cost reduction for the installation, in terms of time, manpower, technological equipment and other resources.

  2. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2009-12-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS) based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation of the ERPS, and the control and cost reduction for the installation, in terms of time, manpower, technological equipment and other resources.

  3. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    Science.gov (United States)

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  4. Application of the Simulation Based Reliability Analysis on the LBB methodology

    OpenAIRE

    Pečínka L.; Švrček M.

    2008-01-01

    Guidelines on how to demonstrate the existence of Leak Before Break (LBB) have been developed in many western countries. These guidelines, partly based on NUREG/CR-6765, define the steps that should be fulfilled to get a conservative assessment of LBB acceptability. As a complement and also to help identify the key parameters that influence the resulting leakage and failure probabilities, the application of Simulation Based Reliability Analysis is under development. The used methodology will ...

  5. A Study on Ductility of Prestressed Concrete Pier Based on Response Surface Methodology

    OpenAIRE

    Wang, H.; Zhang, Y; Qin, S.

    2016-01-01

    The ductility of prestressed concrete piers is studied based on response surface methodology. Referring to previous prestressed concrete piers, and based on a Box-Behnken design, the ductility of 25 prestressed concrete piers is calculated by a numerical method. The relationship between longitudinal reinforcement ratio, shear reinforcement ratio, prestressed tendon quantity, concrete compressive strength and ductility factor is obtained. The influence of the longitudinal reinforcement ratio, the shea...
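    The response-surface step described above can be sketched as fitting a second-order polynomial to a small set of designed runs and predicting the ductility factor at new factor settings. The two factors, design points, and responses below are invented for illustration; the study itself uses a Box-Behnken design over the four factors listed in the abstract.

```python
# Fit a quadratic response surface to a small designed experiment by least squares and
# predict the response at a new point. All data here are invented for illustration.
import numpy as np

# factors x1, x2 (e.g., coded reinforcement ratios) and a made-up measured response
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 1], [1, 0], [-1, 0], [0, -1]])
y = np.array([3.1, 3.8, 3.4, 4.6, 4.0, 3.9, 4.3, 3.5, 3.6])   # ductility factors (invented)

def design_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)   # quadratic surface coefficients
x_new = np.array([[0.5, 0.5]])
print("predicted ductility factor:", design_matrix(x_new) @ beta)
```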

  6. Instruments and Experiments Control Methodology Based on IVI and LXI Technologies

    OpenAIRE

    Unai Hernandez

    2011-01-01

    In this paper we present a new model to control the instruments and experiments in a remote laboratory. This model is based on LAN networks and a control methodology built on reusable drivers. The objective is to obtain a software control architecture independent of the hardware of the laboratory, so that each institution can use its own equipment and experiments according to its needs and with minimal restrictions regarding the hardware of the lab.

  7. A methodological approach based on indirect sampling to survey the homeless people

    OpenAIRE

    Claudia De Vitiis; Stefano Falorsi; Francesca Inglese; Alessandra Masi; Nicoletta Pannuzi; Monica Russo

    2014-01-01

    The Italian National Institute of Statistics carried out the first survey on the homeless population. The survey aims at estimating the unknown size and some demographic and social characteristics of this population. The methodological strategy used to investigate the homeless population could not follow the standard approaches of official statistics, usually based on the use of population lists. The sample strategy for the homeless survey refers to the theory of indirect sampling, based on the use of...

  8. 5.0. Depletion, activation, and spent fuel source terms

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation and relies on fundamental cross section and decay data in ENDF/B VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN—as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.
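    The nuclide tracking described above amounts to solving a large coupled system of first-order differential equations. In its generic textbook form (not a quotation from the SCALE/ORIGEN documentation), the balance equation for nuclide i can be written as:

```latex
% Generic depletion/activation balance for nuclide i (textbook form)
\frac{dN_i}{dt} = \sum_{j \ne i} \left( \lambda_{j \rightarrow i} + \sigma_{j \rightarrow i}\,\phi \right) N_j
                 - \left( \lambda_i + \sigma_{a,i}\,\phi \right) N_i
```

    Here N_i is the atom density of nuclide i, the λ terms are decay constants (with branching into i), the σ terms are one-group transmutation and absorption cross sections, and φ is the neutron flux.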

  9. Ozone Depletion Caused by Rocket Engine Emissions: A Fundamental Limit on the Scale and Viability of Space-Based Geoengineering Schemes

    Science.gov (United States)

    Ross, M. N.; Toohey, D.

    2008-12-01

    Emissions from solid and liquid propellant rocket engines reduce global stratospheric ozone levels. Currently ~ one kiloton of payloads are launched into earth orbit annually by the global space industry. Stratospheric ozone depletion from present day launches is a small fraction of the ~ 4% globally averaged ozone loss caused by halogen gases. Thus rocket engine emissions are currently considered a minor, if poorly understood, contributor to ozone depletion. Proposed space-based geoengineering projects designed to mitigate climate change would require order of magnitude increases in the amount of material launched into earth orbit. The increased launches would result in comparable increases in the global ozone depletion caused by rocket emissions. We estimate global ozone loss caused by three space-based geoengineering proposals to mitigate climate change: (1) mirrors, (2) sunshade, and (3) space-based solar power (SSP). The SSP concept does not directly engineer climate, but is touted as a mitigation strategy in that SSP would reduce CO2 emissions. We show that launching the mirrors or sunshade would cause global ozone loss between 2% and 20%. Ozone loss associated with an economically viable SSP system would be at least 0.4% and possibly as large as 3%. It is not clear which, if any, of these levels of ozone loss would be acceptable under the Montreal Protocol. The large uncertainties are mainly caused by a lack of data or validated models regarding liquid propellant rocket engine emissions. Our results offer four main conclusions. (1) The viability of space-based geoengineering schemes could well be undermined by the relatively large ozone depletion that would be caused by the required rocket launches. (2) Analysis of space- based geoengineering schemes should include the difficult tradeoff between the gain of long-term (~ decades) climate control and the loss of short-term (~ years) deep ozone loss. (3) The trade can be properly evaluated only if our

  10. Visualization of stratospheric ozone depletion and the polar vortex

    Science.gov (United States)

    Treinish, Lloyd A.

    1995-01-01

    Direct analysis of spacecraft observations of stratospheric ozone yields information about the morphology of annual austral depletion. Visual correlation of ozone with other atmospheric data illustrates the diurnal dynamics of the polar vortex and contributions from the upper troposphere, including the formation and breakup of the depletion region each spring. These data require care in their presentation to minimize the introduction of visualization artifacts that are erroneously interpreted as data features. Non geographically registered data of differing mesh structures can be visually correlated via cartographic warping of base geometries without interpolation. Because this approach is independent of the realization technique, it provides a framework for experimenting with many visualization strategies. This methodology preserves the fidelity of the original data sets in a coordinate system suitable for three-dimensional, dynamic examination of atmospheric phenomena.

  11. A Transformation-oriented Methodology to Knowledge-based Conceptual Data Warehouse Design

    Directory of Open Access Journals (Sweden)

    Opim S. Sitompul

    2006-01-01

    Full Text Available Applications of artificial intelligence (AI) technology in the form of knowledge-based systems within the context of database design have been extensively researched, particularly to provide support within the conceptual design phase. However, a similar approach to the task of data warehouse design has yet to be seriously initiated. In this paper, we propose a design methodology for conceptual data warehouse design called the transformation-oriented methodology, which transforms an Entity-Relationship (ER) model into a multidimensional model based on a series of transformation and analysis rules. The transformation-oriented methodology translates the ER model into a specification language model and transforms it into an initial problem domain model. A set of synthesis and diagnosis rules will then gradually transform the problem domain model into the multidimensional model. A prototype KB tool called the DWDesigner has been developed to implement the aforementioned methodology. The multidimensional model produced by the DWDesigner as output is presented in a graphical form for better visualization. Testing has been conducted on a number of design problems, such as university, business and hospital domains, and consistent results have been achieved.

  12. A new methodology to define homogeneous regions through an entropy based clustering method

    Science.gov (United States)

    Ridolfi, E.; Rianna, M.; Trani, G.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.; Russo, F.

    2016-10-01

    One of the most crucial steps in flow frequency studies is the definition of Homogeneous Regions (HRs), i.e. areas with similar hydrological behavior. This is essential in ungauged catchments, as HRs allow information to be transferred from a neighboring river basin. This study proposes a new, entropy-based approach to define HRs, in which regions are defined as homogeneous if their hydrometric stations capture redundant information. The problem is handled through the definition of the Information Transferred Index (ITI) as the ratio between redundant information and the total information provided by pairs of stations. The methodology is compared with a traditional, distance-based clustering method through a Monte Carlo experiment and a jack-knife procedure. Results indicate that the ITI-based method performs well, adding value to current methodologies to define HRs.
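
    The abstract defines the ITI as redundant information divided by total information for a pair of stations. A minimal sketch of one plausible reading of that ratio is given below, using histogram-based entropy estimates in which mutual information plays the role of redundant information and the joint entropy the role of total information; the bin count, estimator and synthetic streamflow data are illustrative assumptions, not the authors' exact formulation.

```python
# Sketch of an Information Transferred Index (ITI) between two gauging stations,
# assuming ITI = redundant information / total information computed from
# histogram-based entropy estimates. Bin counts and the estimator are assumptions.
import numpy as np

def entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def iti(x, y, bins=10):
    cxy, _, _ = np.histogram2d(x, y, bins=bins)
    h_x = entropy(cxy.sum(axis=1))
    h_y = entropy(cxy.sum(axis=0))
    h_xy = entropy(cxy.ravel())
    mutual_info = h_x + h_y - h_xy      # redundant information shared by the stations
    return mutual_info / h_xy           # fraction of the total (joint) information

# Example with synthetic, correlated "streamflow" records
rng = np.random.default_rng(0)
qa = rng.gamma(2.0, 10.0, size=1000)
qb = 0.8 * qa + rng.normal(0, 2.0, size=1000)
print(f"ITI = {iti(qa, qb):.2f}")   # values near 1 suggest informationally redundant stations
```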

  13. A fault diagnosis methodology for rolling element bearings based on advanced signal pretreatment and autoregressive modelling

    Science.gov (United States)

    Al-Bugharbee, Hussein; Trendafilova, Irina

    2016-05-01

    This study proposes a methodology for rolling element bearing fault diagnosis which gives a complete and highly accurate identification of the faults present. It has two main stages: signal pretreatment, which is based on several signal analysis procedures, and diagnosis, which uses a pattern-recognition process. The first stage is principally based on linear time invariant autoregressive modelling. One of the main contributions of this investigation is the development of a pretreatment signal analysis procedure which subjects the signal to noise cleaning by singular spectrum analysis and then stationarisation by differencing. So the signal is transformed to bring it close to a stationary one, rather than complicating the model to bring it closer to the signal. This type of pretreatment allows the use of a linear time invariant autoregressive model and improves its performance when the original signals are non-stationary. This contribution is at the heart of the proposed method, and the high accuracy of the diagnosis is a result of this procedure. The methodology emphasises the importance of preliminary noise cleaning and stationarisation, and it demonstrates that the information needed for fault identification is contained in the stationary part of the measured signal. The methodology is further validated using three different experimental setups, demonstrating very high accuracy for all of the applications. It is able to correctly classify nearly 100 percent of the faults with regard to their type and size. This high accuracy is the other important contribution of this methodology. Thus, this research suggests a highly accurate methodology for rolling element bearing fault diagnosis which is based on relatively simple procedures. This is also an advantage, as the simplicity of the individual processes ensures easy application and the possibility for automation of the entire process.
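
    The pretreatment chain described above (singular spectrum analysis for noise cleaning, differencing for stationarisation, then a linear autoregressive fit whose coefficients feed pattern recognition) can be sketched as follows. The window length, number of retained SSA components, AR order and the toy signal are illustrative assumptions, not the settings used in the paper.

```python
# Minimal sketch: SSA denoising, first differencing, then least-squares AR coefficients
# as candidate features for fault classification. All numeric choices are assumptions.
import numpy as np

def ssa_denoise(x, window=40, n_components=5):
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    recon = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    out = np.zeros(n)          # anti-diagonal averaging back to a 1-D series
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            out[i + j] += recon[i, j]
            counts[i + j] += 1
    return out / counts

def ar_features(x, order=8):
    # least-squares fit of x[t] = a1*x[t-1] + ... + ap*x[t-p]
    X = np.column_stack([x[order - i - 1:len(x) - i - 1] for i in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

rng = np.random.default_rng(1)
t = np.arange(4096) / 12_000.0                                    # assumed sampling rate
signal = np.sin(2 * np.pi * 157 * t) + 0.5 * rng.normal(size=t.size)  # toy vibration signal
clean = ssa_denoise(signal)          # noise cleaning
stationary = np.diff(clean)          # stationarisation by differencing
print(ar_features(stationary))       # AR coefficients used as diagnostic features
```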

  14. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, making it impossible to narrow the field of potential adversaries effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community within the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  15. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2016-06-20

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
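
    For readers unfamiliar with the swarm component of the three-stage scheme, the sketch below shows only the plain global-best PSO core on which consensus-based variants build; the consensus update, the Trust-Tech stage and the local optimization methods described above are not reproduced. Swarm size, inertia and acceleration coefficients are generic textbook choices, not the paper's settings.

```python
# Minimal global-best PSO sketch (assumptions: generic parameter values, Rastrigin test function).
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.72, c1=1.49, c2=1.49, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

rastrigin = lambda z: 10 * z.size + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
best_x, best_f = pso(rastrigin, dim=10)
print(best_f)
```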

  16. Intrinsic Depletion or Not

    DEFF Research Database (Denmark)

    Klösgen, Beate; Bruun, Sara; Hansen, Søren;

      The presence of a depletion layer of water along extended hydrophobic interfaces, and a possibly related formation of nanobubbles, is an ongoing discussion. The phenomenon was initially reported when we, years ago, chose thick films (~300-400Å) of polystyrene as cushions between a crystalline...... giving rise to depletion layers, and the mechanisms and border conditions that control their presence and extension require still clarification. Recently, careful systematic reflectivity experiments were re-done on the same system. No depletion layers were found, and it was conjectured that the whole...

  17. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    Science.gov (United States)

    Fensin, Michael Lorne

    and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.

  18. A Vision-Based Methodology to Dynamically Track and Describe Cell Deformation during Cell Micromanipulation

    Science.gov (United States)

    Karimirad, Fatemeh; Shirinzadeh, Bijan; Yan, Wenyi; Fatikow, Sergej

    2013-02-01

    The main objective of this article is to mechanize the procedure of tracking and describing the various phases of deformation of a biological circular cell during micromanipulation. The devised vision-based methodology provides a real-time strategy to track and describe the cell deformation by extracting a geometric feature called the dimple angle. An algorithm based on the Snake (active contour) model was established to acquire the boundary of the indented cell and measure this feature. Micromanipulation experiments were conducted on zebrafish embryos. Experimental results were used to characterize the deformation of the manipulated embryo via the devised geometric parameter. The results demonstrated the high capability of the methodology. The proposed method is applicable to the micromanipulation of other circular biological embryos, such as injection of the mouse oocyte/embryo. Supplemental materials are available for this article. Go to the publisher's online edition of the International Journal of Optomechatronics to view the supplemental files.

  19. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays.

    Science.gov (United States)

    Naeni, Leila M; Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays.
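
    The dissimilarity on which the proximity graph is built, the Jensen-Shannon distance between word-frequency profiles, can be sketched as below; the memetic modularity optimisation (iMA-Net) itself is not reproduced, and the tiny vocabularies are illustrative only.

```python
# Minimal sketch of the Jensen-Shannon distance between two plays' word-frequency profiles.
import numpy as np

def js_distance(p, q):
    p = np.asarray(p, float); q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)   # Jensen-Shannon divergence (bits)
    return np.sqrt(jsd)                      # metric form commonly used as a distance

play_a = [120, 30, 5, 0, 2]   # hypothetical word counts over a shared vocabulary
play_b = [100, 40, 3, 1, 0]
print(f"JS distance = {js_distance(play_a, play_b):.3f}")
```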

  20. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    Science.gov (United States)

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416

  1. Model-based interpretation of the ECG: a methodology for temporal and spatial reasoning.

    OpenAIRE

    Tong, D. A.; Widman, L. E.

    1992-01-01

    A new software architecture for automatic interpretation of the electrocardiogram is presented. Using the hypothesize-and-test paradigm, a semi-quantitative physiological model and production rule-based knowledge are combined to reason about time- and space-varying characteristics of complex heart rhythms. A prototype system implementing the methodology accepts a semi-quantitative description of the onset and morphology of the P waves and QRS complexes that are observed in the body-surface el...

  2. A Model-Based Methodology for Spray-Drying Process Development

    OpenAIRE

    Dobry, Dan E.; Settell, Dana M.; Baumann, John M.; Ray, Rod J.; Graham, Lisa J; Beyerinck, Ron A.

    2009-01-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-dr...

  3. Enlightenment of ISO Methodology to Assess and Communicate the Economic Benefits of Consensus-Based Standards

    Institute of Scientific and Technical Information of China (English)

    Wang Zhongmin

    2011-01-01

    In March 2010, ISO released the Methodology to Assess and Communicate the Economic Benefits of Consensus-Based Standards, which is of great theoretical and practical significance. Standards and standardization are not simple administrative work but a highly professional and technical business. In reality, however, standardization work worldwide has always lacked theoretical support, and the situation is even worse in China.

  4. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  5. Learning Theory Bases of Communicative Methodology and the Notional/Functional Syllabus

    OpenAIRE

    Jacqueline D., Beebe

    1992-01-01

    This paper examines the learning theories that underlie the philosophy and practices known as communicative language teaching methodology. These theories are identified first as a reaction against the behavioristic learning theory of audiolingualism. Approaches to syllabus design based on both the "weak" version of communicative language teaching-learning to use the second language-and the "strong" version-using the second language to learn it-are examined. The application of cognitive theory...

  6. Forest soil nutrient status after 10 years of experimental acidification and base cation depletion : results from 2 long-term soil productivity sites in the central Appalachians

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.B. [United States Dept. of Agriculture Forest Service, Parsons, WV (United States); Burger, J.A. [Virginia Tech University, Blacksburg, VA (United States)

    2010-07-01

    This study assessed the hypothesis that soil base cation depletion is an effect of acidic deposition in forests located in the central Appalachians. The effects of experimentally induced base cation depletion were evaluated in relation to long-term soil productivity and the sustainability of forest stands. Whole-tree harvesting was conducted along with the removal of dead wood litter in order to remove all aboveground nutrients. Ammonium sulfate fertilizer was added at annual rates of 40.6 kg S/ha and 35.4 kg N/ha in order to increase the leaching of calcium (Ca) and magnesium (Mg) from the soil. A randomized complete block design was used in 4 or 5 treatment applications in a mixed hardwood experimental forest located in West Virginia and in a cherry-maple forest located in a national forest in West Virginia. Soils were sampled over a 10-year period. The study showed that significant changes in soil Mg, N and some other nutrients occurred over time. However, biomass did not differ significantly among the different treatment options used.

  7. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

    Full Text Available We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions by following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation has been significantly enhanced.

  8. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Science.gov (United States)

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis with only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
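
    The feature step described above, turning EEMD output into the fault-feature layer, can be sketched as follows: the energy of each intrinsic mode function is computed and normalised to form the feature vector fed to the Bayesian network. The decomposition itself is assumed to come from an EEMD implementation (for example the PyEMD package); a stand-in decomposition keeps the sketch runnable, and the signal is hypothetical.

```python
# Minimal sketch of IMF energy features for the fault-feature layer (toy data).
import numpy as np

def imf_energy_features(imfs):
    energies = np.array([np.sum(imf ** 2) for imf in imfs])
    return energies / energies.sum()          # normalised IMF energies

# Stand-in "IMFs": band-limited components of a toy vibration signal.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 2048)
imfs = [np.sin(2 * np.pi * f * t) * rng.uniform(0.5, 1.5) for f in (400, 120, 35, 8)]
features = imf_energy_features(imfs)
print(np.round(features, 3))
# These features would populate the fault-feature nodes; evidence such as
# maintenance records enters the multi-source information layer separately.
```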

  9. Addressing Ozone Layer Depletion

    Science.gov (United States)

    Access information on EPA's efforts to address ozone layer depletion through regulations, collaborations with stakeholders, international treaties, partnerships with the private sector, and enforcement actions under Title VI of the Clean Air Act.

  10. Generic Methodology for Field Calibration of Nacelle-Based Wind Lidars

    Directory of Open Access Journals (Sweden)

    Antoine Borraccino

    2016-11-01

    Full Text Available Nacelle-based Doppler wind lidars have shown promising capabilities to assess power performance, detect yaw misalignment or perform feed-forward control. The power curve application requires uncertainty assessment. Traceable measurements and uncertainties of nacelle-based wind lidars can be obtained through a methodology applicable to any type of existing and upcoming nacelle lidar technology. The generic methodology consists in calibrating all the inputs of the wind field reconstruction algorithms of a lidar. These inputs are the line-of-sight velocity and the beam position, provided by the geometry of the scanning trajectory and the lidar inclination. The line-of-sight velocity is calibrated in atmospheric conditions by comparing it to a reference quantity based on classic instrumentation such as cup anemometers and wind vanes. The generic methodology was tested on two commercially developed lidars, one continuous-wave and one pulsed system, and provides consistent calibration results: linear regressions show a difference of ∼0.5% between the lidar-measured and reference line-of-sight velocities. A comprehensive uncertainty procedure propagates the reference uncertainty to the lidar measurements. At a coverage factor of two, the estimated line-of-sight velocity uncertainty ranges from 3.2% at 3 m·s⁻¹ to 1.9% at 16 m·s⁻¹. Most of the line-of-sight velocity uncertainty originates from the reference: the cup anemometer uncertainty accounts for ∼90% of the total uncertainty. The propagation of uncertainties to lidar-reconstructed wind characteristics can use analytical methods in simple cases, which we demonstrate through the example of a two-beam system. The newly developed calibration methodology allows robust evaluation of a nacelle lidar’s performance and uncertainties to be established. Calibrated nacelle lidars may consequently be further used for various wind turbine applications in confidence.
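
    The calibration step described above compares the lidar line-of-sight (LOS) velocity with a reference LOS velocity obtained by projecting the cup-anemometer speed onto the beam direction using the vane-measured wind direction, followed by a linear regression. The sketch below illustrates that comparison under assumed beam geometry, a flat cup-anemometer relative uncertainty and synthetic data; it is not the authors' procedure in detail.

```python
# Minimal sketch of LOS velocity calibration by regression against a projected reference.
import numpy as np

def reference_los(v_cup, wind_dir_deg, beam_azimuth_deg, beam_tilt_deg):
    # projection of the horizontal wind vector onto the lidar beam
    rel = np.deg2rad(wind_dir_deg - beam_azimuth_deg)
    return v_cup * np.cos(rel) * np.cos(np.deg2rad(beam_tilt_deg))

rng = np.random.default_rng(3)
v_cup = rng.uniform(3, 16, 500)                      # reference horizontal speed [m/s]
wind_dir = rng.normal(0, 10, 500)                    # wind direction wrt beam azimuth [deg]
v_ref = reference_los(v_cup, wind_dir, beam_azimuth_deg=0.0, beam_tilt_deg=2.0)
v_lidar = 1.005 * v_ref + rng.normal(0, 0.05, 500)   # synthetic lidar LOS measurements

gain, offset = np.polyfit(v_ref, v_lidar, 1)         # calibration regression
print(f"gain = {gain:.4f}, offset = {offset:.3f} m/s")

# Crude propagation: if the cup anemometer dominates with relative standard
# uncertainty u_cup, the expanded LOS uncertainty (coverage factor 2) is roughly
# 2 * u_cup * |cos(rel)| * v_cup (assumption, illustrative only).
u_cup = 0.019
print(f"expanded LOS uncertainty at 10 m/s ≈ {2 * u_cup * 10:.2f} m/s")
```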

  11. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    When compared to other energy sources such as the fossil fuels (coal, oil, and gas), nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy education research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on neuroscience research, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  12. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    Science.gov (United States)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources and highly reliable simulation with fault tolerance. A framework of the virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.

  13. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  14. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    Energy Technology Data Exchange (ETDEWEB)

    Vismari, Lucio Flavio, E-mail: lucio.vismari@usp.b [Safety Analysis Group (GAS), School of Engineering at University of Sao Paulo (Poli-USP), Av. Prof. Luciano Gualberto, Trav.3, n.158, Predio da Engenharia de Eletricidade, Sala C2-32, CEP 05508-900, Sao Paulo (Brazil); Batista Camargo Junior, Joao, E-mail: joaocamargo@usp.b [Safety Analysis Group (GAS), School of Engineering at University of Sao Paulo (Poli-USP), Av. Prof. Luciano Gualberto, Trav.3, n.158, Predio da Engenharia de Eletricidade, Sala C2-32, CEP 05508-900, Sao Paulo (Brazil)

    2011-07-15

    In recent decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, uses Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcast' (ADS-B) based air traffic control system. In conclusion, the proposed methodology was able to assess CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics.

  15. Rule-based Expert Systems for Selecting Information Systems Development Methodologies

    Directory of Open Access Journals (Sweden)

    Abdel Nasser H. Zaied

    2013-08-01

    Full Text Available Information Systems (IS) are increasingly regarded as crucial to an organization's success. Information Systems Development Methodologies (ISDMs) are used by organizations to structure the information system development process. ISDMs are essential for structuring project participants’ thinking and actions; therefore ISDMs play an important role in achieving successful projects. There are different ISDMs, and no methodology can claim that it can be applied to any organization. The problem facing decision makers is how to select an appropriate development methodology that may increase the probability of system success. This paper takes this issue into account when studying ISDMs and provides a rule-based expert system as a tool for selecting appropriate ISDMs. The proposed expert system consists of three main phases to automate the process of selecting ISDMs. Three approaches were used to test the proposed expert system: face validation through six professors and six IS professionals, predictive validation through twenty-four experts, and blind validation through nine employees working in the IT field. The results show that the proposed system runs without errors, offers a friendly user interface, and that its suggestions matched user expectations in 95.8% of cases. It can also help project managers, systems engineers, systems developers, consultants, and planners in the process of selecting a suitable ISDM. Finally, the results show that the proposed rule-based expert system can facilitate the selection process, especially for new users and non-specialists in the Information Systems field.
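
    To illustrate the kind of forward-chaining rule base such an expert system might use, the sketch below maps a few hypothetical project characteristics to suggested methodology families. The attribute names, rules and methodologies listed are illustrative assumptions, not the rules of the system described above.

```python
# Minimal sketch of rule-based ISDM suggestion (hypothetical rules and attributes).
from dataclasses import dataclass

@dataclass
class Project:
    requirements_stability: str   # "stable" | "volatile"
    team_size: str                # "small" | "large"
    user_involvement: str         # "low" | "high"

RULES = [
    (lambda p: p.requirements_stability == "volatile" and p.user_involvement == "high",
     "Agile / iterative methodology (e.g. Scrum-like)"),
    (lambda p: p.requirements_stability == "stable" and p.team_size == "large",
     "Plan-driven methodology (e.g. structured / waterfall-like)"),
    (lambda p: p.user_involvement == "high" and p.team_size == "small",
     "Prototyping-oriented methodology"),
]

def suggest(project):
    fired = [advice for condition, advice in RULES if condition(project)]
    return fired or ["No rule fired: consult an IS specialist"]

print(suggest(Project("volatile", "small", "high")))
```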

  16. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for coupled neutronics and thermal-hydraulics (T-H) simulation of the pressurized water reactor. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. It is a challenge to validate the depletion capability because measured data are insufficient. One alternative is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.

  17. A Model-Based Methodology for Spray-Drying Process Development.

    Science.gov (United States)

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  18. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene in future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
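
    The community-detection step that proposes candidate sectors can be sketched on a small, hypothetical pipe network as below, using modularity-based community detection from networkx with pipe flow as the edge weight; the genetic-algorithm refinement and Monte Carlo benefit analysis described above are not reproduced.

```python
# Minimal sketch of graph-based sector candidates for a toy water network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
pipes = [  # (node_a, node_b, flow) -- hypothetical network
    ("A", "B", 50), ("B", "C", 45), ("C", "A", 40),
    ("D", "E", 55), ("E", "F", 50), ("F", "D", 42),
    ("C", "D", 5),                     # weak link: a natural sector boundary
]
G.add_weighted_edges_from(pipes)

sectors = greedy_modularity_communities(G, weight="weight")
for i, sector in enumerate(sectors, 1):
    print(f"Sector {i}: {sorted(sector)}")
# Boundary valves / flowmeters would then be placed on edges joining sectors,
# and each candidate layout scored by the economic model.
```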

  19. A hybrid system identification methodology for wireless structural health monitoring systems based on dynamic substructuring

    Science.gov (United States)

    Dragos, Kosmas; Smarsly, Kay

    2016-04-01

    System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method of dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.
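
    The dynamic-substructuring idea referenced above can be illustrated with a minimal Craig-Bampton (fixed-interface component mode synthesis) reduction of one substructure, the kind of sub-model a sensor node could hold; the wireless implementation and parameter-updating logic of the paper are not reproduced, and the 4-DOF spring-mass matrices are a toy assumption.

```python
# Minimal sketch of a Craig-Bampton reduction for one toy substructure.
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary, n_modes):
    dof = np.arange(K.shape[0])
    interior = np.setdiff1d(dof, boundary)
    Kii, Kib = K[np.ix_(interior, interior)], K[np.ix_(interior, boundary)]
    Mii = M[np.ix_(interior, interior)]
    # constraint modes: static response of interior DOFs to unit boundary motion
    psi = -np.linalg.solve(Kii, Kib)
    # fixed-interface normal modes of the interior partition
    w2, phi = eigh(Kii, Mii)
    phi = phi[:, :n_modes]
    # assemble the reduction basis T: boundary DOFs kept, interior condensed
    nb, nq = len(boundary), n_modes
    T = np.zeros((K.shape[0], nb + nq))
    T[boundary, :nb] = np.eye(nb)
    T[np.ix_(interior, np.arange(nb))] = psi
    T[np.ix_(interior, nb + np.arange(nq))] = phi
    return T.T @ K @ T, T.T @ M @ T        # reduced stiffness and mass

k, m = 1.0e6, 10.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  1]], float)
M = m * np.eye(4)
K_red, M_red = craig_bampton(K, M, boundary=np.array([0]), n_modes=2)
print(K_red.shape)   # (3, 3): 1 boundary DOF + 2 modal coordinates
```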

  20. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Full Text Available Digital signal processing algorithms are of big importance in many embedded systems. Due to complexity reasons and due to the restrictions imposed on the implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We will use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform will demonstrate the effectiveness of our approach.

  1. Modal macro-strain vector based damage detection methodology with long-gauge FBG sensors

    Science.gov (United States)

    Xu, Bin; Liu, Chongwu W.; Masri, Sami F.

    2009-07-01

    Advances in optical fiber sensing technology provide an easy and reliable way to perform vibration-based strain measurement of engineering structures. As a typical optical fiber sensing technique with high accuracy and resolution, long-gauge Fiber Bragg Grating (FBG) sensors have been widely employed in health monitoring of civil engineering structures. Therefore, the development of macro-strain-based identification methods is crucial for damage detection and structural condition evaluation. In a previous study by the authors, a damage detection algorithm for a beam structure using vibration-based macro-strain measurement time histories with neural networks had been proposed and validated with experimental measurements. In this paper, a damage locating and quantifying method is proposed using modal macro-strain vectors (MMSVs), which can be extracted from vibration-induced macro-strain response time series measured by long-gauge FBG sensors. The performance of the proposed methodology for damage detection of a beam with different damage scenarios was first studied with numerical simulation. Then, dynamic tests on a simply supported steel beam with different damage scenarios were carried out and macro-strain measurements were employed to detect the damage severity. Results show that the proposed MMSV-based structural identification and damage detection methodology can locate the damage and identify its severity with acceptable accuracy.
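
    A minimal sketch of forming a modal macro-strain vector is given below: the macro-strain time histories from several long-gauge sensors are transformed to the frequency domain, their amplitudes at a resonant frequency are collected into a normalised vector, and the change in that vector points to the damaged segment. The sensor layout, frequencies and damage pattern are assumptions for illustration, not the paper's test setup.

```python
# Minimal sketch of MMSV extraction and comparison (toy data, assumed frequencies).
import numpy as np

def mmsv(strain_histories, fs, target_freq):
    n = strain_histories.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - target_freq))
    amps = np.abs(np.fft.rfft(strain_histories, axis=1))[:, idx]
    return amps / np.linalg.norm(amps)        # normalised modal macro-strain vector

rng = np.random.default_rng(4)
fs, f1 = 200.0, 6.5
t = np.arange(0, 20, 1 / fs)
shape_healthy = np.array([0.4, 0.8, 1.0, 0.8, 0.4])      # macro-strain mode shape
shape_damaged = np.array([0.4, 0.8, 1.0, 1.1, 0.4])      # local increase at gauge 4

def simulate(shape):
    return np.outer(shape, np.sin(2 * np.pi * f1 * t)) + 0.02 * rng.normal(size=(5, t.size))

v_healthy = mmsv(simulate(shape_healthy), fs, f1)
v_damaged = mmsv(simulate(shape_damaged), fs, f1)
print(np.round(v_damaged - v_healthy, 3))    # largest change points to the damaged gauge
```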

  2. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings.

  3. METHODOLOGICAL BASES OF ECOLOGICAL CULTURE FORMATION OF PUPILS ON THE BASIS OF ECO-DEVELOPMENT IDEAS

    Directory of Open Access Journals (Sweden)

    Natalia F. Vinokurova

    2016-01-01

    Full Text Available Aim. The article describes the methodological bases of the formation of students' ecological culture as the aim of innovative training for a sustainable future. The authors take into account international and Russian experience connected with the development of ecological culture as an educational resource for society's adaptation to environmental constraints, risks and crises and for present-day consolidated actions towards the sustainable development of civilization. Methods. The methodological basis for constructing the model of formation of pupils' ecological culture is developed from the standpoint of the idea of eco-development (noosphere, co-evolution, sustainable development) and a set of axiological, cultural, personal-activity, co-evolutionary, and cultural-ecological approaches. This methodological basis has made it possible to construct the educational model of formation of pupils' ecological culture, comprising an interconnected unity of target, substantive, procedural, and evaluative components. Results and scientific novelty. The article presents the results of the authors' many years of research on environmental education for sustainable development within the framework of the Nizhny Novgorod scientific school. A characteristic of students' ecological culture as the goal of environmental education based on eco-development ideas is given. It is shown that the ecological culture of students directs them to new values and life meanings and to methods of ecologically oriented action and behavior, changing the attitudes of the consumer society and providing the younger generation with co-evolutionary, spiritual guidance in a postindustrial society. The authors' model of the formation of pupils' ecological culture is represented by the conjugation of philosophical-methodological, theoretical-methodological and pedagogical levels that ensure the integrity and hierarchy of pedagogical research on the issue. The article discloses a pedagogical assessment

  4. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi- modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  5. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Directory of Open Access Journals (Sweden)

    Zengkai Liu

    Full Text Available This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis with only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.

  6. Comparative study of the Martian suprathermal electron depletions based on Mars Global Surveyor, Mars Express, and Mars Atmosphere and Volatile EvolutioN mission observations

    Science.gov (United States)

    Steckiewicz, M.; Garnier, P.; André, N.; Mitchell, D. L.; Andersson, L.; Penou, E.; Beth, A.; Fedorov, A.; Sauvaud, J.-A.; Mazelle, C.; Brain, D. A.; Espley, J. R.; McFadden, J.; Halekas, J. S.; Larson, D. E.; Lillis, R. J.; Luhmann, J. G.; Soobiah, Y.; Jakosky, B. M.

    2017-01-01

    Nightside suprathermal electron depletions have been observed at Mars by three spacecraft to date: Mars Global Surveyor, Mars Express, and the Mars Atmosphere and Volatile EvolutioN (MAVEN) mission. This spatial and temporal diversity of measurements allows us to propose here a comprehensive view of the Martian electron depletions through the first multispacecraft study of the phenomenon. We have analyzed data recorded by the three spacecraft from 1999 to 2015 in order to better understand the distribution of the electron depletions and their creation mechanisms. Three simple criteria adapted to each mission have been implemented to identify more than 134,500 electron depletions observed between 125 and 900 km altitude. The geographical distribution maps of the electron depletions detected by the three spacecraft confirm the strong link existing between electron depletions and crustal magnetic field at altitudes greater than 170 km. At these altitudes, the distribution of electron depletions is strongly different in the two hemispheres, with a far greater chance to observe an electron depletion in the Southern Hemisphere, where the strongest crustal magnetic sources are located. However, the unique MAVEN observations reveal that below a transition region near 160-170 km altitude the distribution of electron depletions is the same in both hemispheres, with no particular dependence on crustal magnetic fields. This result supports the suggestion made by previous studies that these low-altitude events are produced through electron absorption by atmospheric CO2.

  7. Standard methodology for establishing the "state of the art" based on Six Sigma

    Science.gov (United States)

    Otero, M.; Pastor, A.; Portela, J. M.; Viguera, J. L.; Huerta, M. M.

    2012-04-01

    A research project must generate new questions and is of a cyclic nature, where the answers to the questions in today's research will be the basis for the questions of tomorrow's research [1]. To facilitate this task, a standard methodological tool is proposed which, as well as giving a general, ordered view of the search process used to determine the state of the art, also helps the researcher, in any circumstances, to find new lines of work or tendencies in any subject which may be the object of analysis.

  8. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on the meta-programming-based generative learning objects (the latter with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLO and smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist a

  9. A methodology for evaluation and selection of nanoparticle manufacturing processes based on sustainability metrics.

    Science.gov (United States)

    Naidu, Sasikumar; Sawhney, Rapinder; Li, Xueping

    2008-09-01

    A set of sustainability metrics, covering the economic, environmental and sociological dimensions of sustainability, is developed for the evaluation of nanomanufacturing processes. The metrics are divided into two categories, namely industrial engineering metrics (process and safety metrics) and green chemistry metrics (environmental impact). The waste reduction algorithm (WAR) is used to determine the environmental impact of the processes, and the NAIADE (Novel Approach to Imprecise Assessment and Decision Environments) software is used for evaluation and decision analysis. The methodology is applied to three processes used for silica nanoparticle synthesis based on sol-gel and flame methods.

  10. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues are related to change impact algorithms and inheritance of functionality.

  11. Development of flaxseed fortified rice – corn flour blend based extruded product by response surface methodology

    OpenAIRE

    Ganorkar, P. M.; Jain, R. K.

    2014-01-01

    Flaxseed has shown evidence of health benefits in humans. Response surface methodology (RSM) was employed to develop a flaxseed-fortified rice–corn flour blend based extruded product using a twin screw extruder. The effect of roasted flaxseed flour (RFF) fortification (15–25 %), moisture content of feed (12–16 %, wb), extruder barrel temperature (120–140 °C) and screw speed (300–330 RPM) on expansion ratio (ER), breaking strength (BS), bulk density (BD) and overall acceptability (OAA) s...

  12. Location of Bioelectricity Plants in the Madrid Community Based on Triticale Crop: A Multicriteria Methodology

    Directory of Open Access Journals (Sweden)

    L. Romero

    2015-01-01

    Full Text Available This paper presents a work whose objective is, first, to quantify the potential of the triticale biomass existing in each of the agricultural regions in the Madrid Community through a crop simulation model based on regression techniques and multiple correlation. Second, a methodology for defining which area has the best conditions for the installation of electricity plants from biomass is described and applied. The study used a methodology based on compromise programming in a discrete multicriteria decision method (MDM) context. To make a ranking, the following criteria were taken into account: biomass potential, electric power infrastructure, road networks, protected spaces, and urban nuclei surfaces. The results indicate that, in the case of the Madrid Community, the Campiña region is the most suitable for setting up plants powered by biomass. A minimum of 17,339.9 tons of triticale will be needed to satisfy the requirements of a 2.2 MW power plant. The minimum range of action for obtaining the biomass necessary in the Campiña region would be 6.6 km around the municipality of Algete, based on Geographic Information Systems. The total biomass which could be made available within this range in this region would be 18,430.68 t.
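
    Compromise programming ranks alternatives by their weighted distance to an ideal point across the criteria. The sketch below illustrates that ranking step with entirely hypothetical criteria values and weights for three regions; it is not the study's data or scoring.

```python
# Minimal sketch of a compromise-programming ranking (hypothetical data and weights).
import numpy as np

criteria = {            # region: [biomass potential (t), grid proximity, road access,
                        #          protected-area score, distance-to-urban-nuclei score]
    "Campiña":     [18430, 0.9, 0.8, 0.7, 0.6],
    "Las Vegas":   [12000, 0.7, 0.9, 0.6, 0.5],
    "Lozoya":      [ 8000, 0.5, 0.6, 0.9, 0.8],
}
weights = np.array([0.35, 0.2, 0.2, 0.15, 0.1])   # assumed criterion importances

X = np.array(list(criteria.values()), float)
# normalise so every criterion is on [0, 1] with "more is better" (an assumption)
norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

def compromise_distance(row, p=2):
    # weighted L_p distance from the ideal point (all criteria at 1.0)
    return (np.sum((weights * (1.0 - row)) ** p)) ** (1.0 / p)

names = list(criteria)
ranking = sorted(names, key=lambda r: compromise_distance(norm[names.index(r)]))
print(ranking)   # region closest to the ideal point comes first
```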

  13. Optimization-based methodology for the development of wastewater facilities for energy and nutrient recovery.

    Science.gov (United States)

    Puchongkawarin, C; Gomez-Mont, C; Stuckey, D C; Chachuat, B

    2015-12-01

    A paradigm shift is currently underway from an attitude that considers wastewater streams as a waste to be treated, to a proactive interest in recovering materials and energy from these streams. This paper is concerned with the development and application of a systematic, model-based methodology for the development of wastewater resource recovery systems that are both economically attractive and sustainable. With the array of available treatment and recovery options growing steadily, a superstructure modeling approach based on rigorous mathematical optimization appears to be a natural approach for tackling these problems. The development of reliable, yet simple, performance and cost models is a key issue with this approach in order to allow for a reliable solution based on global optimization. We argue that commercial wastewater simulators can be used to derive such models, and we illustrate this approach with a simple resource recovery system. The results show that the proposed methodology is computationally tractable, thereby supporting its application as a decision support system for selection of promising resource recovery systems whose development is worth pursuing.

  14. Implementation methodology of practices based on scientific evidence for assistance in natural delivery: a pilot study

    Directory of Open Access Journals (Sweden)

    Clodoaldo Tentes Côrtes

    2015-10-01

    Full Text Available OBJECTIVE: To present a methodology for transferring knowledge to improve maternal outcomes in natural delivery based on scientific evidence. METHOD: An intervention study conducted in the maternity hospital of Itapecerica da Serra, SP, with 50 puerperal women and 102 medical records from July to November 2014. The PACES tool from the Joanna Briggs Institute, consisting of a pre-clinical audit (phase 1), implementation of best practice (phase 2) and a follow-up clinical audit (phase 3), was used. Data were analyzed by comparing the results of phases 1 and 3 with Fisher's exact test and a significance level of 5%. RESULTS: The vertical position was adopted by the majority of puerperal women, with a statistical difference between phases 1 and 3. A significant increase in bathing/showering, walking and massages for pain relief was found from the medical records. No statistical difference was found in other practices and outcomes. Barriers and difficulties in the implementation of evidence-based practices were identified. Variables were refined, techniques and data collection instruments were verified, and an intervention proposal was made. CONCLUSION: The study found possibilities for implementing a methodology of practices based on scientific evidence for assistance in natural delivery.
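
    The phase 1 versus phase 3 comparison described above rests on Fisher's exact test. A minimal Python sketch of that step is shown below; the 2x2 counts are hypothetical placeholders, not the study's data.

        from scipy.stats import fisher_exact

        # Hypothetical 2x2 table: adoption of the upright position in phase 1 vs phase 3.
        #               adopted   not adopted
        phase1 = [12, 38]
        phase3 = [31, 21]

        odds_ratio, p_value = fisher_exact([phase1, phase3])
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
        if p_value < 0.05:
            print("difference between audit phases is significant at the 5% level")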

  15. Shear-affected depletion interaction

    NARCIS (Netherlands)

    July, C.; Kleshchanok, D.; Lang, P.R.

    2012-01-01

    We investigate the influence of flow fields on the strength of the depletion interaction caused by disc-shaped depletants. At low mass concentration of discs, it is possible to continuously decrease the depth of the depletion potential by increasing the applied shear rate until the depletion force i

  16. Pediatric hydrocephalus: systematic literature review and evidence-based guidelines. Part 1: Introduction and methodology.

    Science.gov (United States)

    Flannery, Ann Marie; Mitchell, Laura

    2014-11-01

    This clinical systematic review of and evidence-based guidelines for the treatment of pediatric hydrocephalus were developed by a physician volunteer task force. They are provided as an educational tool based on an assessment of current scientific and clinical information as well as accepted approaches to treatment. They are not intended to be a fixed protocol, because some patients may require more or less treatment. In Part 1, the authors introduce the reader to the complex topic of hydrocephalus and the lack of consensus on its appropriate treatment. The authors describe the development of the Pediatric Hydrocephalus Systematic Review and Evidence-Based Guidelines Task Force charged with reviewing the literature and recommending treatments for hydrocephalus, and they set out the basic methodology used throughout the specific topics covered in later chapters.

  17. Achieving process intensification form the application of a phenomena based synthesis, Design and intensification methodology

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Lutze, Philip; Woodley, John

    of PI still faces challenges [2] because the identification and design of intensified processes is not simple [3]. Lutze et al. [3] have developed a systematic PI synthesis/design method at the unit operations (Unit-Ops) level, where the search space is based on a knowledge-base of existing PI equipment......). This enables the use of a priori knowledge of the Unit-Ops as well as the possibility to design new Unit-Ops. A first version of a phenomena-based synthesis/design (PhenPI) methodology has been developed [5] in which a process flowsheet is generated through the use of involved phenomena such as mixing, phase...... solution approach which breaks down the complex mathematical synthesis/design problem into manageable sub-problems (6 steps). It allows the generation of PI options and their subsequent stepwise reduction of the search space and identification of the best intensified process option. In step-1, the problem...

  18. Unravelling emotional viewpoints on a bio-based economy using Q methodology.

    Science.gov (United States)

    Sleenhoff, Susanne; Cuppen, Eefje; Osseweijer, Patricia

    2015-10-01

    A transition to a bio-based economy will affect society and requires collective action from a broad range of stakeholders. This includes the public, who are largely unaware of this transition. For meaningful public engagement people's emotional viewpoints play an important role. However, what the public's emotions about the transition are and how they can be taken into account is underexposed in public engagement literature and practice. This article aims to unravel the public's emotional views of the bio-based economy as a starting point for public engagement. Using Q methodology with visual representations of a bio-based economy we found four emotional viewpoints: (1) compassionate environmentalist, (2) principled optimist, (3) hopeful motorist and (4) cynical environmentalist. These provide insight into the distinct and shared ways through which members of the public connect with the transition. Implications for public engagement are discussed.
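
    The factor-extraction step of Q methodology can be sketched as a by-person principal components analysis of the Q-sorts. The Python sketch below is a simplification (synthetic sorts, two unrotated components, whereas the study reports four rotated viewpoints).

        import numpy as np

        # Hypothetical Q-sort matrix: rows = statements, columns = participants,
        # values = ranks on the -4..+4 grid (placeholders, not the study's sorts).
        rng = np.random.default_rng(0)
        q_sorts = rng.integers(-4, 5, size=(45, 22)).astype(float)

        # By-person factor analysis: correlate participants, eigendecompose, and
        # assign each participant to the component on which they load highest.
        corr = np.corrcoef(q_sorts, rowvar=False)           # 22 x 22 person correlations
        eigvals, eigvecs = np.linalg.eigh(corr)
        order = np.argsort(eigvals)[::-1]
        loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])
        viewpoint = np.argmax(np.abs(loadings), axis=1)     # crude grouping, no rotation
        print("participants per viewpoint:", np.bincount(viewpoint))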

  19. Metodologia de custeio para a ergonomia Ergonomics-based costing methodology

    Directory of Open Access Journals (Sweden)

    José Roberto Dourado Mafra

    2006-12-01

    Full Text Available This paper presents an ergonomics-based costing methodology in which the costing process is constructed in parallel with the ergonomic work analysis. A brief bibliographic review is presented. Two questions are pointed out regarding the economic evaluation of ergonomic interventions: one is the costing problem and the other the evaluation itself. This costing methodology involves an initial cost estimate, and the subsequent verification of those costs, arising from the lack of ergonomics in the situations under study; the costs of the corrections or necessary investments are then calculated and the benefits brought by the new conception are assessed. The application of the methodology is exemplified in a case study of an industrial kitchen, where an ergonomic work analysis was performed. In the case studied, the lack of ergonomics is characterized by economic indicators of company efficacy. It is concluded that this costing methodology shows how performance problems impact the business economically, characterized in terms of health, quality of life and productivity at work. In this sense, the work is believed to have contributed to the state of practice by accounting for the costs and assessing the viability of the solution.

  20. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Licu, Tony [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: antonio.licu@eurocontrol.int; Cioran, Florin [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: florin.cioran@eurocontrol.int; Hayward, Brent [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: bhayward@dedale.net; Lowe, Andrew [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: alowe@dedale.net

    2007-09-15

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  1. A new methodology for building energy benchmarking: An approach based on clustering concept and statistical models

    Science.gov (United States)

    Gao, Xuefeng

    Though many building energy benchmarking programs have been developed during the past decades, they hold certain limitations. The major concern is that they may cause misleading benchmarking due to not fully considering the impacts of the multiple features of buildings on energy performance. The existing methods classify buildings according to only one of many features of buildings -- the use type, which may result in a comparison between two buildings that are tremendously different in other features and not properly comparable as a result. This research aims to tackle this challenge by proposing a new methodology based on the clustering concept and statistical analysis. The clustering concept, which reflects on machine learning algorithms, classifies buildings based on a multi-dimensional domain of building features, rather than the single dimension of use type. Buildings with the greatest similarity of features that influence energy performance are classified into the same cluster, and benchmarked according to the centroid reference of the cluster. Statistical analysis is applied to find the most influential features impacting building energy performance, as well as provide prediction models for the new design energy consumption. The proposed methodology as applicable to both existing building benchmarking and new design benchmarking was discussed in this dissertation. The former contains four steps: feature selection, clustering algorithm adaptation, results validation, and interpretation. The latter consists of three parts: data observation, inverse modeling, and forward modeling. The experimentation and validation were carried out for both perspectives. It was shown that the proposed methodology could account for the total building energy performance and was able to provide a more comprehensive approach to benchmarking. In addition, the multi-dimensional clustering concept enables energy benchmarking among different types of buildings, and inspires a new
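
    The clustering step described above can be sketched with k-means over standardized building features, benchmarking each building against its own cluster's reference. The features, EUI values and the choice of k in the Python sketch below are illustrative assumptions, not the dissertation's data.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical building features: floor area (m2), occupancy (h/week),
        # plug-load density (W/m2), vintage (year); EUI = energy use intensity.
        features = np.array([
            [12000, 60, 10, 1995],
            [ 3500, 45,  6, 2008],
            [25000, 80, 14, 1987],
            [ 4000, 50,  7, 2012],
            [11000, 55,  9, 2001],
            [26000, 84, 15, 1990],
        ], dtype=float)
        eui = np.array([180.0, 95.0, 240.0, 88.0, 160.0, 255.0])   # kWh/m2/yr, illustrative

        X = StandardScaler().fit_transform(features)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

        # Benchmark each building against the mean EUI of its own cluster (centroid reference).
        for cluster in range(km.n_clusters):
            members = km.labels_ == cluster
            ref = eui[members].mean()
            for i in np.where(members)[0]:
                print(f"building {i}: EUI {eui[i]:.0f} vs cluster reference {ref:.0f}")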

  2. Optimization of cocoa nib roasting based on sensory properties and colour using response surface methodology

    Directory of Open Access Journals (Sweden)

    D.M.H. A.H. Farah

    2012-05-01

    Full Text Available Roasting of cocoa beans is a critical stage for the development of their desirable flavour, aroma and colour. Prior to roasting, cocoa beans may taste astringent, bitter, acidy, musty, unclean, nutty or even chocolate-like, depending on the bean sources and their preparation. After roasting, the beans possess a typical intense cocoa flavour. The Maillard or non-enzymatic browning reaction is a very important process for the development of cocoa flavour, which occurs primarily during roasting, and it has generally been agreed that the formation of the main flavour components, pyrazines, is associated with this reaction involving amino acids and reducing sugars. The effect of cocoa nib roasting conditions on the sensory properties and colour of cocoa beans was investigated in this study. Roasting conditions in terms of temperature, ranging from 110 to 160 °C, and time, ranging from 15 to 40 min, were optimized by using response surface methodology based on cocoa sensory characteristics including chocolate aroma, acidity, astringency, burnt taste and overall acceptability. The analyses used a 9-point hedonic scale with twelve trained panelists. The changes in colour due to the roasting conditions were also monitored using a chromameter. Results of this study showed that the sensory quality of cocoa liquor increased with increasing roasting time and temperature, up to 160 °C and 40 min, respectively. Based on the response surface methodology, the optimized operating condition for the roaster was a temperature of 127 °C and a time of 25 min. The proposed roasting conditions were able to produce superior quality cocoa beans that will be very useful for cocoa manufacturers. Keywords: cocoa, cocoa liquor, flavour, aroma, colour, sensory characteristics, response surface methodology.
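
    The RSM step can be sketched by fitting a second-order response surface for overall acceptability in temperature and time and locating its maximum. The design points and scores in the Python sketch below are hypothetical, not the study's data.

        import numpy as np

        # Hypothetical central-composite-style data: roasting temperature (°C), time (min)
        # and panel overall acceptability on a 9-point scale.
        T = np.array([110, 110, 160, 160, 135, 135, 120, 150, 135])
        t = np.array([ 15,  40,  15,  40,  27,  27,  20,  35,  27])
        oaa = np.array([5.1, 5.8, 6.0, 5.2, 7.1, 7.0, 6.2, 6.4, 7.2])

        # Second-order response surface: OAA = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
        A = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T*t]).astype(float)
        coef, *_ = np.linalg.lstsq(A, oaa, rcond=None)

        # Locate the optimum on a grid over the experimental region.
        Tg, tg = np.meshgrid(np.linspace(110, 160, 101), np.linspace(15, 40, 101))
        pred = (coef[0] + coef[1]*Tg + coef[2]*tg
                + coef[3]*Tg**2 + coef[4]*tg**2 + coef[5]*Tg*tg)
        i = np.unravel_index(np.argmax(pred), pred.shape)
        print(f"predicted optimum: {Tg[i]:.0f} °C, {tg[i]:.0f} min, OAA ≈ {pred[i]:.2f}")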

  3. Methodology for establishing life curves based on condition monitoring data and expert judgements

    Energy Technology Data Exchange (ETDEWEB)

    Eggen, Arnt Ove; Welte, Thomas; Susa, Dejan

    2010-04-15

    Remaining lifetime and probability of failure of components in the power system are important issues for the planning of maintenance and refurbishment. The concept of life curves provides an approach to utilizing information about the technical condition (condition states) for modelling component degradation and for calculating remaining lifetime and failure probability. This technical report describes methodologies for establishing life curves based on condition monitoring data and/or expert judgement. The methodologies presented in this report can be applied to different types of components and failure mechanisms. Thus, life curves provide a generic approach to degradation and lifetime modelling. The report describes methodologies that can be used to establish life curves based on different sources of information, such as judgements provided by one or several experts, or condition monitoring data. Furthermore, establishing life curves using a combination of expert judgements and condition monitoring data is also described. Because different failure mechanisms and components are assessed separately by the life curve approach, the report also shows how analysis results can be aggregated. In addition, references are given to tools and software prototypes that can be used to establish the life curves, to calculate failure probability and remaining lifetime, and to aggregate the results. The report also presents a number of examples of life curves for selected components and failure mechanisms. The examples include different components of power transformers and circuit breakers. Furthermore, a study on water-tree degraded XLPE cables and a case study on wood poles are presented. These examples show the use of the life curve approach and can be used as a basis for one's own analysis. A file is attached to this report in which some of the life curve examples are collected. (Author)
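
    A life curve can be sketched as an age-to-condition-state mapping from which remaining lifetime is read off. The anchor points in the Python sketch below are hypothetical expert judgements, not values from the report.

        import numpy as np

        # A life curve maps component age to an expected condition state
        # (1 = as new ... 4 = failure imminent). Anchor points are illustrative.
        age_points   = np.array([0.0, 15.0, 30.0, 40.0, 45.0])   # years
        state_points = np.array([1.0,  1.5,  2.5,  3.5,  4.0])   # condition state

        def remaining_life(observed_state, failure_state=4.0):
            # Enter the curve at the observed condition state and read the ages at
            # which that state and the failure state are expected to be reached.
            age_now = np.interp(observed_state, state_points, age_points)
            age_fail = np.interp(failure_state, state_points, age_points)
            return age_fail - age_now

        print(f"remaining life for condition state 2.0 ≈ {remaining_life(2.0):.1f} years")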

  4. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
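
    A minimal sketch of the core RBI logic (risk = PoF x CoF mapped to an inspection interval) is given below in Python; the damage-factor mapping, thresholds and intervals are illustrative assumptions, not the DNV/API model values.

        # Toy risk-based inspection sketch for a tank floor.
        def prob_of_failure(corrosion_rate_mm_yr, remaining_thickness_mm, years_since_inspection):
            # Crude damage-factor style estimate: fraction of corrosion allowance consumed.
            consumed = corrosion_rate_mm_yr * years_since_inspection / remaining_thickness_mm
            return min(1.0, max(1e-4, consumed ** 2))    # toy monotone mapping to [0, 1]

        def risk_category(pof, cof_usd):
            score = pof * cof_usd
            if score > 1e5:
                return "high", 2     # inspect every 2 years
            if score > 1e4:
                return "medium", 5
            return "low", 10

        pof = prob_of_failure(corrosion_rate_mm_yr=0.3, remaining_thickness_mm=6.0,
                              years_since_inspection=8)
        category, interval = risk_category(pof, cof_usd=2.0e6)
        print(f"PoF={pof:.3f}, risk={category}, next floor inspection in {interval} years")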

  5. Flow methodology for methanol determination in biodiesel exploiting membrane-based extraction

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Andre R.T.S. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal); Saraiva, M. Lucia M.F.S. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal)], E-mail: lsaraiva@ff.up.pt; Lima, Jose L.F.C. [REQUIMTE, Departamento de Quimica-Fisica, Faculdade de Farmacia da Universidade do Porto, Rua Anibal Cunha 164, 4099-030 Porto (Portugal); Korn, M. Gracas A. [Grupo de Pesquisa em Quimica Analitica, Instituto de Quimica, Universidade Federal da Bahia, Campus de Ondina, 40170-290 Salvador, Bahia (Brazil)

    2008-04-21

    A methodology based on flow analysis and membrane-based extraction has been applied to the determination of methanol in biodiesel samples. A hydrophilic membrane was used to perform the liquid-liquid extraction in the system, with the organic sample fed to the donor side of the membrane and the methanol transferred to an aqueous acceptor buffer solution. The quantification of the methanol was then achieved in aqueous solution by the combined use of immobilised alcohol oxidase (AOD), soluble peroxidase and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS). The optimization of parameters such as the type of membrane, the groove volume and configuration of the membrane unit, the appropriate organic solvent, the sample injection volume, as well as the immobilised packed AOD reactor was performed. Two dynamic analytical working ranges were achieved, up to 0.015% and up to 0.200% (m/m) methanol concentrations, just by changing the volume of the acceptor aqueous solution. Detection limits of 0.0002% (m/m) and 0.007% (m/m) methanol were estimated, respectively. The decision limit (CCα) and the detection capacity (CCβ) were 0.206 and 0.211% (m/m), respectively. The developed methodology showed good precision, with a relative standard deviation (R.S.D.) <5.0% (n = 10). Biodiesel samples from different sources were then directly analyzed without any sample pre-treatment. Statistical evaluation showed good compliance, at the 95% confidence level, between the results obtained with the flow system and those furnished by the gas chromatography reference method. The proposed methodology turns out to be more environmentally friendly and cost-effective than the reference method.

  6. Measurement-based auralization methodology for the assessment of noise mitigation measures

    Science.gov (United States)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditive preview of noise abatement measures for road traffic noise, based on the direction dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the followed approaches.

  7. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  8. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    Science.gov (United States)

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
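
    The moving least squares decoupling step can be illustrated with a small locally weighted regression from field readings to forces. In the Python sketch below, the calibration data, bandwidth h and linear field-to-force map are synthetic assumptions, not the MagOne calibration.

        import numpy as np

        # Synthetic calibration set: 3-axis field readings and corresponding 3-axis forces.
        rng = np.random.default_rng(1)
        B_cal = rng.uniform(-1.0, 1.0, size=(200, 3))                  # field samples (a.u.)
        true_map = np.array([[2.0, 0.1, 0.0],
                             [0.0, 1.8, 0.2],
                             [0.1, 0.0, 3.0]])
        F_cal = B_cal @ true_map.T + 0.02 * rng.normal(size=(200, 3))  # forces (mN)

        def mls_force(b_query, h=0.3):
            # Moving least squares: Gaussian-weighted local affine fit around the query point.
            d2 = np.sum((B_cal - b_query) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * h * h))
            A = np.column_stack([np.ones(len(B_cal)), B_cal])
            sw = np.sqrt(w)[:, None]
            coef, *_ = np.linalg.lstsq(sw * A, sw * F_cal, rcond=None)
            return np.concatenate([[1.0], b_query]) @ coef             # predicted 3-axis force

        print(mls_force(np.array([0.2, -0.5, 0.7])))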

  9. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  10. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    Directory of Open Access Journals (Sweden)

    Hongbo Wang

    2016-08-01

    Full Text Available Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.

  11. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    Science.gov (United States)

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  12. Satellite based radar interferometry to estimate large-scale soil water depletion from clay shrinkage: possibilities and limitations

    NARCIS (Netherlands)

    Brake, te B.; Hanssen, R.F.; Ploeg, van der M.J.; Rooij, de G.H.

    2013-01-01

    Satellite-based radar interferometry is a technique capable of measuring small surface elevation changes at large scales and with a high resolution. In vadose zone hydrology, it has been recognized for a long time that surface elevation changes due to swell and shrinkage of clayey soils can serve as

  13. Ozone depletion, paradigms, and politics

    Energy Technology Data Exchange (ETDEWEB)

    Iman, R.L.

    1993-10-01

    The destruction of the Earth's protective ozone layer is a prime environmental concern. Industry has responded to this environmental problem by: implementing conservation techniques to reduce the emission of ozone-depleting chemicals (ODCs); using alternative cleaning solvents that have lower ozone depletion potentials (ODPs); developing new, non-ozone-depleting solvents, such as terpenes; and developing low-residue soldering processes. This paper presents an overview of a joint testing program at Sandia and Motorola to evaluate a low-residue (no-clean) soldering process for printed wiring boards (PWBs). Such processes are in widespread use in commercial applications because they eliminate the cleaning operation. The goal of this testing program was to develop a data base that could be used to support changes in the mil-specs. In addition, a joint task force involving industry and the military has been formed to conduct a follow-up evaluation of low-residue processes that encompass the concerns of the tri-services. The goal of the task force is to gain final approval of the low-residue technology for use in military applications.

  14. Attitudes toward simulation-based learning in nursing students: an application of Q methodology.

    Science.gov (United States)

    Yeun, Eun Ja; Bang, Ho Yoon; Ryoo, Eon Na; Ha, Eun-Ho

    2014-07-01

    Simulation-based learning (SBL) is a highly advanced educational method that promotes technical/non-technical skills, increases team competency, and increases health care team interaction in a safe health care environment with no potential for harm to the patient. Even though students may experience the same simulation, their reactions are not necessarily uniform. This study aims at identifying the diversely perceived attitudes of undergraduate nursing students toward simulation-based learning. The study used a Q methodology design, which analyzes the subjectivity of each type of attitude. Data were collected from 22 undergraduate nursing students who had experienced simulation-based learning before going to the clinical setting. The 45 selected Q-statements from each of the 22 participants were classified into the shape of a normal distribution using a 9-point scale. The collected data were analyzed using the pc-QUANL program. The results revealed two discrete groups of student attitudes toward simulation-based learning: 'adventurous immersion' and 'constructive criticism'. The findings revealed that teaching and learning strategies based on the two factors of attitudes could beneficially contribute to the customization of simulation-based learning. In nursing education and clinical practice, teaching and learning strategies based on types I and II can be used to refine an alternative learning approach that supports and complements clinical practice. Recommendations have been provided based on the findings.

  15. 1Click1View: Interactive Visualization Methodology for RNAi Cell-Based Microscopic Screening

    Directory of Open Access Journals (Sweden)

    Lukasz Zwolinski

    2013-01-01

    Full Text Available Technological advancements are constantly increasing the size and complexity of data resulting from large-scale RNA interference screens. This fact has led biologists to ask complex questions, which the existing, fully automated analyses are often not adequate to answer. We present the concept of 1Click1View (1C1V) as a methodology for interactive analytic software tools. 1C1V can be applied for two-dimensional visualization of image-based screening data sets from High Content Screening (HCS). Through an easy-to-use interface, a one-click, one-view concept, and a workflow-based architecture, the visualization method facilitates the linking of image data with numeric data. Such a method utilizes state-of-the-art interactive visualization tools optimized for fast visualization of large-scale image data sets. We demonstrate our method on an HCS dataset consisting of multiple cell features from two screening assays.

  16. Vision-based methodology for collaborative management of qualitative criteria in design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    2006-01-01

    A Vision-based methodology is proposed as part of the management of qualitative criteria for design in early phases of the product development process for team-based organisations. Focusing on abstract values and qualities for the product establishes a shared vision for the product amongst team...... members. Two anchor points are used for representing these values and qualities, the Value Mission and the Interaction Vision. Qualifying the meaning of these words through triangulation of methods develops a shared mental model within the team. The composition of keywords within the Vision and Mission...... establishes a field of tension that summarises the abstract criteria and pinpoints the desired uniqueness of the product. The Interaction Vision allows the team members to design the behaviour of the product without deciding on physical features, thus focusing on the cognitive aspects of the product...

  17. Application of the Simulation Based Reliability Analysis on the LBB methodology

    Directory of Open Access Journals (Sweden)

    Pečínka L.

    2008-11-01

    Full Text Available Guidelines on how to demonstrate the existence of Leak Before Break (LBB) have been developed in many western countries. These guidelines, partly based on NUREG/CR-6765, define the steps that should be fulfilled to obtain a conservative assessment of LBB acceptability. As a complement, and also to help identify the key parameters that influence the resulting leakage and failure probabilities, the application of Simulation Based Reliability Analysis is under development. The methodology used will be demonstrated on the assessment of through-wall leakage crack stability according to the R6 method. R6 is a well-known engineering assessment procedure for the evaluation of the integrity of a flawed structure. The influence of thermal ageing and of a seismic event has also been elaborated.
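
    The simulation-based reliability idea can be sketched as Monte Carlo estimation of a failure probability from a stress-versus-resistance limit state. The distributions and limit state in the Python sketch below are illustrative only and do not reproduce the R6 assessment.

        import numpy as np

        # Estimate the probability that the applied stress intensity exceeds the
        # material fracture resistance; parameters are illustrative placeholders.
        rng = np.random.default_rng(42)
        n = 1_000_000

        K_applied = rng.lognormal(mean=np.log(40.0), sigma=0.15, size=n)   # MPa*sqrt(m)
        K_material = rng.normal(loc=90.0, scale=12.0, size=n)              # fracture toughness

        failures = K_applied >= K_material
        pf = failures.mean()
        se = np.sqrt(pf * (1.0 - pf) / n)
        print(f"failure probability ≈ {pf:.2e} ± {se:.1e} (1 sigma, Monte Carlo)")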

  18. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    Science.gov (United States)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  19. Standardization of formulations for the acute amino acid depletion and loading tests.

    Science.gov (United States)

    Badawy, Abdulla A-B; Dougherty, Donald M

    2015-04-01

    The acute tryptophan depletion and loading and the acute tyrosine plus phenylalanine depletion tests are powerful tools for studying the roles of cerebral monoamines in behaviour and symptoms related to various disorders. The tests use either amino acid mixtures or proteins. Current amino acid mixtures lack specificity in humans, but not in rodents, because of the faster disposal of branched-chain amino acids (BCAAs) by the latter. The high content of BCAA (30-60%) is responsible for the poor specificity in humans and we recommend, in a 50g dose, a control formulation with a lowered BCAA content (18%) as a common control for the above tests. With protein-based formulations, α-lactalbumin is specific for acute tryptophan loading, whereas gelatine is only partially effective for acute tryptophan depletion. We recommend the use of the whey protein fraction glycomacropeptide as an alternative protein. Its BCAA content is ideal for specificity and the absence of tryptophan, tyrosine and phenylalanine render it suitable as a template for seven formulations (separate and combined depletion or loading and a truly balanced control). We invite the research community to participate in standardization of the depletion and loading methodologies by using our recommended amino acid formulation and developing those based on glycomacropeptide.

  20. Performance-Based Adaptive Gradient Descent Optimal Coefficient Fuzzy Sliding Mode Methodology

    Directory of Open Access Journals (Sweden)

    Hossein Rezaie

    2012-10-01

    Full Text Available Designing a nonlinear controller for second-order nonlinear uncertain dynamical systems is the main challenge addressed in this paper. The paper focuses on the design and analysis of a chattering-free, Mamdani fuzzy-based tuning, gradient descent optimal, error-based fuzzy sliding mode controller for a highly nonlinear six-degrees-of-freedom robot manipulator in the presence of uncertainties. Although the pure sliding mode controller is used in many applications, it has two important drawbacks, namely the chattering phenomenon and the nonlinear equivalent dynamic formulation under uncertain dynamic parameters. In order to handle the uncertain nonlinear dynamic parameters, to be easy to implement and to avoid a mathematical model-based controller, a Mamdani performance/error-based fuzzy logic methodology with two inputs, one output and 49 rules is applied to the pure sliding mode controller. Both the pure sliding mode controller and the error-based fuzzy sliding mode controller have difficulty in handling unstructured model uncertainties. To solve this problem, a fuzzy-based tuning method is applied to the error-based fuzzy sliding mode controller to adjust the sliding surface gain, which is tuned by gradient descent optimization. The resulting fuzzy-based tuning gradient descent optimal error-based fuzzy sliding mode controller is a stable, model-free controller which eliminates the chattering phenomenon without using the boundary-layer saturation function. Lyapunov stability is proved for the controller based on the switching (sign) function. The controller shows acceptable performance in the presence of uncertainty (e.g., overshoot = 0%, rise time = 0.8 s, steady-state error = 1e-9 and RMS error = 1.8e-12).

  1. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    Science.gov (United States)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction with the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, environmental impacts etc. This requirement can be met by a decision support framework based on near-optimal "satisficing" as opposed to purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affect the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as

  2. Testing fully depleted CCD

    Science.gov (United States)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE on the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.

  3. Application of machine learning methodology for PET-based definition of lung cancer.

    Science.gov (United States)

    Kerhet, A; Small, C; Quon, H; Riauka, T; Schrader, L; Greiner, R; Yee, D; McEwan, A; Roa, W

    2010-02-01

    We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (NSCLC) tumours in positron-emission tomography-computed tomography (PET-CT) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a PET-CT and a treatment-planning CT image. The reference gross tumour volume (GTV) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (SUV) thresholds that most closely approximated the GTV contour on each slice. A set of uptake distribution-related attributes was calculated for each PET slice. A machine learning algorithm was trained on a subset of the PET slices to cope with slice-to-slice variation in the optimal SUV threshold: that is, to predict the most appropriate SUV threshold from the calculated attributes for each slice. The algorithm's performance was evaluated using the remainder of the PET slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference SUV thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in NSCLC.
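
    The training step can be sketched as a regression from per-slice uptake attributes to the oncologist-derived SUV threshold. In the Python sketch below, the features, synthetic targets and the random-forest choice are assumptions for illustration, not the paper's algorithm or data.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        # Synthetic per-slice attributes and reference thresholds.
        rng = np.random.default_rng(3)
        n_slices = 120
        features = np.column_stack([
            rng.uniform(2, 20, n_slices),       # SUVmax of the slice
            rng.uniform(1, 8, n_slices),        # mean uptake above background
            rng.uniform(5, 400, n_slices),      # hot-spot area (pixels)
        ])
        ref_threshold = 0.35 * features[:, 0] + rng.normal(0, 0.3, n_slices)   # oncologist-derived

        train, test = np.arange(80), np.arange(80, n_slices)
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(features[train], ref_threshold[train])
        pred = model.predict(features[test])

        mae = np.mean(np.abs(pred - ref_threshold[test]))
        print(f"mean absolute error of predicted SUV threshold: {mae:.2f}")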

  4. Methodology for definition of yellow fever priority areas, based on environmental variables and multiple correspondence analyses.

    Science.gov (United States)

    Moreno, Eduardo Stramandinoli; Barata, Rita de Cássia Barradas

    2012-01-01

    Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Paulo, Brazil. The municipalities were divided into two groups (affected and unaffected by YF) and compared based on environmental parameters related to the disease's eco-epidemiology. Bivariate analysis was used to identify statistically significant associations between the variables and virus circulation. Multiple correspondence analysis (MCA) was used to evaluate the relationship among the variables and their contribution to the dynamics of YF in Sao Paulo. The MCA generated a factor that was able to differentiate between affected and unaffected municipalities and was used to determine risk levels. This methodology can be replicated in other regions, standardized, and adapted to each context.
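
    The MCA step can be sketched as correspondence analysis of the one-hot indicator matrix, with the first factor score used to separate municipalities. The categorical variables and data in the Python sketch below are hypothetical placeholders, not the study's variables.

        import numpy as np
        import pandas as pd

        # Hypothetical categorical environmental variables per municipality.
        df = pd.DataFrame({
            "vegetation": ["forest", "forest", "pasture", "savanna", "forest", "pasture"],
            "altitude":   ["low", "low", "low", "high", "low", "high"],
            "primates":   ["present", "present", "absent", "absent", "present", "absent"],
        })
        Z = pd.get_dummies(df).to_numpy(float)            # indicator (disjunctive) matrix
        P = Z / Z.sum()
        r, c = P.sum(axis=1), P.sum(axis=0)               # row and column masses
        S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
        U, s, Vt = np.linalg.svd(S, full_matrices=False)
        factor1 = (U[:, 0] * s[0]) / np.sqrt(r)           # first-factor score per municipality

        risk_level = np.where(factor1 > np.median(factor1), "higher priority", "lower priority")
        print(list(zip(factor1.round(2), risk_level)))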

  5. Ni-based Superalloy Development for VHTR - Methodology Using Design of Experiments and Thermodynamic Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Woo; Kim, Dong Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In this work, to develop novel structural materials for the IHX of a VHTR, a more systematic methodology using the design of experiments (DOE) and thermodynamic calculations was proposed. For 32 sets of designs of Ni-Cr-Co-Mo alloys with minor elements of W and Ta, the mass fraction of TCP phases and the mechanical properties were calculated, and finally the chemical composition was optimized for further experimental studies by applying the proposed methodology. The highly efficient generation of electricity and the production of massive amounts of hydrogen are possible using a very high temperature gas-cooled reactor (VHTR) among the generation IV nuclear power plants. The structural material for an intermediate heat exchanger (IHX), among numerous components, should endure temperatures of up to 950 °C during long-term operation. Impurities inevitably introduced in the helium coolant facilitate material degradation by corrosion at high temperature. This work concerns a methodology for developing a Ni-Cr-Co-Mo based superalloy for the VHTR using the design of experiments (DOE) and thermodynamic calculations.

  6. Methodology for definition of yellow fever priority areas, based on environmental variables and multiple correspondence analyses.

    Directory of Open Access Journals (Sweden)

    Eduardo Stramandinoli Moreno

    Full Text Available Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Paulo, Brazil. The municipalities were divided into two groups (affected and unaffected by YF) and compared based on environmental parameters related to the disease's eco-epidemiology. Bivariate analysis was used to identify statistically significant associations between the variables and virus circulation. Multiple correspondence analysis (MCA) was used to evaluate the relationship among the variables and their contribution to the dynamics of YF in Sao Paulo. The MCA generated a factor that was able to differentiate between affected and unaffected municipalities and was used to determine risk levels. This methodology can be replicated in other regions, standardized, and adapted to each context.

  7. A GIS – Based Methodology for Land Suitability Evaluation in Veneto (NE Italy

    Directory of Open Access Journals (Sweden)

    Alba Gallo

    2014-12-01

    Full Text Available For almost ten years, the Soil Science Research Group in Venice has been carrying out studies on the characterization of soils in the Veneto region and their suitability for specific uses. Several areas have been investigated with the aim of selecting the best land use for a sustainable environment. The scenarios taken into consideration range from the Alpine and pre-Alpine region to the alluvial plain. Attention has been focused especially on land suitability for forestry, typical and niche crops, pasture and vineyard. The land evaluation procedure has been applied through a GIS-based methodology. Today, GIS techniques are essential for correct and fast work on the interpretation and processing of soil data and its display in map form. By integrating this information with crop and soil requirements by means of "matching tables", it was possible to edit and manage land suitability maps for specific purposes. The applied methodology proved to be a useful and effective tool for sustainable land management.
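
    The "matching table" idea can be sketched as a most-limiting-factor comparison between mapped land qualities and crop requirements. The attribute names, thresholds and FAO-style classes in the Python sketch below are illustrative assumptions, not the Veneto study's tables.

        # Toy matching table for vineyard suitability: attribute -> (S1, S2, S3 limits).
        vineyard_requirements = {
            "rooting_depth_cm": (100, 60, 40),      # deeper is better
            "drainage_class":   (1, 2, 3),          # 1 = well drained ... 4 = poor
            "slope_pct":        (15, 25, 35),       # flatter is better
        }

        def suitability(unit):
            order = ["S1", "S2", "S3", "N"]
            worst = "S1"
            for attr, (s1, s2, s3) in vineyard_requirements.items():
                value = unit[attr]
                if attr == "rooting_depth_cm":      # benefit attribute
                    cls = "S1" if value >= s1 else "S2" if value >= s2 else "S3" if value >= s3 else "N"
                else:                               # limitation attributes
                    cls = "S1" if value <= s1 else "S2" if value <= s2 else "S3" if value <= s3 else "N"
                if order.index(cls) > order.index(worst):
                    worst = cls                     # the most limiting factor governs
            return worst

        map_unit = {"rooting_depth_cm": 75, "drainage_class": 2, "slope_pct": 18}
        print("vineyard suitability:", suitability(map_unit))    # -> S2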

  8. Rapid and Robust PCR-Based All-Recombinant Cloning Methodology.

    Directory of Open Access Journals (Sweden)

    Abhishek Anil Dubey

    Full Text Available We report here a PCR-based cloning methodology that requires no post-PCR modifications such as restriction digestion and phosphorylation of the amplified DNA. The advantage of the present method is that it yields only recombinant clones, thus eliminating the need for screening. Two DNA amplification reactions by PCR are performed, wherein the first reaction amplifies the gene of interest from a source template, and the second reaction fuses it with the designed expression vector fragments. These vector fragments carry the essential elements that are required for selection of the fusion product. The entire process can be completed in less than 8 hours. Furthermore, ligation of the amplified DNA by a DNA ligase is not required before transformation, although the procedure yields a greater number of colonies upon transformation if ligation is carried out. As a proof of concept, we show the cloning and expression of the GFP, adh, and rho genes. Using GFP production as an example, we further demonstrate that the E. coli T7 Express strain can be used directly in our methodology for protein expression immediately after PCR. The expressed protein is produced with or without a 6xHistidine tag at either terminus, depending upon the chosen vector fragments. We believe that our method will find tremendous use in molecular and structural biology.

  9. A Statistical Methodology for Determination of Safety Systems Actuation Setpoints Based on Extreme Value Statistics

    Directory of Open Access Journals (Sweden)

    D. R. Novog

    2008-01-01

    Full Text Available This paper provides a novel and robust methodology for determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as accounting for the variations in operating states that periodically occur. Further it demonstrates that in performing best estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties in setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements as well as operational variations which lead to time changes in the true value of initial conditions must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety systems instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties which has not previously been investigated.
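
    One common way to fold input and measurement uncertainties into a setpoint is a non-parametric 95/95 tolerance bound (Wilks' order-statistic approach). The Python sketch below illustrates that generic idea with a placeholder plant response; it is not the paper's specific extreme-value procedure.

        import numpy as np

        rng = np.random.default_rng(7)
        n_trials = 59                     # smallest n with 1 - 0.95**n >= 0.95

        def peak_parameter(power_error, flow_error, gain_error):
            # Placeholder surrogate for the simulated peak of the trip parameter.
            return 100.0 * (1.0 + power_error) / (1.0 + flow_error) * (1.0 + gain_error)

        samples = peak_parameter(
            power_error=rng.normal(0.0, 0.02, n_trials),
            flow_error=rng.normal(0.0, 0.03, n_trials),
            gain_error=rng.uniform(-0.01, 0.01, n_trials),
        )
        bound_95_95 = samples.max()       # one-sided 95% probability / 95% confidence bound
        print(f"95/95 bound on the trip parameter peak: {bound_95_95:.1f} % of nominal")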

  10. A methodology for evaluating potential KBS (Knowledge-Based Systems) applications

    Energy Technology Data Exchange (ETDEWEB)

    Melton, R.B.; DeVaney, D.M.; Whiting, M.A.; Laufmann, S.C.

    1989-06-01

    It is often difficult to assess how well Knowledge-Based Systems (KBS) techniques and paradigms may be applied to automating various tasks. This report describes the approach and organization of an assessment procedure that involves two levels of analysis. Level One can be performed by individuals with little technical expertise relative to KBS development, while Level Two is intended to be used by experienced KBS developers. The two levels review four groups of issues: goals, appropriateness, resources, and non-technical considerations. The criteria that are important at each step in the assessment are identified. A qualitative methodology for scoring the task relative to the assessment criteria is provided to allow analysts to make better informed decisions with regard to the potential effectiveness of applying KBS technology. In addition to this documentation, the assessment methodology has been implemented for personal computer use with the HYPERCARD™ software on a Macintosh™ computer. This interactive mode facilitates small-group analysis of potential KBS applications and permits a non-sequential appraisal with provisions for automated note-keeping and question scoring. The results provide a useful tool for assessing the feasibility of using KBS techniques in performing tasks in support of treaty verification or IC functions. 13 refs., 3 figs.

  11. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries was extracted using the Collection Evaluation application. The data was aggregated and filtered to assess how the sample’s titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample’s titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that distribution of the aggregated collection conformed to a Power Law distribution (80/20) so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories that were found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help to improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help to link subjective decision making with a scientifically based approach to managing knowledge
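
    The 80/20 check at the heart of the tier analysis is easy to sketch. The snippet below uses a tiny, made-up list of subject labels purely to show the mechanics: rank categories by title count, accumulate their shares, and see how few categories cover 80% of the titles.

        from collections import Counter

        # Hypothetical aggregated WorldCat export: one Conspectus-style label per title.
        titles = ["Cooking", "Cooking", "Local History", "Gardening", "Cooking",
                  "True Crime", "Gardening", "Cooking", "Health", "Local History"]

        counts = Counter(titles)
        ranked = counts.most_common()            # categories sorted by emphasis
        total = sum(counts.values())

        cumulative = 0
        categories_needed = 0
        for category, n in ranked:
            cumulative += n
            categories_needed += 1
            if cumulative / total >= 0.80:       # the 80/20 check
                break

        print(f"{categories_needed} of {len(ranked)} categories hold 80% of the titles")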

  12. Depletion of intense fields

    CERN Document Server

    Seipt, D; Marklund, M; Bulanov, S S

    2016-01-01

    The interaction of charged particles and photons with intense electromagnetic fields gives rise to multi-photon Compton and Breit-Wheeler processes. These are usually described in the framework of the external field approximation, where the electromagnetic field is assumed to have infinite energy. However, the multi-photon nature of these processes implies the absorption of a significant number of photons, which scales as the external field amplitude cubed. As a result, the interaction of a highly charged electron bunch with an intense laser pulse can lead to significant depletion of the laser pulse energy, thus rendering the external field approximation invalid. We provide relevant estimates for this depletion and find it to become important in the interaction between fields of amplitude $a_0 \\sim 10^3$ and electron bunches with charges of the order of nC.

  13. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

Full Text Available In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IPs/SoCs. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  14. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan

    2010-01-01

In this work, a framework for the simultaneous solution of design and control problems is presented. Within this framework, two methodologies are presented, the integrated process design and controller design (IPDC) methodology and the process-group contribution (PGC) methodology. The concepts of attainable region (AR), driving force (DF), process-group (PG) and reverse simulation are used within these methodologies. The IPDC methodology is used to find the optimal design-control strategy of a process by locating the maximum point in the AR and DF diagrams for reactor and separator, respectively. The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  15. Methodology of determining the uncertainty in the accessible geothermal resource base of identified hydrothermal convection systems

    Science.gov (United States)

    Nathenson, Manuel

    1978-01-01

    In order to quantify the uncertainty of estimates of the geothermal resource base in identified hydrothermal convection systems, a methodology is presented for combining estimates with uncertainties for temperature, area, and thickness of a geothermal reservoir into an estimate of the stored energy with uncertainty. Probability density functions for temperature, area, and thickness are assumed to be triangular in form. In order to calculate the probability distribution function for the stored energy in a single system or in many systems, a computer program for aggregating the input distribution functions using the Monte-Carlo method has been developed. To calculate the probability distribution of stored energy in a single system, an analytical expression is also obtained that is useful for calibrating the Monte Carlo approximation. For the probability distributions of stored energy in a single and in many systems, the central limit approximation is shown to give results ranging from good to poor.
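
    The aggregation step lends itself to a compact numerical sketch. The snippet below assumes triangular distributions with entirely hypothetical minimum/mode/maximum values for one system and a simple stored-energy relation (volumetric heat capacity times volume times temperature rise above a reference); it only illustrates the Monte Carlo machinery, not the report's actual inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 200_000

        # Triangular (min, mode, max) inputs for one hypothetical hydrothermal system.
        T = rng.triangular(150.0, 180.0, 220.0, N)    # reservoir temperature, deg C
        A = rng.triangular(2.0, 5.0, 9.0, N)          # reservoir area, km^2
        h = rng.triangular(1.0, 1.5, 2.5, N)          # reservoir thickness, km

        T_ref = 15.0          # reference temperature, deg C
        rho_c = 2.7e15        # volumetric heat capacity, J/(km^3 K), illustrative value

        stored_energy = rho_c * A * h * (T - T_ref)   # one sample per Monte Carlo trial, J

        # Several systems would be aggregated by summing their per-trial samples.
        p5, p50, p95 = np.percentile(stored_energy, [5, 50, 95])
        print(f"stored energy: median {p50:.2e} J, 90% interval [{p5:.2e}, {p95:.2e}] J")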

  16. Generic Methodology for Field Calibration of Nacelle-Based Wind Lidars

    DEFF Research Database (Denmark)

    Borraccino, Antoine; Courtney, Michael; Wagner, Rozenn

    2016-01-01

by the geometry of the scanning trajectory and the lidar inclination. The line-of-sight velocity is calibrated in atmospheric conditions by comparing it to a reference quantity based on classic instrumentation such as cup anemometers and wind vanes. The generic methodology was tested on two commercially developed lidars, one continuous-wave and one pulsed system, and provides consistent calibration results: linear regressions show a difference of ∼0.5% between the lidar-measured and reference line-of-sight velocities. A comprehensive uncertainty procedure propagates the reference uncertainty to the lidar measurements. At a coverage factor of two, the estimated line-of-sight velocity uncertainty ranges from 3.2% at 3 m·s-1 to 1.9% at 16 m·s-1. Most of the line-of-sight velocity uncertainty originates from the reference: the cup anemometer uncertainty accounts for 90% of the total uncertainty. The propagation...

  17. Homeland security R&D roadmapping : risk-based methodological options.

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Larry D.

    2008-12-01

    The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

  18. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. And the fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on the fault tree analysis and dualistic contrast is achieved. An application on the emergency diesel generator in the nuclear power plant is given to illustrate the proposed method.

  19. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

Property prediction models are a fundamental tool of process modeling and analysis, especially at the early stage of process development. Furthermore, property prediction models are the fundamental tool for Computer-aided molecular design used for the development of new refrigerants. Group contribution (GC) based prediction methods use structurally dependent parameters in order to determine the property of pure components. The aim of the GC parameter estimation is to find the best possible set of model parameters that fits the experimental data. In that sense, there is often a lack of attention... The GC model uses the Marrero-Gani (MR) method, which considers the group contribution at different levels, both functional and structural. The methodology helps improve accuracy and reliability of property modeling and provides a rigorous model quality check and assurance. This is expected to further...

  20. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  1. Residual stresses in coating-based systems, part Ⅱ: Optimal designing methodologies

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiancheng; WU Yixiong; XU Binshi; WANG Haidou

    2007-01-01

In this part of the work, different cases are studied to illustrate the implementation of the analytical models that have been developed in Part Ⅰ [Front. Mech. Eng. China, 2007, 2(1): 1-12]. Different topics are involved in the optimal design of coating-based systems. Some essential relations among material properties and dimensions of the coating and substrate and the residual stress variations are reflected. For multilayered coating-based systems, some optimal design methodologies are proposed, such as the decrease in stress discontinuity at the interface between adjacent layers, the curvature as a function of the coating thickness, the effect of the interlayer on the residual stress redistribution, and so on. For single-layered coating-based systems, some typical approximations that are often used to predict the residual stresses in the coating-based system or bilayer structure are corrected. A simplified model for predicting the quenching stress in a coating is also developed.

  2. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    Science.gov (United States)

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probe the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite the favorable contributions, high attrition rates in drug development that mainly resulted from lack of efficacy and side effects make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rate and improve efficiency at the beginning of the drug discovery and drug development pipeline. This paper presented a valid methodology for fast target-focused combinatorial library design in both reaction-based and production-based ways, at library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other targets for the identification of good chemical starting points in combination with either structure-based or ligand-based virtual screening.
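
    The method itself is Python-based; as a rough illustration of the reaction-based enumeration idea (not the authors' actual code), the sketch below uses RDKit to apply a hypothetical amide-coupling reaction SMARTS to two tiny building-block lists and collects the unique products.

        from itertools import product
        from rdkit import Chem
        from rdkit.Chem import AllChem

        # Hypothetical amide-coupling reaction and tiny building-block lists.
        rxn = AllChem.ReactionFromSmarts("[C:1](=O)[OH].[N:2]>>[C:1](=O)[N:2]")
        acids  = [Chem.MolFromSmiles(s) for s in ("CC(=O)O", "OC(=O)c1ccccc1")]
        amines = [Chem.MolFromSmiles(s) for s in ("NCC", "Nc1ccncc1")]

        library = set()
        for acid, amine in product(acids, amines):        # combinatorial enumeration
            for products in rxn.RunReactants((acid, amine)):
                mol = products[0]
                Chem.SanitizeMol(mol)
                library.add(Chem.MolToSmiles(mol))        # canonical SMILES de-duplication

        print(f"{len(library)} products:", sorted(library))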

  3. Development and Evaluation of an Improved Methodology for Assessing Adherence to Evidence-Based Drug Therapy Guidelines Using Claims Data

    Science.gov (United States)

    Kawamoto, Kensaku; Allen LaPointe, Nancy M.; Silvey, Garry M.; Anstrom, Kevin J.; Eisenstein, Eric L.; Lobach, David F.

    2007-01-01

    Non-adherence to evidence-based pharmacotherapy is associated with increased morbidity and mortality. Claims data can be used to detect and intervene on such non-adherence, but existing claims-based approaches for measuring adherence to pharmacotherapy guidelines have significant limitations. In this manuscript, we describe a methodology for assessing adherence to pharmacotherapy guidelines that overcomes many of these limitations. To develop this methodology, we first reviewed the literature to identify prior work on potential strategies for overcoming these limitations. We then assembled a team of relevant domain experts to iteratively develop an improved methodology. This development process was informed by the use of the proposed methodology to assess adherence levels for 14 pharmacotherapy guidelines related to seven common diseases among approximately 36,000 Medicaid beneficiaries. Finally, we evaluated the ability of the methodology to overcome the targeted limitations. Based on this evaluation, we conclude that the proposed methodology overcomes many of the limitations associated with existing approaches. PMID:18693865

  4. Ozone-depleting Substances (ODS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This site includes all of the ozone-depleting substances (ODS) recognized by the Montreal Protocol. The data include ozone depletion potentials (ODP), global warming...

  5. High-resolution three-dimensional imaging of a depleted CMOS sensor using an edge Transient Current Technique based on the Two Photon Absorption process (TPA-eTCT)

    CERN Document Server

    García, Marcos Fernández; Echeverría, Richard Jaramillo; Moll, Michael; Santos, Raúl Montero; Moya, David; Pinto, Rogelio Palomo; Vila, Iván

    2016-01-01

For the first time, the deep n-well (DNW) depletion space of a High Voltage CMOS sensor has been characterized using a Transient Current Technique based on the simultaneous absorption of two photons. This novel approach has made it possible to resolve the DNW implant boundaries and therefore to accurately determine the real depleted volume and the effective doping concentration of the substrate. The unprecedented spatial resolution of this new method comes from the fact that measurable free carrier generation in two photon mode only occurs in a micrometric-scale voxel around the focus of the beam. Real three-dimensional spatial resolution is achieved by scanning the beam focus within the sample.

  6. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    Science.gov (United States)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change in the way they view the software design process from a view toward the solution of a problem to one of the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational- information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.

  7. Defatted flaxseed meal incorporated corn-rice flour blend based extruded product by response surface methodology.

    Science.gov (United States)

    Ganorkar, Pravin M; Patel, Jhanvi M; Shah, Vrushti; Rangrej, Vihang V

    2016-04-01

Considering the evidence for the human health benefits of flaxseed and its defatted flaxseed meal (DFM), response surface methodology (RSM) based on a three-level, four-factor central composite rotatable design (CCRD) was employed for the development of a DFM-incorporated corn-rice flour blend based extruded snack. The effects of DFM fortification (7.5-20 %), moisture content of feed (14-20 %, wb), extruder barrel temperature (115-135 °C) and screw speed (300-330 RPM) on expansion ratio (ER), breaking strength (BS), overall acceptability (OAA) score and water solubility index (WSI) of extrudates were investigated. Significant regression models explained the effect of the considered variables on all responses. DFM incorporation level was found to be the most significant independent variable affecting extrudate characteristics, followed by extruder barrel temperature and then screw rpm. Feed moisture content did not affect extrudate characteristics. As the DFM level increased (7.5 % to 20 %), ER and OAA values decreased. However, BS and WSI values were found to increase with increasing DFM level. Based on the defined criteria for numerical optimization, the combination for the production of a DFM-incorporated extruded snack with desired sensory attributes was achieved by incorporating 10 % DFM (replacing rice flour in the flour blend) and by keeping 20 % moisture content, 312 screw rpm and 125 °C barrel temperature.
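
    The model-fitting step behind such an RSM optimization can be sketched as an ordinary least-squares fit of a full second-order polynomial to the design runs. The handful of runs and response values below are invented (a real four-factor CCRD would have around 30 runs); the snippet only shows the mechanics of fitting the surface and evaluating it at a candidate optimum.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        # Invented runs: [DFM %, feed moisture %, barrel temp degC, screw rpm] -> expansion ratio.
        X = np.array([[7.5, 14, 115, 300], [20, 14, 135, 300], [7.5, 20, 135, 330],
                      [20, 20, 115, 330], [13.75, 17, 125, 315], [13.75, 17, 125, 315]])
        y = np.array([3.1, 2.2, 2.9, 2.0, 2.6, 2.7])

        # Full second-order (quadratic plus interaction) response surface, as in RSM.
        model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                              LinearRegression())
        model.fit(X, y)

        # Evaluate the fitted surface at a candidate optimum from numerical optimization.
        candidate = np.array([[10.0, 20.0, 125.0, 312.0]])
        print("predicted expansion ratio:", round(float(model.predict(candidate)[0]), 2))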

  8. The use of depleted uranium ammunition under contemporary international law: is there a need for a treaty-based ban on DU weapons?

    Science.gov (United States)

    Borrmann, Robin

    2010-01-01

    This article examines whether the use of Depleted Uranium (DU) munitions can be considered illegal under current public international law. The analysis covers the law of arms control and focuses in particular on international humanitarian law. The article argues that DU ammunition cannot be addressed adequately under existing treaty based weapon bans, such as the Chemical Weapons Convention, due to the fact that DU does not meet the criteria required to trigger the applicability of those treaties. Furthermore, it is argued that continuing uncertainties regarding the effects of DU munitions impedes a reliable review of the legality of their use under various principles of international law, including the prohibition on employing indiscriminate weapons; the prohibition on weapons that are intended, or may be expected, to cause widespread, long-term and severe damage to the natural environment; and the prohibition on causing unnecessary suffering or superfluous injury. All of these principles require complete knowledge of the effects of the weapon in question. Nevertheless, the author argues that the same uncertainty places restrictions on the use of DU under the precautionary principle. The paper concludes with an examination of whether or not there is a need for--and if so whether there is a possibility of achieving--a Convention that comprehensively outlaws the use, transfer and stockpiling of DU weapons, as proposed by some non-governmental organisations (NGOs).

  9. Artificial Neural Network and Response Surface Methodology Modeling in Ionic Conductivity Predictions of Phthaloylchitosan-Based Gel Polymer Electrolyte

    Directory of Open Access Journals (Sweden)

    Ahmad Danial Azzahari

    2016-01-01

Full Text Available A gel polymer electrolyte system based on phthaloylchitosan was prepared. The effects of process variables, such as lithium iodide, caesium iodide, and 1-butyl-3-methylimidazolium iodide, were investigated using a distance-based ternary mixture experimental design. A comparison was made between response surface methodology (RSM) and artificial neural network (ANN) approaches to predict the ionic conductivity. The predictive capabilities of the two methodologies were compared in terms of the coefficient of determination R2 based on the validation data set. It was shown that the developed ANN model had a better predictive outcome than the RSM model.
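
    A minimal sketch of that comparison criterion, with synthetic mixture data standing in for the real conductivity measurements, looks like this: fit an RSM-style quadratic model and a small neural network on the same training split and compare their R2 on the held-out validation set.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(1)
        # Synthetic ternary mixture fractions (LiI, CsI, BmimI) and a fake log conductivity.
        X = rng.dirichlet([2, 2, 2], size=80)
        y = -4 + 1.5 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 0.05, 80)

        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

        rsm = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X_tr, y_tr)
        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                           random_state=0).fit(X_tr, y_tr)

        for name, m in (("RSM", rsm), ("ANN", ann)):
            print(name, "validation R2:", round(r2_score(y_val, m.predict(X_val)), 3))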

  10. A methodology for capability-based technology evaluation for systems-of-systems

    Science.gov (United States)

    Biltgen, Patrick Thomas

    2007-12-01

    Post-Cold War military conflicts have highlighted the need for a flexible, agile joint force responsive to emerging crises around the globe. The 2005 Joint Capabilities Integration and Development System (JCIDS) acquisition policy document mandates a shift away from stove-piped threat-based acquisition to a capability-based model focused on the multiple ways and means of achieving an effect. This shift requires a greater emphasis on scenarios, tactics, and operational concepts during the conceptual phase of design and structured processes for technology evaluation to support this transition are lacking. In this work, a methodology for quantitative technology evaluation for systems-of-systems is defined. Physics-based models of an aircraft system are exercised within a hierarchical, object-oriented constructive simulation to quantify technology potential in the context of a relevant scenario. A major technical challenge to this approach is the lack of resources to support real-time human-in-the-loop tactical decision making and technology analysis. An approach that uses intelligent agents to create a "Meta-General" capable of forecasting strategic and tactical decisions based on technology inputs is used. To demonstrate the synergy between new technologies and tactics, surrogate models are utilized to provide intelligence to individual agents within the framework and develop a set of tactics that appropriately exploit new technologies. To address the long run-times associated with constructive military simulations, neural network surrogate models are implemented around the forecasting environment to enable rapid trade studies. Probabilistic techniques are used to quantify uncertainty and richly populate the design space with technology-infused alternatives. Since a large amount of data is produced in the analysis of systems-of-systems, dynamic, interactive visualization techniques are used to enable "what-if" games on assumptions, systems, technologies, tactics, and

  11. Four Sustainability Paradigms for Environmental Management: A Methodological Analysis and an Empirical Study Based on 30 Italian Industries

    Directory of Open Access Journals (Sweden)

    Fabio Zagonari

    2016-05-01

Full Text Available This paper develops an empirical methodology to consistently compare alternative sustainability paradigms (weak sustainability (WS), strong sustainability (SS), a-growth (AG), and de-growth (DG)) and different assessment approaches (LCA, CBA, and MCA) within alternative relationship frameworks (economic general equilibrium (EGE) and ecosystem services (ESS)). The goal is to suggest different environmental interventions (e.g., projects vs. policies) for environmental management at national, regional, or local levels. The top-down methodology is then applied to 30 interdependent industries in Italy for three pollutants and four resources during two periods. The industries were prioritized in terms of interventions to be taken to diminish pollution damage and resource depletion, whereas sustainability paradigms were compared in terms of their likelihood (i.e., WS > AG = DG > SS), robustness (i.e., AG > SS > DG > WS), effectiveness (i.e., SS > AG > DG > WS), and feasibility (i.e., SS > DG > WS > AG). Proper assessment approaches for projects are finally identified for situations when policies are infeasible (e.g., LCA in WS and SS, MCA in DG and SS within ESS, CBA in WS and AG within EGE), by suggesting MCA in WS within ESS once ecological services are linked to sustainability criteria.

  12. Research on Part Experssion Methodology of Part Library Based on ISO13584

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

ISO13584 (PLIB) is an international standard established to realize computer identification of data expression and data exchange for part libraries. In this international standard, the part expression methodology of the part library is an important characteristic which distinguishes it from STEP. So, the methodology is a focus in the research of part libraries. This article describes the principles of part information expression of a part library based on ISO13584, and the research results of the methodology in detail.

  13. Construction Material-Based Methodology for Military Contingency Base Construction: Case Study of Maiduguri, Nigeria

    Science.gov (United States)

    2016-09-01

Nigerians believe that the initial feasibility work was improperly conducted. One of the main feasibility studies was conducted with the intent to import... (Report: Construction Material-Based Methodology for Military Contingency Base Construction: Case Study of Maiduguri, Nigeria; Patrick J. Guertin, George W. Calfas, Michael K. Valentino, and Ghassan K. Al-Chaar; Construction Engineering Research Laboratory)

  14. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

In previous works a methodology was defined, based on the design of a genetic algorithm GAP and an incremental training technique adapted to the learning of series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules whose returns in the testing period are far superior to those obtained by applying the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
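
    The return-versus-risk objective that the GAP rules are trained against can be illustrated with a toy fitness function. The moving-average crossover rule, the synthetic price path and the risk penalty below are all stand-ins (the real method evolves crisp rules with a GAP algorithm); the point is only how a candidate rule is scored by return adjusted for risk.

        import numpy as np

        def rule_fitness(prices, fast=10, slow=30, risk_aversion=0.5):
            """Score a crisp moving-average crossover rule by risk-adjusted return
            (an illustrative stand-in for a GAP-evolved trading rule)."""
            fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")
            slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")
            n = min(len(fast_ma), len(slow_ma))
            signal = (fast_ma[-n:] > slow_ma[-n:]).astype(float)    # 1 = long, 0 = out

            strat_returns = np.diff(np.log(prices))[-n + 1:] * signal[:-1]
            total_return = strat_returns.sum()
            risk = strat_returns.std() * np.sqrt(252)               # annualized volatility
            return total_return - risk_aversion * risk              # return minus risk penalty

        rng = np.random.default_rng(7)
        prices = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 2000)))   # synthetic path
        print("fitness:", round(rule_fitness(prices), 4))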

  15. A risk-based methodology for ranking environmental chemical stressors at the regional scale.

    Science.gov (United States)

    Giubilato, Elisa; Zabeo, Alex; Critto, Andrea; Giove, Silvio; Bierkens, Johan; Den Hond, Elly; Marcomini, Antonio

    2014-04-01

    A "Risk-based Tool for the Regional Ranking of Environmental Chemical Stressors" has been developed, aimed at supporting decision-makers in the identification of priority environmental contaminants, as well as priority areas, to be further assessed. The tool implements a methodology based on a quantitative Weight-of-Evidence approach, integrating three types of information, identified as "Lines-of-Evidence" (LoE), namely: LoE "Environmental Contamination" (including data on chemical contamination in environmental matrices in the region, thus providing information on potential population exposure), LoE "Intake" (including results from human biomonitoring studies, i.e. concentration of chemicals in human biological matrices, thus providing an integrated estimation of exposure) and LoE "Observed Effects" (including information on the incidence of adverse health outcomes associated with environmental exposure to chemicals). A Multi-Criteria Decision Analysis (MCDA) methodology based on fuzzy logic has been developed to support the integration of information related to these three LoEs for each chemical stressor. The tool allows one to rank chemical stressors at different spatial scales, such as at the regional level as well as within each sub-area (e.g., counties). Moreover, it supports the identification of priority sub-areas within the region, where environmental and health data suggest possible adverse health effects and thus more investigation efforts are needed. To evaluate the performance of this newly developed tool, a case-study in the Flemish region (north of Belgium) has been selected. In the case-study, data on soil contamination by metals and organic contaminants were integrated with data on exposure and effect biomarkers measured in adolescents within the framework of the human biomonitoring study performed by the Flemish Centre of Expertise on Environment and Health in the period 2002-2006. The case-study demonstrated the performance of the tool in

  16. A study of polar ozone depletion based on sequential assimilation of satellite data from the ENVISAT/MIPAS and Odin/SMR instruments

    Directory of Open Access Journals (Sweden)

    J. D. Rösevall

    2007-01-01

Full Text Available The objective of this study is to demonstrate how polar ozone depletion can be mapped and quantified by assimilating ozone data from satellites into the wind driven transport model DIAMOND (Dynamical Isentropic Assimilation Model for OdiN Data). By assimilating a large set of satellite data into a transport model, ozone fields can be built up that are less noisy than the individual satellite ozone profiles. The transported fields can subsequently be compared to later sets of incoming satellite data so that the rates and geographical distribution of ozone depletion can be determined. By tracing the amounts of solar irradiation received by different air parcels in a transport model it is furthermore possible to study the photolytic reactions that destroy ozone. In this study, destruction of ozone that took place in the Antarctic winter of 2003 and in the Arctic winter of 2002/2003 has been examined by assimilating ozone data from the ENVISAT/MIPAS and Odin/SMR satellite-instruments. Large scale depletion of ozone was observed in the Antarctic polar vortex of 2003 when sunlight returned after the polar night. By mid October ENVISAT/MIPAS data indicate vortex ozone depletion in the ranges 80–100% and 70–90% on the 425 and 475 K potential temperature levels respectively while the Odin/SMR data indicates depletion in the ranges 70–90% and 50–70%. The discrepancy between the two instruments has been attributed to systematic errors in the Odin/SMR data. Assimilated fields of ENVISAT/MIPAS data indicate ozone depletion in the range 10–20% on the 475 K potential temperature level (~19 km altitude) in the central regions of the 2002/2003 Arctic polar vortex. Assimilated fields of Odin/SMR data on the other hand indicate ozone depletion in the range 20–30%.

  17. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  18. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology adopts a Kalman filter approach in conjunction with an...

  19. Three-dimensional design methodologies for tree-based FPGA architecture

    CERN Document Server

    Pangracious, Vinod; Mehrez, Habib

    2015-01-01

This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support design tasks such as design for 3D, in order to manufacture high performance 3D integrated circuits and reconfigurable FPGA-based systems. This book was written as a text that covers the foundations of 3D integrated system design and FPGA architecture design. It was written for use in an elective or core course at the graduate level in the fields of Electrical Engineering, Computer Engineering and Doctoral Research programs. No previous background on 3D integration is required; nevertheless, a fundamental understanding of 2D CMOS VLSI design is required. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI design, Digital System Design and Microelectronics Circuits being the most important. It is accessible for self-study by both senior students and profe...

  20. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul S [Los Alamos National Laboratory; Morgan, Keith S [Los Alamos National Laboratory; Caffrey, Michael P [Los Alamos National Laboratory

    2008-01-01

Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-tolerant memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
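
    The fault-injection tier can be pictured with a toy software model: triplicate a small function, flip one output bit in a single copy to emulate a single-event upset, and check that a bitwise majority voter masks it. This is only a conceptual sketch, not the LANL injection tooling or an actual FPGA netlist.

        import random

        def module(x, corrupt_bit=None):
            """Toy 'user circuit': returns x + 1, optionally with one output bit
            flipped to emulate a single-event upset injected into this copy."""
            out = (x + 1) & 0xFF
            if corrupt_bit is not None:
                out ^= 1 << corrupt_bit
            return out

        def tmr_vote(a, b, c):
            # Bitwise majority voter: each bit takes the value held by at least 2 copies.
            return (a & b) | (a & c) | (b & c)

        random.seed(0)
        trials, masked = 10_000, 0
        for _ in range(trials):
            x = random.randrange(256)
            upset_copy = random.randrange(3)       # which redundant copy is hit
            bit = random.randrange(8)              # which output bit the upset flips
            outs = [module(x, bit if i == upset_copy else None) for i in range(3)]
            if tmr_vote(*outs) == module(x):       # compare against the fault-free output
                masked += 1

        print(f"single upsets masked by TMR: {masked}/{trials}")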

  1. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

Full Text Available The complexity of today's embedded electronic systems as well as their demanding performance and reliability requirements are such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  2. Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.

    Science.gov (United States)

    Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J

    2006-08-01

    Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.

  3. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Directory of Open Access Journals (Sweden)

    Raquel Acero

    2016-11-01

Full Text Available This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  4. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  5. Affinity-based methodologies and ligands for antibody purification: advances and perspectives.

    Science.gov (United States)

    Roque, Ana C A; Silva, Cláudia S O; Taipa, M Angela

    2007-08-10

Many successful, recent therapies for life-threatening diseases such as cancer and rheumatoid arthritis are based on the recognition between native or genetically engineered antibodies and cell-surface receptors. Although antibodies are naturally produced by the immune system, the need for antibodies with unique specificities, designed for a single application, has encouraged the search for novel antibody purification strategies. The availability of these products to the end-consumer is strictly related to manufacturing costs, particularly those attributed to downstream processing. Over the last decades, academia and industry have developed different types of interactions and separation techniques for antibody purification, affinity-based strategies being the most common and efficient methodologies. The affinity ligands utilized range from biological to synthetically designed molecules with enhanced resistance and stability. Despite the successes achieved, the purification "paradigm" still drives interest and effort in the continuous demand for improved separation performance. This review will focus on recent advances and perspectives in antibody purification by affinity interactions using different techniques, with particular emphasis on affinity chromatography.

  6. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the highest accuracies are achieved by models (5) using Tmax - Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
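
    A minimal scikit-learn sketch of an SVR-rbf model of this type, using Tmax - Tmin and Tmax as inputs, is shown below. The daily data are synthetic and the hyperparameters arbitrary; the snippet only illustrates the fit-and-score workflow (MABE, RMSE, correlation coefficient), not the study's actual results.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error, mean_squared_error

        rng = np.random.default_rng(3)
        n = 365
        t_max = rng.uniform(10, 42, n)
        t_min = t_max - rng.uniform(5, 18, n)
        # Synthetic "measured" daily radiation, loosely tied to the diurnal temperature range.
        dhgsr = 8 + 0.9 * (t_max - t_min) + 0.15 * t_max + rng.normal(0, 1.5, n)

        X = np.column_stack([t_max - t_min, t_max])       # inputs of model (5)
        X_tr, X_te, y_tr, y_te = train_test_split(X, dhgsr, test_size=0.3, random_state=0)

        svr_rbf = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10, epsilon=0.5))
        svr_rbf.fit(X_tr, y_tr)
        pred = svr_rbf.predict(X_te)

        rmse = mean_squared_error(y_te, pred) ** 0.5
        print(f"MABE={mean_absolute_error(y_te, pred):.2f}  RMSE={rmse:.2f} MJ/m2")
        print(f"R={np.corrcoef(y_te, pred)[0, 1]:.3f}")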

  7. Environmental external effects from wind power based on the EU ExternE methodology

    DEFF Research Database (Denmark)

    Ibsen, Liselotte Schleisner; Nielsen, Per Sieverts

    1998-01-01

The European Commission has launched a major study project, ExternE, to develop a methodology to quantify externalities. A “National Implementation Phase” was started under the Joule II programme with the purpose of implementing the ExternE methodology in all member states. The main objective...

  8. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    Science.gov (United States)

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
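
    The PERT/CPM core that such a design methodology leans on is a forward and backward pass over the activity network. The tiny activity set below is hypothetical; the sketch computes earliest and latest times, the project length and the zero-slack (critical) activities.

        # Forward/backward pass over a small activity-on-node network (hypothetical tasks).
        durations = {"A": 3, "B": 5, "C": 2, "D": 4, "E": 1}
        preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}
        order = ["A", "B", "C", "D", "E"]        # already in topological order

        es, ef = {}, {}                          # earliest start / finish
        for t in order:
            es[t] = max((ef[p] for p in preds[t]), default=0)
            ef[t] = es[t] + durations[t]
        project_length = max(ef.values())

        succs = {t: [s for s in order if t in preds[s]] for t in order}
        lf, ls = {}, {}                          # latest finish / start
        for t in reversed(order):
            lf[t] = min((ls[s] for s in succs[t]), default=project_length)
            ls[t] = lf[t] - durations[t]

        critical_path = [t for t in order if es[t] == ls[t]]   # zero-slack activities
        print("project length:", project_length, "critical path:", critical_path)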

  9. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are p...

  10. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    Science.gov (United States)

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  11. Progressive design methodology for complex engineering systems based on multiobjective genetic algorithms and linguistic decision making

    NARCIS (Netherlands)

    Kumar, P.; Bauer, P.

    2008-01-01

    This work focuses on a design methodology that aids in design and development of complex engineering systems. This design methodology consists of simulation, optimization and decision making. Within this work a framework is presented in which modelling, multi-objective optimization and multi criteri

  12. Physics of Fully Depleted CCDs

    CERN Document Server

    Holland, S E; Kolbe, W F; Lee, J S

    2014-01-01

    In this work we present simple, physics-based models for two effects that have been noted in the fully depleted CCDs that are presently used in the Dark Energy Survey Camera. The first effect is the observation that the point-spread function increases slightly with the signal level. This is explained by considering the effect on charge-carrier diffusion due to the reduction in the magnitude of the channel potential as collected signal charge acts to partially neutralize the fixed charge in the depleted channel. The resulting reduced voltage drop across the carrier drift region decreases the vertical electric field and increases the carrier transit time. The second effect is the observation of low-level, concentric ring patterns seen in uniformly illuminated images. This effect is shown to be most likely due to lateral deflection of charge during the transit of the photogenerated carriers to the potential wells as a result of lateral electric fields. The lateral fields are a result of space charge in the fully...
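
    The first effect can be made concrete with a rough back-of-the-envelope estimate (all values below are illustrative, not taken from the paper): a lower effective voltage drop across the depleted substrate lengthens the carrier transit time, and the lateral diffusion spread grows as sqrt(2*D*t), so the point-spread function widens slightly as collected charge reduces the channel potential.

        import numpy as np

        k_T_over_q = 0.0259     # thermal voltage, V (room temperature, illustrative)
        mu = 450e-4             # carrier mobility, m^2/(V s) (illustrative silicon value)
        thickness = 250e-6      # fully depleted thickness, m (thick CCD, illustrative)

        def psf_sigma(voltage_drop):
            """Lateral diffusion spread for a uniform drift field (crude approximation)."""
            e_field = voltage_drop / thickness
            transit_time = thickness / (mu * e_field)
            diffusivity = mu * k_T_over_q                  # Einstein relation
            return np.sqrt(2.0 * diffusivity * transit_time)

        # Collected charge partially neutralizes the channel, e.g. reducing the drop by ~5 V.
        for v in (100.0, 95.0):
            print(f"V = {v:5.1f} V  ->  sigma = {psf_sigma(v) * 1e6:.2f} um")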

  13. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
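
    The game-theoretic ingredient can be pictured with a toy normal-form game: two competing firms each pick a technology portfolio, and pure-strategy Nash equilibria are found by checking best responses. The payoff matrices below are invented for illustration and are unrelated to the dissertation's actual engine study.

        import numpy as np
        from itertools import product

        # Invented expected payoffs for two competing firms, each choosing one of three
        # technology portfolios. payoff_a[i, j] is firm A's payoff when A plays i and
        # B plays j; payoff_b likewise for firm B.
        payoff_a = np.array([[4, 2, 1],
                             [3, 3, 2],
                             [5, 1, 0]])
        payoff_b = np.array([[3, 4, 2],
                             [2, 3, 4],
                             [1, 2, 5]])

        equilibria = []
        for i, j in product(range(3), range(3)):
            a_best = payoff_a[i, j] >= payoff_a[:, j].max()   # A cannot gain by deviating
            b_best = payoff_b[i, j] >= payoff_b[i, :].max()   # B cannot gain by deviating
            if a_best and b_best:
                equilibria.append((i, j))

        print("pure-strategy Nash equilibria (A portfolio, B portfolio):", equilibria)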

  14. DESIGN METHODOLOGY OF NETWORKED SOFTWARE EVOLUTION GROWTH BASED ON SOFTWARE PATTERNS

    Institute of Scientific and Technical Information of China (English)

    Keqing HE; Rong PENG; Jing LIU; Fei HE; Peng LIANG; Bing LI

    2006-01-01

Recently, some new characteristics of complex networks have attracted the attention of scientists in different fields and led to many emerging research directions. So far, most of the research work has been limited to the discovery of complex network characteristics by structure analysis in large-scale software systems. This paper presents the theoretical basis, design method, algorithms and experiment results of the research. It firstly emphasizes the significance of a design method of evolution growth for the network topology of Object Oriented (OO) software systems, and argues that the selection and modulation of network models with various topology characteristics will have a non-negligible effect on the process of design and implementation of OO software systems. Then we analyze the similar discipline of "negation of negation and compromise" between the evolution of network models with different topology characteristics and the development of software modelling methods. According to the analysis of the growth features of software patterns, we propose an object-oriented software network evolution growth method and its algorithms in succession. In addition, we also propose the parameter systems for OO software system metrics based on complex network theory. Based on these parameter systems, it can analyze the features of various nodes, links and local-world, modulate the network topology and guide the software metrics. All these can be helpful for the detailed design, implementation and performance analysis. Finally, we focus on the application of the evolution algorithms and demonstrate it by a case study. Comparing the results from our early experiments with methodologies in empirical software engineering, we believe that the proposed software engineering design method is a computational software engineering approach based on complex network theory. We argue that this method should be greatly beneficial for the design, implementation, modulation and metrics of
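
    The structure-analysis side of such work is commonly done with a graph library. As a small illustration (hypothetical classes, not taken from the paper), the snippet below builds a class-dependency digraph with networkx and reports two of the complex-network style parameters one might track: in-degree (reuse) and clustering.

        import networkx as nx

        # Hypothetical class-dependency graph of a small OO system: an edge A -> B means
        # "class A depends on (uses or inherits from) class B".
        deps = [("Order", "Customer"), ("Order", "Product"), ("Invoice", "Order"),
                ("Invoice", "Customer"), ("Cart", "Product"), ("Cart", "Customer"),
                ("Report", "Invoice"), ("Report", "Order")]
        g = nx.DiGraph(deps)

        in_deg = dict(g.in_degree())             # how widely each class is reused
        print("most depended-upon classes:",
              sorted(in_deg, key=in_deg.get, reverse=True)[:3])
        print("average clustering (undirected view):",
              round(nx.average_clustering(g.to_undirected()), 3))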

  15. Looking for phase-space structures in star-forming regions: an MST-based methodology

    Science.gov (United States)

    Alfaro, Emilio J.; González, Marta

    2016-03-01

We present a method for analysing the phase space of star-forming regions. In particular we are searching for clumpy structures in the 3D sub-space formed by two position coordinates and radial velocity. The aim of the method is the detection of kinematically segregated radial velocity groups, that is, radial velocity intervals whose associated stars are spatially concentrated. To this end we define a kinematic segregation index, $\tilde{\Lambda}(RV)$, based on the Minimum Spanning Tree graph algorithm, which is estimated for a set of radial velocity intervals in the region. When $\tilde{\Lambda}(RV)$ is significantly greater than 1 we consider that this bin represents a grouping in the phase space. We split a star-forming region into radial velocity bins and calculate the kinematic segregation index for each bin, and then we obtain the spectrum of kinematic groupings, which enables a quick visualization of the kinematic behaviour of the region under study. We carried out numerical models of different configurations in the sub-space of the phase space formed by the coordinates and the radial velocity, which various case studies illustrate. The analysis of the test cases demonstrates the potential of the new methodology for detecting different kinds of groupings in phase space.
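
    A sketch of a $\tilde{\Lambda}$-style index computed with SciPy is shown below: for one radial-velocity bin, compare the mean MST edge length of random samples of equal size with that of the stars in the bin, so that values well above 1 flag a spatially concentrated velocity group. The positions and velocities are synthetic, and the exact normalization used in the paper may differ.

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial.distance import pdist, squareform

        def mean_mst_edge(xy):
            """Mean edge length of the Euclidean minimum spanning tree of a point set."""
            return minimum_spanning_tree(squareform(pdist(xy))).data.mean()

        def segregation_index(xy_all, in_bin, n_random=200, rng=None):
            """Ratio of the mean MST edge of random equal-size samples to that of the
            stars in the radial-velocity bin; values well above 1 flag a spatially
            concentrated (kinematically segregated) velocity group."""
            rng = rng or np.random.default_rng()
            n = int(in_bin.sum())
            l_bin = mean_mst_edge(xy_all[in_bin])
            l_rand = np.mean([mean_mst_edge(xy_all[rng.choice(len(xy_all), n, replace=False)])
                              for _ in range(n_random)])
            return l_rand / l_bin

        # Synthetic region: a diffuse population plus a compact clump sharing one RV interval.
        rng = np.random.default_rng(5)
        xy = np.vstack([rng.uniform(0, 10, (200, 2)), rng.normal(5, 0.3, (30, 2))])
        rv = np.concatenate([rng.normal(0, 3, 200), rng.normal(8, 0.5, 30)])

        in_bin = (rv > 6) & (rv < 10)            # one radial-velocity bin
        print("kinematic segregation index:", round(segregation_index(xy, in_bin, rng=rng), 2))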

  16. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection and inconclusive or difficult-to-interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping current methods used for assessing the effectiveness of DGBL. Results showed that, currently, comparing results across studies and thus examining the effectiveness of DGBL on a more general level is problematic due to diverse and suboptimal study designs. Variety in study design relates to three issues, namely different activities implemented in the control groups, different measures for assessing the effectiveness of DGBL, and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements that are added to the game as part of the educational intervention (e.g., required reading, debriefing session), instructor influences, and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on the study design impedes replication of studies and thus falsification of study results.

  17. IIR filtering based adaptive active vibration control methodology with online secondary path modeling using PZT actuators

    Science.gov (United States)

    Boz, Utku; Basdogan, Ipek

    2015-12-01

Structural vibration is a major cause of noise problems, discomfort and mechanical failures in aerospace, automotive and marine systems, which are mainly composed of plate-like structures. In order to reduce structural vibrations in these structures, active vibration control (AVC) is an effective approach. Adaptive filtering methodologies are preferred in AVC due to their ability to adjust themselves to the varying dynamics of the structure during operation. The filtered-X LMS (FXLMS) algorithm is a simple adaptive filtering algorithm widely implemented in active control applications. Proper implementation of FXLMS requires the availability of a reference signal to mimic the disturbance and a model of the dynamics between the control actuator and the error sensor, namely the secondary path. However, the controller output can interfere with the reference signal, and the secondary path dynamics may change during operation. The interference problem can be resolved by using an infinite impulse response (IIR) filter, which feeds one or more previous control signals back to the controller output, and the changing secondary path dynamics can be updated using an online modeling technique. In this paper, an IIR-filtering-based filtered-U LMS (FULMS) controller is combined with an online secondary path modeling algorithm to suppress the vibrations of a plate-like structure. The results are validated through numerical and experimental studies, and show that the FULMS approach with online secondary path modeling has better vibration rejection capability and a higher convergence rate than its FXLMS counterpart.
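
    As a hedged illustration of the baseline algorithm named above (plain FXLMS with a fixed, perfectly modelled secondary path, rather than the paper's IIR-based FULMS with online secondary-path modelling), a minimal update loop might look like this; signal names and the step size are illustrative.

```python
import numpy as np

def fxlms(x, d, s, L=32, mu=1e-3):
    """Minimal filtered-X LMS sketch for active vibration/noise control.
    x: reference signal, d: disturbance at the error sensor,
    s: secondary-path impulse response (assumed perfectly modelled).
    Returns the residual error e(n) = d(n) + (s * y)(n)."""
    w = np.zeros(L)                 # adaptive FIR control filter
    x_buf = np.zeros(L)             # reference buffer for the controller
    xf_buf = np.zeros(L)            # filtered-reference buffer for the update
    y_buf = np.zeros(len(s))        # control-signal buffer for the secondary path
    xs_buf = np.zeros(len(s))       # reference buffer for the secondary-path model
    e = np.zeros(len(x))
    for n in range(len(x)):
        x_buf = np.r_[x[n], x_buf[:-1]]
        y = w @ x_buf                            # controller output
        y_buf = np.r_[y, y_buf[:-1]]
        e[n] = d[n] + s @ y_buf                  # error sensed after the secondary path
        xs_buf = np.r_[x[n], xs_buf[:-1]]
        xf_buf = np.r_[s @ xs_buf, xf_buf[:-1]]  # reference filtered by the path model
        w -= mu * e[n] * xf_buf                  # FXLMS weight update
    return e
```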

  18. Development of flaxseed fortified rice - corn flour blend based extruded product by response surface methodology.

    Science.gov (United States)

    Ganorkar, P M; Jain, R K

    2015-08-01

Flaxseed has shown evidence of health benefits in humans. Response surface methodology (RSM) was employed to develop a flaxseed-fortified rice-corn flour blend based extruded product using a twin screw extruder. The effects of roasted flaxseed flour (RFF) fortification (15-25 %), moisture content of feed (12-16 %, wb), extruder barrel temperature (120-140 °C) and screw speed (300-330 RPM) on expansion ratio (ER), breaking strength (BS), bulk density (BD) and overall acceptability (OAA) score of extrudates were investigated using a central composite rotatable design (CCRD). An increased RFF level significantly decreased the ER and OAA score while increasing the BS and BD of extrudates. Moisture content of the extruder feed was positively related to ER, whereas extruder barrel temperature was negatively related to ER and OAA. The optimized combination of RFF and rice flour, with 16 % moisture content (wb) of extruder feed, 120 °C extruder barrel temperature and 330 RPM screw speed, gave an optimized product of high desirability with corresponding responses of 3.08 ER, 0.53 kgf BS, 0.106 g.cm(-3) BD and 7.86 OAA.

  19. A Preisach-Based Nonequilibrium Methodology for Simulating Performance of Hysteretic Magnetic Refrigeration Cycles

    Science.gov (United States)

    Brown, Timothy D.; Bruno, Nickolaus M.; Chen, Jing-Han; Karaman, Ibrahim; Ross, Joseph H.; Shamberger, Patrick J.

    2015-09-01

    In giant magnetocaloric effect (GMCE) materials a large entropy change couples to a magnetostructural first-order phase transition, potentially providing a basis for magnetic refrigeration cycles. However, hysteresis loss greatly reduces the availability of refrigeration work in such cycles. Here, we present a methodology combining a Preisach model for rate-independent hysteresis with a thermodynamic analysis of nonequilibrium phase transformations which, for GMCE materials exhibiting hysteresis, allows an evaluation of refrigeration work and efficiency terms for an arbitrary cycle. Using simplified but physically meaningful descriptors for the magnetic and thermal properties of a Ni45Co5Mn36.6In13.4 at.% single-crystal alloy, we relate these work/efficiency terms to fundamental material properties, demonstrating the method's use as a materials design tool. Following a simple two-parameter model for the alloy's hysteresis properties, we compute and interpret the effect of each parameter on the cyclic refrigeration work and efficiency terms. We show that hysteresis loss is a critical concern in cycles based on GMCE systems, since the resultant lost work can reduce the refrigeration work to zero; however, we also find that the lost work may be mitigated by modifying other aspects of the transition, such as the width over which the one-way transformation occurs.
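
    As a hedged, minimal sketch of the rate-independent Preisach ingredient (not the authors' implementation, which couples the hysteresis model to a nonequilibrium thermodynamic analysis of the transformation), a discrete set of weighted relay hysterons can be swept through an input history as follows; thresholds and weights are illustrative and would have to be fitted to measured hysteresis loops.

```python
import numpy as np

def preisach_response(h_history, alphas, betas, weights):
    """Discrete Preisach sketch: relay hysterons with 'up' threshold alpha and
    'down' threshold beta (alpha >= beta) and weights mu. Returns the output
    (e.g., transformed phase fraction) for a given input history (e.g., field)."""
    state = -np.ones(len(alphas))                   # all hysterons start 'down'
    out = []
    for h in h_history:
        state = np.where(h >= alphas, 1.0, state)   # switch up above alpha
        state = np.where(h <= betas, -1.0, state)   # switch down below beta
        out.append(np.dot(weights, state))
    return np.array(out)

# usage sketch: a major loop followed by a partial reversal (minor loop)
# grid = np.linspace(-1.0, 1.0, 20)
# A, B = np.meshgrid(grid, grid)
# mask = A.ravel() >= B.ravel()                    # admissible half of the Preisach plane
# y = preisach_response(np.r_[np.linspace(-1, 1, 100), np.linspace(1, -0.2, 60)],
#                       A.ravel()[mask], B.ravel()[mask],
#                       np.ones(mask.sum()) / mask.sum())
```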

  20. U.S. Natural Gas Storage Risk-Based Ranking Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Folga, Steve [Argonne National Lab. (ANL), Argonne, IL (United States); Portante, Edgar [Argonne National Lab. (ANL), Argonne, IL (United States); Shamsuddin, Shabbir [Argonne National Lab. (ANL), Argonne, IL (United States); Tompkins, Angeli [Argonne National Lab. (ANL), Argonne, IL (United States); Talaber, Leah [Argonne National Lab. (ANL), Argonne, IL (United States); McLamore, Mike [Argonne National Lab. (ANL), Argonne, IL (United States); Kavicky, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Conzelmann, Guenter [Argonne National Lab. (ANL), Argonne, IL (United States); Levin, Todd [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-10-01

    This report summarizes the methodology and models developed to assess the risk to energy delivery from the potential loss of underground gas storage (UGS) facilities located within the United States. The U.S. has a total of 418 existing storage fields, of which 390 are currently active. The models estimate the impacts of a disruption of each of the active UGS facilities on their owners/operators, including (1) local distribution companies (LDCs), (2) directly connected transporting pipelines and thus on the customers in downstream States, and (3) third-party entities and thus on contracted customers expecting the gas shipment. Impacts are measured across all natural gas customer classes. For the electric sector, impacts are quantified in terms of natural gas-fired electric generation capacity potentially affected from the loss of a UGS facility. For the purpose of calculating the overall supply risk, the overall consequence of the disruption of an UGS facility across all customer classes is expressed in terms of the number of expected equivalent residential customer outages per year, which combines the unit business interruption cost per customer class and the estimated number of affected natural gas customers with estimated probabilities of UGS disruptions. All models and analyses are based on publicly available data. The report presents a set of findings and recommendations in terms of data, further analyses, regulatory requirements and standards, and needs to improve gas/electric industry coordination for electric reliability.

  1. METHODOLOGICAL BASES OF THE OPTIMIZATION OF ORGANIZATIONAL MANAGEMENT STRUCTURE AT IMPLEMENTING THE MAJOR CONSTRUCTION ENTERPRISE STRATEGY

    Directory of Open Access Journals (Sweden)

    Rodionova Svetlana Vladimirovna

    2015-09-01

Planning and implementation of innovations at the micro level of management and at higher levels is a process of implementing a portfolio of innovative projects. Project management is aimed at a goal; therefore, defining the mission and aims of implementation is of primary importance. These are part of the notion of the development strategy of an enterprise. Creating a strategy for big construction holding companies is complicated by the necessity to account for the different factors affecting each business block and subsidiary company. The authors specify an algorithm for the development and implementation of the activity strategy of a big construction enterprise. The special importance of the correspondence of the organizational management structure to the implemented strategy is shown, and the innovative character of organizational structure change is justified. The authors offer methods to optimize the organizational management structure based on a communication approach with the use of elements of graph theory. The offered methodological provisions are tested on the example of the Russian JSC “RZhDstroy”.

  2. Developing More Insights on Sustainable Consumption in China Based on Q Methodology

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-10-01

Being an important aspect of sustainable development, sustainable consumption has attracted great attention among Chinese politicians and academia, and Chinese governments have established policies that encourage sustainable consumption behaviors. However, unsustainable consumption behavior still remains predominant in China. This paper aims to classify consumers with similar traits, in terms of the characteristics of practicing sustainable consumption, into the same group, so that their traits can be clearly understood and governments can establish targeted policies for different groups of consumers. Q methodology, generally used to reveal the subjectivity of human beings involved in any situation, is applied in this paper to classify Chinese consumers based on Q sample design and data collection and analysis. Next, the traits of each group are analyzed in detail and comparison analyses are conducted to identify the common and differentiating factors among the three groups. The results show that Chinese consumers can be classified into three groups: sustainable (Group 1), potentially sustainable (Group 2) and unsustainable consumers (Group 3), according to their values and attitudes towards sustainable consumption. Group 1 cares for the environment and has strong environmental values; they understand sustainable consumption and its functions. Group 2 needs more enlightenment and external stimuli to motivate them to consume sustainably. Group 3 needs to be informed about and educated on sustainable consumption to enable them to change their consumption behavior from unsustainable to sustainable. Suggestions and implications for encouraging each group of consumers to engage in sustainable consumption are also provided.

  3. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  4. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    Science.gov (United States)

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a Study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment.

  5. Low-Cost Fault Tolerant Methodology for Real Time MPSoC Based Embedded System

    Directory of Open Access Journals (Sweden)

    Mohsin Amin

    2014-01-01

We propose a design methodology for a fault-tolerant homogeneous MPSoC with additional design objectives that include low hardware overhead and high performance. We have implemented three different FT methodologies on MPSoCs and compared them against the defined constraints. The comparison of these FT methodologies is carried out by modelling their architectures in VHDL-RTL on a Spartan 3 FPGA. The results obtained through simulations helped us identify the most relevant scheme in terms of the given design constraints.

  6. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as on the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine whether one model setup gives a better goodness of fit, conditional on the observations, than the other. Moreover, different methodological investigations of GLUE are conducted in order to test...

  7. Are QALYs based on time trade-off comparable?--A systematic review of TTO methodologies.

    Science.gov (United States)

    Arnesen, Trude; Trommald, Mari

    2005-01-01

A wide range of methods is used to elicit quality-of-life weights for different health states to generate Quality-Adjusted Life Years (QALYs). The comparability between different types of health outcomes at a numerical level is the main advantage of using a 'common currency for health' such as the QALY. It has been warned that results of different methods and perspectives should not be directly compared in QALY league tables. But do we know that QALYs are comparable if they are based on the same method and perspective? The Time Trade-Off (TTO) consists of a hypothetical trade-off between living shorter and living healthier. We performed a literature review of the TTO methodology used to elicit quality-of-life weights for one's own current health. Fifty-six journal articles, with quality-of-life weights assigned to 102 diagnostic groups, were included. We found extensive differences in how the TTO question was asked. The time frame varied from 1 month to 30 years, and was not reported for one-fourth of the weights. The samples in which the quality-of-life weights were elicited were generally small, with a median size of 53 respondents. Comprehensive inclusion criteria were given for half the diagnostic groups. Co-morbidity was described in less than one-tenth of the groups of respondents. For two-thirds of the quality-of-life weights, there was no discussion of the influence of other factors, such as age, sex, employment and children. The different methodological approaches did not influence the TTO weights in a predictable or clear pattern. Whether or not it is possible to standardise the TTO method and the sampling procedure, and whether or not the TTO will then give valid quality-of-life weights, remain open questions. This review of TTO weights elicited on respondents' own behalf shows that limiting cost-utility analysis to quality-of-life weights from one method and one perspective is not enough to ensure that QALYs are comparable.

  8. Methodology To Define Drought Management Scenarios Based On Accumulated Future Projections Of Risk

    Science.gov (United States)

    Haro-Monteagudo, David; Solera-Solera, Abel; Andreu-Álvarez, Joaquín

    2014-05-01

Drought is a serious threat to many water resources systems in the world, especially to those in which the equilibrium between resource availability and water use is very fragile, so that a deviation below normality compromises the capacity of the system to cope with all demands and environmental requirements. Since droughts are not isolated events but develop through time in what could be considered a creeping behavior, it is very difficult to determine when an episode starts and how long it will last. Because this is a major concern for water managers and society in general, scientific research has strived to develop indices for evaluating the risk of a drought event occurring. These indices are often based on previous and current state variables of the system which, combined, supply decision makers with an indication of the risk of being in a drought situation, normally through the definition of a drought scenario. While this way of proceeding has been found effective in many systems, there are cases in which indicator systems fail to define the appropriate ongoing drought scenario early enough to start measures that would minimize possible impacts. This is the case, for example, in systems with high seasonal precipitation variability. The use of risk assessment models to evaluate possible future states of the system becomes handy in cases like this one, although it is not limited to such systems. We present a method to refine the drought scenario definition within a water resources system. To implement this methodology, we use a risk assessment model generalized to water resources systems, based on the stochastic generation of multiple possible future streamflows and the simulation of the system in a Monte-Carlo approach. We carry out this assessment every month of the year up to the end of the hydrologic year, which normally corresponds with the end of the irrigation
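
    A minimal sketch of the risk-projection idea, under stated assumptions: a lognormal inflow generator and a single-reservoir mass balance stand in for the paper's stochastic streamflow generator and full system simulation model, and all names and numbers are illustrative.

```python
import numpy as np

def deficit_risk(storage0, demand, mu_log, sigma_log, months=12, n_runs=2000, seed=0):
    """Monte-Carlo sketch of a drought-risk assessment: generate many equally
    likely monthly inflow series, run a simple reservoir mass balance and
    return, for each month, the fraction of runs with an unmet demand."""
    rng = np.random.default_rng(seed)
    inflows = rng.lognormal(mu_log, sigma_log, size=(n_runs, months))
    storage = np.full(n_runs, storage0, dtype=float)
    failure = np.zeros((n_runs, months), dtype=bool)
    for m in range(months):
        storage += inflows[:, m]                       # monthly inflow
        supplied = np.minimum(storage, demand[m])      # supply limited by storage
        failure[:, m] = supplied < demand[m]           # deficit in this run/month
        storage -= supplied
    return failure.mean(axis=0)                        # monthly probability of deficit

# usage: risk = deficit_risk(storage0=50.0, demand=np.full(12, 10.0),
#                            mu_log=2.0, sigma_log=0.6)
# a drought scenario could then be declared when the risk exceeds agreed thresholds
```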

  9. Integrated methodological frameworks for modelling agent-based advanced supply chain planning systems: A systematic literature review

    Directory of Open Access Journals (Sweden)

    Luis Antonio Santa-Eulalia

    2011-12-01

Purpose: The objective of this paper is to provide a systematic literature review of recent developments in methodological frameworks for the modelling and simulation of agent-based advanced supply chain planning systems. Design/methodology/approach: A systematic literature review is provided to identify, select and make an analysis and a critical summary of all suitable studies in the area. It is organized into two blocks: the first one covers agent-based supply chain planning systems in general terms, while the second one narrows the previous search to identify those works explicitly containing methodological aspects. Findings: Among sixty suitable manuscripts identified in the primary literature search, only seven explicitly considered methodological aspects. In addition, we noted that, in general, the notion of advanced supply chain planning is not treated unambiguously, that the social and individual aspects of the agent society are not taken into account in a clear manner in several studies, and that a significant part of the works are of a theoretical nature, with few real-scale industrial applications. An integrated framework covering all phases of the modelling and simulation process is still lacking in the literature visited. Research limitations/implications: The main research limitations are related to the period covered (last four years), the selected scientific databases, the selected language (i.e. English) and the use of only one assessment framework for the descriptive evaluation part. Practical implications: The identification of recent works in the domain and discussion concerning their limitations can help pave the way for new and innovative research towards a complete methodological framework for agent-based advanced supply chain planning systems. Originality/value: As there are no recent state-of-the-art reviews in the domain of methodological frameworks for agent-based supply chain planning, this paper contributes to

  10. The New MCNP6 Depletion Capability

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael Lorne [Los Alamos National Laboratory; James, Michael R. [Los Alamos National Laboratory; Hendricks, John S. [Los Alamos National Laboratory; Goorley, John T. [Los Alamos National Laboratory

    2012-06-19

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.
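
    For orientation only (this is not the algorithm used inside MCNP6/CINDER), the core of any depletion step is a Bateman-type solve of dn/dt = A n over a burn step, where A collects decay constants and flux-weighted transmutation rates; the sketch below uses a plain matrix exponential and a toy two-nuclide decay chain.

```python
import numpy as np
from scipy.linalg import expm

def deplete_step(n0, burnup_matrix, dt):
    """Advance nuclide number densities n over a time step dt by solving
    dn/dt = A n with a matrix exponential. Illustrative Bateman-type solve,
    not the production depletion solver of a transport code."""
    return expm(burnup_matrix * dt) @ n0

# toy example: parent -> daughter decay chain (decay constant 1e-6 1/s)
# A = np.array([[-1e-6, 0.0],
#               [ 1e-6, 0.0]])
# n1 = deplete_step(np.array([1.0e20, 0.0]), A, dt=86400.0)
```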

  11. An AFM-based methodology for measuring axial and radial error motions of spindles

    Science.gov (United States)

    Geng, Yanquan; Zhao, Xuesen; Yan, Yongda; Hu, Zhenjiang

    2014-05-01

    This paper presents a novel atomic force microscopy (AFM)-based methodology for measurement of axial and radial error motions of a high precision spindle. Based on a modified commercial AFM system, the AFM tip is employed as a cutting tool by which nano-grooves are scratched on a flat surface with the rotation of the spindle. By extracting the radial motion data of the spindle from the scratched nano-grooves, the radial error motion of the spindle can be calculated after subtracting the tilting errors from the original measurement data. Through recording the variation of the PZT displacement in the Z direction in AFM tapping mode during the spindle rotation, the axial error motion of the spindle can be obtained. Moreover the effects of the nano-scratching parameters on the scratched grooves, the tilting error removal method for both conditions and the method of data extraction from the scratched groove depth are studied in detail. The axial error motion of 124 nm and the radial error motion of 279 nm of a commercial high precision air bearing spindle are achieved by this novel method, which are comparable with the values provided by the manufacturer, verifying this method. This approach does not need an expensive standard part as in most conventional measurement approaches. Moreover, the axial and radial error motions of the spindle can both be obtained, indicating that this is a potential means of measuring the error motions of the high precision moving parts of ultra-precision machine tools in the future.

  12. A novel pyrogallol red-based assay to assess catalase activity: Optimization by response surface methodology.

    Science.gov (United States)

    Abderrahim, Mohamed; Arribas, Silvia M; Condezo-Hoyos, Luis

    2017-05-01

Pyrogallol red (PGR) was identified as a novel optical probe for the detection of hydrogen peroxide (H2O2) based on horseradish peroxidase (HRP)-catalyzed oxidation. Response surface methodology (RSM) was applied as a tool to optimize the concentrations of PGR (100 µmol L(-1)), HRP (1 U mL(-1)) and H2O2 (250 µmol L(-1)) and used to develop a sensitive PGR-based catalase (CAT) activity assay (PGR-CAT assay). N-ethylmaleimide (NEM, 102 mmol L(-1)) was used to avoid interference produced by thiol groups while protecting CAT activity. Incubation time (30 min) for samples or CAT used as standard and H2O2, as well as signal stability (stable between 5 and 60 min), were also evaluated. The PGR-CAT assay was linear within the range of 0-4 U mL(-1) (R(2)=0.993) and very sensitive, with a limit of detection (LOD) of 0.005 U mL(-1) and a limit of quantitation (LOQ) of 0.01 U mL(-1). The PGR-CAT assay showed adequate intra-day RSD = 0.6-9.5% and inter-day RSD = 2.4-8.9%. Bland-Altman analysis and Passing-Bablok and Pearson correlation analyses showed good agreement between CAT activity as measured by the PGR-CAT assay and the Amplex Red assay. The PGR-CAT assay is more sensitive than all other colorimetric assays reported, particularly the Amplex Red assay, and the cost of PGR is a small fraction (about 1/1000) of that of an Amplex Red probe, so it can be expected to find wide use among scientists studying CAT activity in biological samples.

  13. Redesigning Acquisition Processes: A New Methodology Based on the Flow of Knowledge and Information

    Science.gov (United States)

    2001-07-01

The report discusses the InfoDesign methodology, which was developed as part of this project; a methodology for process redesign is necessarily... One acquisition process, involving the U.S. Government and one key supplier, was analyzed and redesigned. The process redesign proposal was...

  14. Revised Design-Based Research Methodology for College Course Improvement and Application to Education Courses in Japan

    Science.gov (United States)

    Akahori, Kanji

    2011-01-01

    The author describes a research methodology for college course improvement, and applies the results to education courses. In Japan, it is usually difficult to carry out research on college course improvement, because faculty cannot introduce experimental design approaches based on control and treatment groupings of students in actual classroom…

  15. Putting Order into Our Universe: The Concept of "Blended Learning"--A Methodology within the Concept-Based Terminology Framework

    Science.gov (United States)

    Fernandes, Joana; Costa, Rute; Peres, Paula

    2016-01-01

    This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: "blended learning." Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting…

  16. Towards a common methodology to simulate tree mortality based on ring-width data

    Science.gov (United States)

    Cailleret, Maxime; Bigler, Christof; Bugmann, Harald; Davi, Hendrik; Minunno, Francesco; Peltoniemi, Mikko; Martínez-Vilalta, Jordi

    2015-04-01

Individual mortality is a key process of population and community dynamics, especially for long-lived species such as trees. As rates of background vegetation mortality and of massive diebacks have accelerated during recent decades and are expected to continue rising due to higher temperatures and increasing drought, there is a growing demand for early warning signals announcing that the likelihood of death is very high. Although physiological indicators have a high potential to predict tree mortality, their development requires intensive tree monitoring, which currently cannot be done on a representative sample of a population or across several species. An easier approach is to use radial growth data such as tree ring-width measurements. During the last decades, an increasing number of studies have aimed to derive such growth-mortality functions. However, because they followed different approaches concerning the choice of the sampling strategy (number of dead and living trees), the type of growth explanatory variables (growth level, growth trend variables…), and the length of the time-window (number of rings before death) used to calculate them, it is difficult to compare results among studies and to draw biological interpretations. We detail a new methodology for assessing reliable tree-ring based growth-mortality relationships using binomial logistic regression models. As examples we used published tree-ring datasets from Abies alba growing in 13 different sites, and from Nothofagus dombeyi and Quercus petraea located in a single site. Our first approach, based on constant samplings, aims to (1) assess the dependency of growth-mortality relationships on the statistical sampling scheme used; (2) determine the best length of the time-window used to calculate each growth variable; and (3) reveal the presence of intra-specific shifts in growth-mortality relationships. We also followed a Bayesian approach to build the best multi-variable logistic model considering
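
    As a hedged sketch of the kind of binomial logistic growth-mortality model described above (not the authors' model: the two predictors, the 10-ring window and the scikit-learn fit are illustrative assumptions), one tree-level fit might look like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def growth_features(ring_widths, window=10):
    """Return [recent growth level, recent growth trend] for one tree,
    computed over the last `window` rings (window choice is illustrative)."""
    recent = np.asarray(ring_widths[-window:], dtype=float)
    level = recent.mean()
    trend = np.polyfit(np.arange(window), recent, 1)[0]   # slope of recent growth
    return [level, trend]

def fit_growth_mortality(trees, status):
    """trees: list of ring-width arrays; status: 1 = dead, 0 = living.
    Returns a fitted binomial logistic regression model."""
    X = np.array([growth_features(rw) for rw in trees])
    y = np.asarray(status)
    return LogisticRegression().fit(X, y)

# usage: model = fit_growth_mortality(tree_series, dead_flags)
#        p_death = model.predict_proba(np.array([growth_features(rw)]))[:, 1]
```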

  17. Study on the Optimization of Bio-emulsifier Production by Geobacillus sp.XS2 Based on Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

[Objective] The aim was to study the optimization of bio-emulsifier production by Geobacillus sp. XS2 based on response surface methodology. [Method] Firstly, a single-factor experiment was conducted to find the main medium components influencing bio-emulsifier production by Geobacillus sp. XS2; a response surface model was then established using response surface methodology and Design-Expert 7.0, so as to optimize the fermentation medium for bio-emulsifier production by Geobacillus sp. XS2. [Result] Glucose...

  18. Depleted zinc: Properties, application, production.

    Science.gov (United States)

    Borisevich, V D; Pavlov, A V; Okhotina, I A

    2009-01-01

The addition of ZnO, depleted in the Zn-64 isotope, to the water of boiling water nuclear reactors lessens the accumulation of Co-60 on the reactor interior surfaces, reduces radioactive wastes and increases the reactor service life because of the inhibitory action of zinc on inter-granular stress corrosion cracking. To the same effect, depleted zinc in the form of acetate dihydrate is used in pressurized water reactors. The gas centrifuge isotope separation method is applied for the production of depleted zinc on an industrial scale. More than 20 years of depleted zinc application history demonstrates its benefits for reducing NPP personnel radiation exposure and combating corrosion of construction materials.

  19. An Isogeometric Design-through-analysis Methodology based on Adaptive Hierarchical Refinement of NURBS, Immersed Boundary Methods, and T-spline CAD Surfaces

    Science.gov (United States)

    2012-01-22

ICES REPORT 12-05, January 2012 (M.J. Borden, E. Rank, T.J.R. Hughes, et al.): An Isogeometric Design-through-analysis Methodology based on Adaptive Hierarchical Refinement of NURBS, Immersed Boundary Methods, and T-spline CAD Surfaces.

  20. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    Science.gov (United States)

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

Conventionally, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent of risk reduction achieved by each successive safety measure. It also tells us, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.

  1. Novel methodology for 3D reconstruction of carotid arteries and plaque characterization based upon magnetic resonance imaging carotid angiography data.

    Science.gov (United States)

    Sakellarios, Antonis I; Stefanou, Kostas; Siogkas, Panagiotis; Tsakanikas, Vasilis D; Bourantas, Christos V; Athanasiou, Lambros; Exarchos, Themis P; Fotiou, Evangelos; Naka, Katerina K; Papafaklis, Michail I; Patterson, Andrew J; Young, Victoria E L; Gillard, Jonathan H; Michalis, Lampros K; Fotiadis, Dimitrios I

    2012-10-01

    In this study, we present a novel methodology that allows reliable segmentation of the magnetic resonance images (MRIs) for accurate fully automated three-dimensional (3D) reconstruction of the carotid arteries and semiautomated characterization of plaque type. Our approach uses active contours to detect the luminal borders in the time-of-flight images and the outer vessel wall borders in the T(1)-weighted images. The methodology incorporates the connecting components theory for the automated identification of the bifurcation region and a knowledge-based algorithm for the accurate characterization of the plaque components. The proposed segmentation method was validated in randomly selected MRI frames analyzed offline by two expert observers. The interobserver variability of the method for the lumen and outer vessel wall was -1.60%±6.70% and 0.56%±6.28%, respectively, while the Williams Index for all metrics was close to unity. The methodology implemented to identify the composition of the plaque was also validated in 591 images acquired from 24 patients. The obtained Cohen's k was 0.68 (0.60-0.76) for lipid plaques, while the time needed to process an MRI sequence for 3D reconstruction was only 30 s. The obtained results indicate that the proposed methodology allows reliable and automated detection of the luminal and vessel wall borders and fast and accurate characterization of plaque type in carotid MRI sequences. These features render the currently presented methodology a useful tool in the clinical and research arena.

  2. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basics of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  3. A methodological approach to characterise Landslide Periods based on historical series of rainfall and landslide damage

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-10-01

Landslide Periods (LPs) are defined as periods, shorter than a hydrological year, during which one or more landslide damage events occur in one or more sectors of a study area. In this work, we present a methodological approach, based on the comparative analysis of historical series of landslide damage and daily rainfall data, aiming to characterise the main types of LPs affecting selected areas. Cumulative rainfall preceding landslide activation is assessed for short (1, 2, 3, and 5 days), medium (7, 10, and 30 days) and long (60, 90, and 180 days) durations, and the corresponding Return Periods (RPs) are assessed and ranked into three classes (Class 1: RP = 5-10 years; Class 2: RP = 11-15 years; Class 3: RP > 15 years). To assess landslide damage, the Simplified Damage Index (SDI) is introduced. This represents classified landslide losses and is obtained by multiplying the value of the damaged element by the percentage of damage affecting it. The comparison of the rainfall RP and the SDI allows us to identify the different types of LPs that affected the study area in the past and that could affect it again in the future.

    The results of this activity can be used for practical purposes to define scenarios and strategies for risk management, to suggest priorities in policy towards disaster mitigation and preparedness and to predispose defensive measures and civil protection plans ranked according to the types of LPs that must be managed.

We present an application performed for a 39-year series of rainfall/landslide damage data concerning a study area located in NE Calabria (Italy); in this case study, we identify four main types of LPs, which are ranked according to damage severity.
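
    A minimal sketch of the two rankings described in this record; the numeric value classes and the example elements are illustrative assumptions, since the abstract only specifies that the SDI multiplies the value of the damaged element by the percentage of damage, and that rainfall return periods fall into three classes.

```python
def rainfall_rp_class(rp_years):
    """Class 1: RP 5-10 y, Class 2: RP 11-15 y, Class 3: RP > 15 y."""
    if rp_years > 15:
        return 3
    if rp_years >= 11:
        return 2
    if rp_years >= 5:
        return 1
    return 0   # below the lowest threshold considered

def simplified_damage_index(damaged_elements):
    """damaged_elements: iterable of (element_value_class, damage_fraction) pairs.
    Returns the summed SDI for one landslide damage event."""
    return sum(value * frac for value, frac in damaged_elements)

# example: a road (value class 3) 50 % damaged and a house (value class 4) 20 % damaged
# sdi = simplified_damage_index([(3, 0.5), (4, 0.2)])
# lp_type = (rainfall_rp_class(rp_years=12), sdi)   # pair used to characterise the LP
```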

  4. Methodology of using CFD-based risk assessment in road tunnels

    Directory of Open Access Journals (Sweden)

    Vidmar Peter

    2007-01-01

The deterministic approach in safety analyses stems from the need to understand the conditions that arise during a fire accident in a road tunnel. The key factor in tunnel operations during a fire is ventilation, which strongly affects the evacuation of people during the initial phases of the fire and, later, the access of intervention units to the tunnel. The paper presents the use of a computational fluid dynamics model in the tunnel safety assessment process. The model is validated by comparison with experimental data, quantifying the differences. The set-up of the initial and boundary conditions and the grid density requirement found during the validation tests are used to prepare three kinds of fire scenarios (20 MW, 50 MW, and 100 MW) with different ventilation conditions: natural, semi-transverse, fully transverse, and longitudinal ventilation. The observed variables, soot density and temperature, are presented in one-minute time steps through the entire tunnel length. Comparing the obtained data in a table allows the analysis of ventilation conditions for different fire heat release rates. The second step is to add additional criteria of human behaviour inside the tunnel (evacuation and human resistance to elevated gas concentrations and temperature). The result is a fully deterministic risk matrix based on the calculated data, in which risk is ranked on five levels, from the lowest to a very dangerous level. The deterministic risk matrix represents an alternative to a probabilistic safety assessment methodology, in which the fire risk is represented in detail and the computational fluid dynamics results are physically correct.

  5. A Novel Depletion-Mode MOS Gated Emitter Shorted Thyristor

    Institute of Scientific and Technical Information of China (English)

    张鹤鸣; 戴显英; 张义门; 马晓华; 林大松

    2000-01-01

A novel MOS-gated thyristor, the depletion-mode MOS gated emitter shorted thyristor (DMST), and its two structures are proposed. In the DMST, the channel of the depletion-mode MOS makes the thyristor emitter-base junction inherently shorted. The operation of the device is controlled by the interruption and recovery of the depletion-mode MOS P-channel. Excellent properties have been demonstrated by 2-D numerical simulations and tests on the fabricated chips.

  6. Study of A Multi-criteria Evaluation Methodology for Nuclear Fuel Cycle System Based on Sustainability

    Institute of Scientific and Technical Information of China (English)

    Liu Jingquan; Hidekazu Yoshikawa; OuYang Jun; Zhou Yangping

    2006-01-01

This paper presents a multi-criteria evaluation methodology for nuclear fuel cycle options in terms of energy sustainability. Starting from the general sustainability concept and a public acceptance questionnaire, a set of indicators reflecting specific criteria for the evaluation of nuclear fuel cycle options is defined. Particular attention is devoted to resource utilization efficiency, environmental effects, human health hazards and economic effects, which represent the concerns of different stakeholders. The methodology also integrates a special mathematical processing approach, namely the Extenics Evaluation Method, which quantifies subjective human perception to provide intuitionistic judgement and comparison of different options. The once-through and reprocessing options of the nuclear fuel cycle are examined using the proposed methodology. The assessment process and results can provide guidance for nuclear fuel cycle evaluation under the constraint of limited data.

  7. Spreadsheet based methodology to assess offshore wind capacity factors in project planning stage

    Energy Technology Data Exchange (ETDEWEB)

    Madariaga, Ander; Martin, Jose Luis; Martinez de Alegria, Inigo [University of the Basque Country (UPV/EHU), Bilbao (Spain). Engineering Faculty; Ceballos, Salvador [Tecnalia Research and Innovation, Parque Tecnologico de Bizkaia, Derio (Spain); Anaya-Lara, Olimpo [Strathclyde Univ., Glasgow (United Kingdom). Inst. for Energy and Environment

    2012-07-01

This paper presents a methodology to assess the effective capacity factor of an offshore wind power plant (OWPP). The electric power losses in all the systems that make up the wind farm are considered: the offshore wind turbines (OWTs), the collector system (CS) and the transmission system (TS). Other relevant issues, such as the wake effect and the unavailability of the systems that make up the wind farm, are also taken into account. The formulation of the methodology is fully analytic, with no simulation procedures for the evaluation of the electric power losses. A novel proposal to assess the losses in long submarine three-core cross-linked polyethylene (XLPE) cables is presented in detail. Due to its analytic structure, the methodology can be implemented either in algorithmic form or in a spreadsheet, enabling the evaluation of OWPP electric topologies in an agile way. It is aimed at engineering researchers and project planning engineers involved in the offshore wind industry. (orig.)
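
    A hedged sketch of how such a capacity-factor roll-up could be laid out in code rather than a spreadsheet; the Weibull wind model, the flat loss factors and all parameter names are illustrative assumptions, not the paper's analytic loss formulation for turbines, collector and transmission systems.

```python
import numpy as np

def effective_capacity_factor(power_curve_kw, wind_speeds, weibull_k, weibull_c,
                              wake_eff=0.92, electrical_eff=0.97, availability=0.95,
                              rated_kw=5000.0):
    """power_curve_kw[i] is the turbine output (kW) at wind_speeds[i] (m/s).
    Gross expected output is integrated against a Weibull wind-speed density,
    then reduced by wake, electrical-loss and availability factors."""
    dv = np.gradient(wind_speeds)
    pdf = (weibull_k / weibull_c) * (wind_speeds / weibull_c) ** (weibull_k - 1) \
          * np.exp(-(wind_speeds / weibull_c) ** weibull_k)
    gross_kw = np.sum(power_curve_kw * pdf * dv)       # expected gross output
    net_kw = gross_kw * wake_eff * electrical_eff * availability
    return net_kw / rated_kw

# usage: cf = effective_capacity_factor(pc, v, weibull_k=2.1, weibull_c=9.5)
```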

  8. A New Methodology of Multicriteria Decision-Making in Supplier Selection Based on Z-Numbers

    Directory of Open Access Journals (Sweden)

    Bingyi Kang

    2016-01-01

Supplier selection is a significant issue in multicriteria decision-making (MCDM) that has been heavily studied with classical fuzzy methodologies, but the reliability of the knowledge from domain experts is not efficiently taken into consideration. The Z-number, introduced by Zadeh, has more power to describe human knowledge under uncertain information, considering both restraint and reliability. In this paper, a methodology for supplier selection using Z-numbers is proposed, considering information transformation. It includes two parts: one solves the issue of how to convert a Z-number to a classic fuzzy number according to the fuzzy expectation; the other solves the problem of how to obtain the optimal priority weight for supplier selection with a genetic algorithm (GA), which is an efficient and flexible method for calculating the priority weights of the judgement matrix. Finally, an example of supplier selection is used to illustrate the effectiveness of the proposed methodology.

  9. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  10. Parametric Identification of Solar Series based on an Adaptive Parallel Methodology

    Indian Academy of Sciences (India)

    Juan A. Gómez Pulido; Miguel A. Vega Rodríguez; Juan M. Sánchez Pérez

    2005-03-01

In this work we present an adaptive parallel methodology to optimize the identification of time series through parametric models, applying it to the case of sunspot series. We employ high-precision computation of system identification algorithms, and use recursive least squares processing and ARMAX (AutoRegressive Moving Average with eXogenous inputs) parametric modelling. This methodology could be very useful when high-precision mathematical modelling of complex dynamic systems is required. After explaining the proposed heuristics and the tuning of its parameters, we show the results we have found for several solar series using different implementations. Thus, we demonstrate how the result precision improves.
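
    As a hedged illustration of the recursive least squares ingredient (shown here for an ARX-type regression; a full ARMAX identification would also estimate the moving-average noise terms, which are omitted), a minimal update loop might be:

```python
import numpy as np

def rls_identify(y, u, na=2, nb=2, lam=0.99):
    """Minimal recursive least squares sketch for
    y(t) = -a1*y(t-1) - ... - a_na*y(t-na) + b1*u(t-1) + ... + b_nb*u(t-nb).
    lam is the forgetting factor; returns the parameter vector [a..., b...]."""
    n = na + nb
    theta = np.zeros(n)
    P = np.eye(n) * 1e3                                  # large initial covariance
    for t in range(max(na, nb), len(y)):
        phi = np.r_[-y[t-na:t][::-1], u[t-nb:t][::-1]]   # regressor vector
        k = P @ phi / (lam + phi @ P @ phi)              # gain vector
        theta = theta + k * (y[t] - phi @ theta)         # parameter update
        P = (P - np.outer(k, phi) @ P) / lam             # covariance update
    return theta

# usage: theta_hat = rls_identify(series, exogenous_input, na=3, nb=2)
```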

  11. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    Directory of Open Access Journals (Sweden)

    Górny Z.

    2015-04-01

Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of the rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the methodology and the results of exploratory research. It also discloses the methodology used in the creation of a knowledge base.

  12. Ego depletion impairs implicit learning.

    Science.gov (United States)

    Thompson, Kelsey R; Sanchez, Daniel J; Wesley, Abigail H; Reber, Paul J

    2014-01-01

Implicit skill learning occurs incidentally and without conscious awareness of what is learned. However, the rate and effectiveness of learning may still be affected by decreased availability of central processing resources. Dual-task experiments have generally found impairments in implicit learning; however, these studies have also shown that certain characteristics of the secondary task (e.g., timing) can complicate the interpretation of these results. To avoid this problem, the current experiments used a novel method to impose resource constraints prior to engaging in skill learning. Ego depletion theory states that humans possess a limited store of cognitive resources that, when depleted, results in deficits in self-regulation and cognitive control. In a first experiment, we used a standard ego depletion manipulation prior to performance of the Serial Interception Sequence Learning (SISL) task. Depleted participants exhibited poorer test performance than did non-depleted controls, indicating that reducing available executive resources may adversely affect implicit sequence learning, expression of sequence knowledge, or both. In a second experiment, depletion was administered either prior to or after training. Participants who reported higher levels of depletion before or after training again showed less sequence-specific knowledge on the post-training assessment. However, the results did not allow for clear separation of ego depletion effects on learning versus subsequent sequence-specific performance. These results indicate that performance on an implicitly learned sequence can be impaired by a reduction in executive resources, in spite of learning taking place outside of awareness and without conscious intent.

  13. Depletable resources and the economy.

    NARCIS (Netherlands)

    Heijman, W.J.M.

    1991-01-01

    The subject of this thesis is the depletion of scarce resources. The main question to be answered is how to avoid future resource crises. After dealing with the complex relation between nature and economics, three important concepts in relation with resource depletion are discussed: steady state, ti

  14. A grey-based group decision-making methodology for the selection of hydrogen technologiess in Life Cycle Sustainability perspective

    DEFF Research Database (Denmark)

    Manzardo, Alessandro; Ren, Jingzheng; Mazzi, Anna;

    2012-01-01

The objective of this research is to develop a grey-based group decision-making methodology for the selection of the best renewable energy technology (including hydrogen) using a life cycle sustainability perspective. The traditional grey relational analysis has been modified to better address the issue of uncertainty. The proposed methodology allows multiple persons to participate in the decision-making process and to give linguistic evaluations of the weights of the criteria and the performance of the alternative technologies. In this paper, twelve hydrogen production technologies have been assessed using the proposed methodology; electrolysis of water by hydropower has been considered to be the best technology for hydrogen production according to the decision-making group.

  15. Mechanistic Methodology for Airport Pavement Design with Engineering Fabrics. Volume 1. Theoretical and Experimental Bases.

    Science.gov (United States)

    1984-08-01

Report DOT/FAA/PM-84/19: Mechanistic Methodology for Airport Pavement Design with Engineering Fabrics (Federal Aviation Administration, Program Engineering & Maintenance Service, Washington, D.C. 20591). Reflective cracks require labor-intensive operations for crack sealing and patching, thus becoming a significant maintenance expense item. The problem is complicated by the fact that models for predicting allowable critical strains are not available, and further by the fact that asphaltic concrete is a

  16. 3D Buildings Modelling Based on a Combination of Techniques and Methodologies

    NARCIS (Netherlands)

    Pop, G.; Bucksch, A.K.; Gorte, B.G.H.

    2007-01-01

    Three dimensional architectural models are more and more important for a large number of applications. Specialists look for faster and more precise ways to generate them. This paper discusses methods to combine methodologies for handling data acquired from multiple sources: maps, terrestrial laser a

  17. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    Science.gov (United States)

Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.

  18. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  19. Methodology for Evaluating an Adaptation of Evidence-Based Drug Abuse Prevention in Alternative Schools

    Science.gov (United States)

    Hopson, Laura M.; Steiker, Lori K. H.

    2008-01-01

    The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youths in alternative schools. This study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods…

  20. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    Science.gov (United States)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island), which is likely to be affected by volcanic phenomena in case of an eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on the user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
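
    A toy version of the damage-assessment overlay is sketched below: each combination of hazard intensity and vulnerability class is mapped to a qualitative damage rating. The rating table and the tiny example grids are hypothetical and are not taken from the VORIS tool or the Tenerife application.

    # Sketch of a qualitative damage-rating overlay on two small illustrative grids.
    hazard        = [[0, 1, 2], [1, 2, 2]]   # hypothetical hazard intensity classes per cell
    vulnerability = [[1, 1, 2], [0, 2, 1]]   # hypothetical vulnerability classes per cell

    rating = {  # (hazard class, vulnerability class) -> qualitative damage level (assumed table)
        (0, 0): "none",  (0, 1): "light",    (0, 2): "light",
        (1, 0): "light", (1, 1): "moderate", (1, 2): "heavy",
        (2, 0): "light", (2, 1): "heavy",    (2, 2): "destroyed",
    }

    damage = [[rating[(h, v)] for h, v in zip(h_row, v_row)]
              for h_row, v_row in zip(hazard, vulnerability)]
    print(damage)   # one qualitative damage map per simulated hazardous phenomenon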

  1. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  2. Contrasts between Antarctic and Arctic ozone depletion.

    Science.gov (United States)

    Solomon, Susan; Portmann, Robert W; Thompson, David W J

    2007-01-09

    This work surveys the depth and character of ozone depletion in the Antarctic and Arctic using available long-term balloon-borne and ground-based records that cover multiple decades. Such data reveal changes in the range of ozone values including the extremes observed as polar air passes over the stations. Antarctic ozone observations reveal widespread and massive local depletion in the heart of the ozone "hole" region near 18 km, frequently exceeding 90%. Although some ozone losses are apparent in the Arctic during particular years, the depth of the ozone losses in the Arctic is considerably smaller, and their occurrence is far less frequent. Many Antarctic total integrated column ozone observations in spring since approximately the 1980s show values considerably below those ever observed in earlier decades. For the Arctic, there is evidence of some spring season depletion of total ozone at particular stations, but the changes are much less pronounced compared with the range of past data. Thus, the observations demonstrate that the widespread and deep ozone depletion that characterizes the Antarctic ozone hole is a unique feature on the planet.

  3. Soft-systems thinking for community-development decision making: A participative, computer-based modeling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cook, R.J.

    1987-01-01

    The normative-rational models used to ensure logical decision processes do not capture the complex nature of planning situations, and alternative methodologies that can improve the collection and use of qualitative data are scarce. The intent of this thesis is to design and apply a methodology that may help planners incorporate such data into policy analysis. To guide the application and allow for its evaluation, criteria are gleaned from the literature on computer modeling, human cognition, and group process. From this, a series of individual and group ideation techniques along with two computer-modeling procedures are combined to aid participant understanding and provide computation capabilities. The methodology is applied in the form of a case study in Door County, Wisconsin. The process and its results were evaluated by workshop participants and by three planners who were intent on using this information to help update a county master plan. Based on established criteria, their evaluations indicate that the soft-systems methodology devised in this thesis has potential for improving the collection and use of qualitative data for public-policy purposes.

  4. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    Science.gov (United States)

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted for assuring the good quality of waters, soils or sediments and achieving desirable environmental quality objectives. Therefore, the evaluation of the reliability of available data is of significant importance for analysing their possible use in the aforementioned processes. The thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws but, at the same time, of various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).
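
    The sketch below shows the flavour of a criteria-based reliability score for a single ecotoxicity test record. The criteria, weights and scores are hypothetical, and a plain weighted sum is used in place of the expert-derived fuzzy aggregation of the published methodology.

    # Simplified reliability scoring of one hypothetical ecotoxicity test record.
    criteria = {
        # criterion: (assumed weight, assumed score in [0, 1] for this record)
        "test_organism_documented":   (0.20, 1.0),
        "endpoint_clearly_defined":   (0.25, 0.8),
        "exposure_duration_reported": (0.15, 1.0),
        "chemical_purity_reported":   (0.20, 0.5),
        "controls_and_replicates":    (0.20, 0.7),
    }

    total_weight = sum(w for w, _ in criteria.values())
    reliability = sum(w * s for w, s in criteria.values()) / total_weight
    print(f"reliability score: {reliability:.2f}")  # could weight the record in an SSWD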

  5. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    Science.gov (United States)

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). This technique demonstrated the importance of the location of the histological section, showing that up to a 30% offset can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.

  6. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In the preparation of the methodology I have used the current needs of telecommunications companies, which are characterized mainly by high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on the very specific description of the procedure for collecting the business requirements and following the business strategy.

  7. A methodology for secure recovery of spacecrafts based on a trusted hardware platform

    Science.gov (United States)

    Juliato, Marcio; Gebotys, Catherine

    2017-02-01

    This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios arising from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are examined in a wide variety of contexts and are shown to be infeasible and totally independent of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimum implementation area, power consumption and bandwidth.

  8. Toward Improved Understanding of Food Security: A Methodological Examination Based in Rural South Africa.

    Science.gov (United States)

    Kirkland, Tracy; Kemp, Robert S; Hunter, Lori M; Twine, Wayne S

    2013-03-01

    Accurate measurement of household food security is essential to generate adequate information on the proportion of households experiencing food insecurity, especially in areas or regions vulnerable to food shortages and famine. This manuscript offers a methodological examination of three commonly used indicators of household food security - experience of hunger, dietary diversity, and coping strategies. Making use of data from the Agincourt Health and Demographic Surveillance Site in rural South Africa, we examine the association between the indicators themselves to improve understanding of the different insight offered by each food security "lens." We also examine how the choice of indicator shapes the profile of vulnerable households, with results suggesting that dietary diversity scores may not adequately capture broader food insecurity. Concluding discussion explores programmatic and policy implications as related to methodological choices.

  9. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimizing model of concurrent parameter and tolerance design is proposed in which the response mean equals its target in the target-being-best case. The optimizing function of the model is the sum of quality loss and tolerance cost, subject to the variance confidence region within which six sigma capability can be assured. An example is illustrated in order to compare the differences between the developed model and the parameter design with minimum variance. The results show ...

  10. Outline for a human-based assessment methodology of urban projects. The case of Polis Gondomar

    Directory of Open Access Journals (Sweden)

    Paolo Marcolin

    2015-06-01

    Full Text Available This paper presents the results of a preliminary research project proposing an assessment methodology for the actual impacts of urban projects on city development[1]. At this stage, the research has focused on the identification of indicators about the success of urban projects according to the accomplishment, functioning and use of public spaces. A case study is presented as a means for exploring these indicators: the intervention made by the Polis Programme at Gondomar.

  11. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    Right PLM components selection and investments increase business advantages. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and middle enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  12. Optimization design of the stratospheric airship's power system based on the methodology of orthogonal experiment

    Institute of Scientific and Technical Information of China (English)

    Jian LIU; Quan-bao WANG; Hai-tao ZHAO; Ji-an CHEN; Ye QIU; Deng-ping DUAN

    2013-01-01

    The optimization design of the power system is essential for stratospheric airships with paradoxical requirements of high reliability and low weight. The methodology of orthogonal experiment is presented to deal with the problem of the optimization design of the airship's power system. Mathematical models of the solar array, regenerative fuel cell, and power management subsystem (PMS) are presented. The basic theory of the method of orthogonal experiment is discussed, and the selection of factors and levels of the experiment and the choice of the evaluation function are also revealed. The proposed methodology is validated in the optimization design of the power system of the ZhiYuan-2 stratospheric airship. Results show that the optimal configuration is easily obtained through this methodology. Furthermore, the optimal configuration and three sub-optimal configurations are in the Pareto frontier of the design space. Sensitivity analyses for the weight and reliability of the airship's power system are presented.

  13. A GIS-based methodology to delineate potential areas for groundwater development: a case study from Kathmandu Valley, Nepal

    Science.gov (United States)

    Pandey, Vishnu P.; Shrestha, Sangam; Kazama, Futaba

    2013-06-01

    For an effective planning of activities aimed at recovering aquifer depletion and maintaining health of groundwater ecosystem, estimates of spatial distribution in groundwater storage volume would be useful. The estimated volume, if analyzed together with other hydrogeologic characteristics, may help delineate potential areas for groundwater development. This study proposes a GIS-based ARC model to delineate potential areas for groundwater development; where `A' stands for groundwater availability, `R' for groundwater release potential of soil matrix, and `C' for cost for groundwater development. The model is illustrated with a case of the Kathmandu Valley in Central Nepal, where active discussions are going on to develop and implement groundwater management strategies. The study results show that shallow aquifers have high groundwater storage potential (compared to the deep) and favorable areas for groundwater development are concentrated at some particular areas in shallow and deep aquifers. The distribution of groundwater storage and potential areas for groundwater development are then mapped using GIS.

  14. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available Uniaxial compressive strength (UCS) relates to a material's ability to withstand axially-directed pushing forces and is considered one of rock materials' most important mechanical properties. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations were thus proposed for predicting UCS as a function of rocks' index properties. An analytical hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data-sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative for the quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.
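
    The regression part of such a methodology can be illustrated with a small synthetic data set: two assumed index properties (porosity and P-wave velocity) are regressed against UCS. The numbers below are invented for illustration and do not reproduce the Turkish carbonate data.

    # Illustrative multiple regression of UCS on two index properties (synthetic data).
    import numpy as np

    porosity = np.array([2.0, 4.5, 7.0, 10.0, 12.5, 15.0])   # %
    vp       = np.array([6.1, 5.6, 5.2, 4.7, 4.3, 3.9])      # P-wave velocity, km/s
    ucs      = np.array([155., 130., 110., 85., 70., 55.])   # MPa

    X = np.column_stack([np.ones_like(porosity), porosity, vp])
    coeffs, *_ = np.linalg.lstsq(X, ucs, rcond=None)
    pred = X @ coeffs
    r = np.corrcoef(pred, ucs)[0, 1]
    print("coefficients (intercept, porosity, Vp):", np.round(coeffs, 2))
    print("correlation between predicted and measured UCS:", round(r, 3))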


  15. Module-based functional pathway enrichment analysis of a protein-protein interaction network to study the effects of intestinal microbiota depletion in mice.

    Science.gov (United States)

    Jia, Zhen-Yi; Xia, Yang; Tong, Danian; Yao, Jing; Chen, Hong-Qi; Yang, Jun

    2014-06-01

    Complex communities of microorganisms play important roles in human health, and alterations in the intestinal microbiota may induce intestinal inflammation and numerous diseases. The purpose of this study was to identify the key genes and processes affected by depletion of the intestinal microbiota in a murine model. The Affymetrix microarray dataset GSE22648 was downloaded from the Gene Expression Omnibus database, and differentially expressed genes (DEGs) were identified using the limma package in R. A protein-protein interaction (PPI) network was constructed for the DEGs using the Cytoscape software, and the network was divided into several modules using the MCODE plugin. Furthermore, the modules were functionally annotated using the PiNGO plugin, and DEG-related pathways were retrieved and analyzed using the GenMAPP software. A total of 53 DEGs were identified, of which 26 were upregulated and 27 were downregulated. The PPI network of these DEGs comprised 3 modules. The most significant module-related DEGs were the cytochrome P450 (CYP) 4B1 isozyme gene (CYP4B1) in module 1, CYP4F14 in module 2 and the tachykinin precursor 1 gene (TAC1) in module 3. The majority of enriched pathways of module 1 and 2 were oxidation reduction pathways (metabolism of xenobiotics by CYPs) and lipid metabolism-related pathways, including linoleic acid and arachidonic acid metabolism. The neuropeptide signaling pathway was the most significantly enriched functional pathway of module 3. In conclusion, our findings strongly suggest that intestinal microbiota depletion affects cellular metabolism and oxidation reduction pathways. In addition, this is the first time, to the best of our knowledge, that the neuropeptide signaling pathway is reported to be affected by intestinal microbiota depletion in mice. The present study provides a list of candidate genes and processes related to the interaction of microbiota with the intestinal tract.

  16. 12.5 Gbps optical modulation of silicon racetrack resonator based on carrier-depletion in asymmetric p-n diode.

    Science.gov (United States)

    You, Jong-Bum; Park, Miran; Park, Jeong-Woo; Kim, Gyungock

    2008-10-27

    We present high-speed optical modulation using the carrier-depletion effect in an asymmetric silicon p-n diode resonator. To optimize coupling efficiency and reduce bending loss, a two-step-etched waveguide is used in the racetrack resonator with a directional coupler. The quality factor of the resonator with a circumference of 260 μm is 9,482, and the DC on/off ratio is 8 dB at -12 V. The device shows a 3 dB bandwidth of approximately 8 GHz and data transmission up to 12.5 Gbit/s.

  17. A methodology to ensure local mass conservation for porous media models under finite element formulations based on convex optimization

    Science.gov (United States)

    Chang, J.; Nakshatrala, K.

    2014-12-01

    It is well known that the standard finite element methods, in general, do not satisfy element-wise mass/species balance properties. It is, however, desirable to have element-wise mass balance property in subsurface modeling. Several studies over the years have aimed to overcome this drawback of finite element formulations. Currently, a post-processing optimization-based methodology is commonly employed to recover the local mass balance for porous media models. However, such a post-processing technique does not respect the underlying variational structure that the finite element formulation may enjoy. Motivated by this, a consistent methodology to satisfy element-wise local mass balance for porous media models is constructed using convex optimization techniques. The assembled system of global equations is reconstructed into a quadratic programming problem subjected to bounded equality constraints that ensure conservation at the element level. The proposed methodology can be applied to any computational mesh and to any non-locally conservative nodal-based finite element method. Herein, we integrate our proposed methodology into the framework of the classical mixed Galerkin formulation using Taylor-Hood elements and the least-squares finite element formulation. Our numerical studies will include computational cost, numerical convergence, and comparison with popular methods. In particular, it will be shown that the accuracy of the solutions is comparable with that of several popular locally conservative finite element formulations like the lowest order Raviart-Thomas formulation. We believe the proposed optimization-based approach is a viable approach to preserve local mass balance on general computational grids and is amenable for large-scale parallel implementation.
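
    A minimal sketch of the underlying idea follows: a small equality-constrained quadratic programme (minimise a quadratic objective subject to element-wise balance constraints) is solved through its KKT system. The quadratic form used here (distance to a reference vector) and the tiny matrices are illustrative stand-ins for the reconstructed assembled system, not the paper's formulation.

    # Equality-constrained quadratic programme solved via its KKT system:
    #   minimise 0.5*||x - x0||^2   subject to   A x = b
    # x0, A and b are tiny illustrative stand-ins for assembled FE quantities.
    import numpy as np

    x0 = np.array([1.0, 2.0, 3.0, 4.0])           # e.g. fluxes from a standard FE solve
    A  = np.array([[1.0, 1.0, 0.0, 0.0],          # each row: one element's balance constraint
                   [0.0, 0.0, 1.0, 1.0]])
    b  = np.array([2.5, 7.5])                     # required per-element net flux

    n, m = x0.size, b.size
    KKT = np.block([[np.eye(n), A.T],
                    [A, np.zeros((m, m))]])
    rhs = np.concatenate([x0, b])
    sol = np.linalg.solve(KKT, rhs)
    x = sol[:n]                                   # locally conservative solution
    print("corrected solution:", x, " constraint residual:", A @ x - b)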

  18. Post flood damage data collection and assessment in Albania based on DesInventar methodology

    Science.gov (United States)

    Toto, Emanuela; Massabo, Marco; Deda, Miranda; Rossello, Laura

    2015-04-01

    In 2013, a collection of disaster loss data based on DesInventar was implemented in Albania. The DesInventar system consists of a methodology and a software tool that lead to the systematic collection, documentation and analysis of loss data on disasters. The main sources of information about disasters used for the Albanian database were the Albanian Ministry of Internal Affairs, the National Library and the State archive. Specifically for floods, the database created contains nearly 900 datasets covering a period of 148 years (from 1865 to 2013). The data are georeferenced on the administrative units of Albania: Regions, Provinces and Municipalities. The datasets describe the events by reporting the date of occurrence, the duration, the localization in administrative units and the cause. Additional information regards the effects and damage that the event caused on people (deaths, injured, missing, affected, relocated, evacuated, victims) and on houses (houses damaged or destroyed). Other quantitative indicators are the losses in local currency or US dollars, the damage to roads, the crops affected, the lost cattle and the involvement of social elements over the territory such as education and health centers. Qualitative indicators simply register the sectors (e.g. transportation, communications, relief, agriculture, water supply, sewerage, power and energy, industries, education, health sector, other sectors) that were affected. Through the queries and analysis of the data collected it was possible to identify the most affected areas, the economic loss, the damage in agriculture, the houses and people affected and many other variables. The most vulnerable Regions for past floods in Albania were identified, as well as the rivers that cause the most damage in the country. Other analyses help to estimate the damage and losses during the main flood events of recent years, which occurred in 2010 and 2011, and to recognize the most affected sectors. The database was

  19. Cord blood glutathione depletion in preterm infants: correlation with maternal cysteine depletion.

    Directory of Open Access Journals (Sweden)

    Alice Küster

    Full Text Available BACKGROUND: Depletion of blood glutathione (GSH), a key antioxidant, is known to occur in preterm infants. OBJECTIVE: Our aim was to determine: 1) whether GSH depletion is present at the time of birth; and 2) whether it is associated with insufficient availability of cysteine (cys), the limiting GSH precursor, or a decreased capacity to synthesize GSH. METHODOLOGY: Sixteen mothers delivering very low birth weight infants (VLBW), and 16 mothers delivering healthy, full term neonates were enrolled. Immediately after birth, erythrocytes from umbilical vein, umbilical artery, and maternal blood were obtained to assess GSH [GSH] and cysteine [cys] concentrations, and the GSH synthesis rate was determined from the incorporation of labeled cysteine into GSH in isolated erythrocytes ex vivo, measured using gas chromatography mass spectrometry. PRINCIPAL FINDINGS: Compared with mothers delivering at full term, mothers delivering prematurely had markedly lower erythrocyte [GSH] and [cys], and these were significantly depressed in VLBW infants compared with term neonates. A strong correlation was found between maternal and fetal GSH and cysteine levels. The capacity to synthesize GSH was as high in VLBW as in term infants. CONCLUSION: The current data demonstrate that: 1) GSH depletion is present at the time of birth in VLBW infants; 2) as VLBW neonates possess a fully active capacity to synthesize glutathione, the depletion may arise from inadequate cysteine availability, potentially due to maternal depletion. Further studies would be needed to determine whether maternal-fetal cysteine transfer is decreased in preterm infants, and, if so, whether cysteine supplementation of mothers at risk of delivering prematurely would strengthen antioxidant defense in preterm neonates.

  20. The bounds on tracking performance utilising a laser-based linear and angular sensing and measurement methodology for micro/nano manipulation

    OpenAIRE

    Clark, Leon; Shirinzadeh, Bijan; Tian, Yanling; Zhong, Yongmin

    2014-01-01

    This paper presents an analysis of the tracking performance of a planar three degrees of freedom (DOF) flexure-based mechanism for micro/nano manipulation, utilising a tracking methodology for the measurement of coupled linear and angular motions. The methodology permits trajectories over a workspace with large angular range through the reduction of geometric errors. However, when combining this methodology with feedback control systems, the accuracy of performed manipulations can only be sta...

  1. A Methodology for Platform Based High—Level System—on—Chip Verification

    Institute of Scientific and Technical Information of China (English)

    GAO Feng; LIU Peng; YAO Qingdong

    2003-01-01

    The time-to-market challenge has increased the need for shortening the co-verification time in system-on-chip development. In this article, a new methodology of high-level hardware/software co-verification is introduced. With the help of the real-time operating system, the application program can easily be migrated from the software simulator to the hardware emulation board. The hierarchical architecture can be used to separate the application program from the implementation of the platform during the verification process. The high-level verification platform is successfully used in developing the HDTV decoding chip.

  2. Methodology for Definition of Yellow Fever Priority Areas, Based on Environmental Variables and Multiple Correspondence Analyses

    OpenAIRE

    Eduardo Stramandinoli Moreno; Rita de Cássia Barradas Barata

    2012-01-01

    Yellow fever (YF) is endemic in much of Brazil, where cases of the disease are reported every year. Since 2008, outbreaks of the disease have occurred in regions of the country where no reports had been registered for decades, which has obligated public health authorities to redefine risk areas for the disease. The aim of the present study was to propose a methodology of environmental risk analysis for defining priority municipalities for YF vaccination, using as example, the State of São Pau...

  3. A problem-based approach to teaching research methodology to medical graduates in Iran

    Directory of Open Access Journals (Sweden)

    Mehrdad Jalalian Hosseini

    2009-08-01

    Full Text Available Physicians are reticent to participate in research projects for a variety of reasons. Facilitating the active involvement of doctors in research projects is a high priority for the Iranian Blood Transfusion Organization (IBTO). A one-month training course on research methodology was conducted for a group of physicians in Mashhad, in northeast Iran. The participants were divided into ten groups. They prepared a research proposal under the guidance of a workshop leader. The quality of the research proposals, which were prepared by all participants, went beyond our expectations. All of the research proposals were relevant to blood safety. In this brief report we describe our approach.

  4. A Combined Heuristic and Indicator-based Methodology for Design of Sustainable Chemical Process Plants

    DEFF Research Database (Denmark)

    Halim, Iskandar; Carvalho, Ana; Srinivasan, Rajagopalan;

    2011-01-01

    The current emphasis on sustainable production has prompted chemical plants to minimize raw material and energy usage without compromising on economics. While computer tools are available to assist in sustainability assessment, their applications are constrained to a specific domain of the design synthesis problem. This paper outlines a design synthesis strategy that integrates two computer methodologies – ENVOPExpert and SustainPro – for simultaneous generation, analysis, evaluation, and optimization of sustainable process alternatives. ENVOPExpert diagnoses waste sources and identifies alternatives ... comprehensive generation of design alternatives, and effective reduction of the optimization search space. The framework is illustrated using an acetone process and a methanol and dimethyl ether production case study.

  5. Towards a complete propagation uncertainties in depletion calculations

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.S. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering; Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Zwermann, W.; Gallner, L.; Puente-Espel, Federico; Velkov, K.; Hannstein, V. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Garching (Germany); Cabellos, O. [Universidad Politecnica de Madrid (Spain). Dept. of Nuclear Engineering

    2013-07-01

    Propagation of nuclear data uncertainties to calculated values is interesting for design purposes and library evaluation. XSUSA, developed at GRS, propagates cross section uncertainties to nuclear calculations. In depletion simulations, fission yields and decay data are also involved and are a possible source of uncertainty that should be taken into account. We have developed tools to generate varied fission yield and decay libraries and to propagate uncertainties through depletion in order to complete the XSUSA uncertainty assessment capabilities. A generic test to probe the methodology is defined and discussed. (orig.)
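
    The propagation idea can be caricatured with a one-nuclide toy depletion problem: each Monte Carlo sample perturbs the decay constant and fission yield, the depletion is re-solved, and the spread of the end-of-step inventory is reported. Everything below (nuclide data, uncertainties, production rate) is an assumed illustration and is unrelated to the XSUSA implementation.

    # Toy Monte Carlo propagation of decay/fission-yield uncertainties through depletion.
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 500
    t = 1.0e6                      # irradiation time (s), illustrative

    lam_nominal   = 1.0e-7         # decay constant (1/s), illustrative
    yield_nominal = 0.06           # cumulative fission yield, illustrative
    production    = 1.0e10         # fission rate times normalisation, illustrative

    results = []
    for _ in range(n_samples):
        lam = lam_nominal * rng.normal(1.0, 0.02)    # assumed 2% relative uncertainty
        fy  = yield_nominal * rng.normal(1.0, 0.05)  # assumed 5% relative uncertainty
        # analytic solution of dN/dt = production*fy - lam*N with N(0) = 0
        results.append(production * fy / lam * (1.0 - np.exp(-lam * t)))

    results = np.array(results)
    print(f"end-of-step inventory: mean = {results.mean():.3e}, "
          f"relative std = {results.std() / results.mean():.2%}")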

  6. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    Science.gov (United States)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.

  7. Preliminary Evaluation on In-vessel Source Term based on 4S Methodology in PGSFR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seong Won; Chang, Won-Pyo; Seong, Seung Hwan; Ahn, Sang June; Kang, Seok Hun; Choi, Chi-Woong; Lee, Jin Yoo; Lee, Kwi Lim; Jeong, Jae-Ho; Jeong, Taekyeong; Ha, Kwi-Seok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    This strategy requires nuclear plants to have features that prevent radionuclide release and multiple barriers to the escape from the plants of any radionuclides that are released despite preventive measures. Considerations of the ability to prevent and mitigate release of radionuclides arise at numerous places in the safety regulations of nuclear plants. The effectiveness of mitigative capabilities in nuclear plants is subject to quantitative analysis. The radionuclide input to these quantitative analyses of effectiveness is the Source Term (ST). All features of the composition, magnitude, timing, chemical form and physical form of accidental radionuclide release constitute the ST. The ST is also defined as the release of radionuclides from the fuel and coolant into the containment, and subsequently to the environment. Many of the assumptions and equations evaluated in 4S are used here. The in-vessel STs are calculated through several phases: the inventory of each radionuclide is calculated by the ORIGEN-2 code using the peak burnup conditions; the nominal value of the radiological inventory is multiplied by a factor of 1.5 as an uncertainty margin to give the radiological inventory; the ST released from the core to the primary sodium is calculated using the assumptions of the 4S methodology; and lastly, the ST released from the primary sodium to the cover gas space is calculated using the assumptions of the 4S methodology.

  8. A Methodology for Mapping Meanings in Text-Based Sustainability Communication

    Directory of Open Access Journals (Sweden)

    Mark Brown

    2013-06-01

    Full Text Available In moving society towards more sustainable forms of consumption and production, social learning must play an important role. Making the assumption that it occurs as a consequence of changes in understanding, this article presents a methodology for mapping meanings in sustainability communication texts. The methodology uses techniques from corpus linguistics and framing theory. Two large databases of text were constructed by copying material down from the websites of two different groups of social actors: (i) environmental NGOs and (ii) British green business, and saving it as .txt files. The findings on individual words show that the NGOs and business use them very differently. Focusing on words expressing concern for the natural environment, it is proposed that the two actors also conceptualize their concern differently. Green business's cognitive system of concern has two well-developed frames: good intentions and risk management. However, three frames (concern for the natural environment, perception of the damage, and responsibility) are light on detail. In contrast, within the NGOs' system of concern, the frames of concern for the natural environment, perception of the damage and responsibility contain words making detailed representations.

  9. Artificial Neural Network based γ-hadron segregation methodology for TACTIC telescope

    Energy Technology Data Exchange (ETDEWEB)

    Dhar, V.K., E-mail: veer@barc.gov.in [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Tickoo, A.K.; Koul, M.K.; Koul, R. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Dubey, B.P. [Electronics and Instrumentation Services Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Rannot, R.C.; Yadav, K.K.; Chandra, P.; Kothari, M.; Chanchalani, K.; Venugopal, K. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India)

    2013-04-21

    The sensitivity of a Cherenkov imaging telescope is strongly dependent on the rejection of the cosmic-ray background events. The methods which have been used to achieve the segregation between the γ-rays from the source and the background cosmic-rays include methods like Supercuts/Dynamic Supercuts, the Maximum likelihood classifier, Kernel methods, Fractals, Wavelets and random forests. While the segregation potential of the neural network classifier has been investigated in the past with modest results, the main purpose of this paper is to study the γ/hadron segregation potential of various ANN algorithms, some of which are supposed to be more powerful in terms of better convergence and lower error compared to the commonly used Backpropagation algorithm. The results obtained suggest that the Levenberg–Marquardt method outperforms all other methods in the ANN domain. Applying this ANN algorithm to ∼101.44 h of Crab Nebula data collected by the TACTIC telescope during November 10, 2005–January 30, 2006, yields an excess of ∼(1141±106) with a statistical significance of ∼11.07σ, as against an excess of ∼(928±100) with a statistical significance of ∼9.40σ obtained with the Dynamic Supercuts selection methodology. The main advantage accruing from the ANN methodology is that it is more effective at higher energies, and this has allowed us to re-determine the Crab Nebula energy spectrum in the energy range ∼1–24 TeV.
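
    As a hedged illustration of Levenberg–Marquardt training for gamma/hadron separation, the sketch below fits a tiny one-hidden-layer network to synthetic two-feature data using SciPy's least-squares solver with method="lm". The features, network size and data are invented stand-ins for Hillas-type image parameters and are not the TACTIC analysis chain.

    # Tiny gamma/hadron classifier trained with Levenberg-Marquardt (synthetic data).
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(1)
    n = 200
    X_gamma  = rng.normal([0.25, 0.12], 0.04, size=(n, 2))   # assumed compact gamma images
    X_hadron = rng.normal([0.40, 0.22], 0.08, size=(n, 2))   # assumed broader hadron images
    X = np.vstack([X_gamma, X_hadron])
    y = np.concatenate([np.ones(n), np.zeros(n)])             # 1 = gamma, 0 = hadron

    n_hidden = 4
    def unpack(w):
        w1 = w[:2 * n_hidden].reshape(2, n_hidden)
        b1 = w[2 * n_hidden:3 * n_hidden]
        w2 = w[3 * n_hidden:4 * n_hidden]
        return w1, b1, w2, w[-1]

    def forward(w, X):
        w1, b1, w2, b2 = unpack(w)
        h = np.tanh(X @ w1 + b1)
        return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

    def residuals(w):
        return forward(w, X) - y

    w0 = rng.normal(0.0, 0.5, 4 * n_hidden + 1)
    fit = least_squares(residuals, w0, method="lm")   # Levenberg-Marquardt
    accuracy = ((forward(fit.x, X) > 0.5) == y).mean()
    print("training accuracy:", accuracy)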

  10. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    Energy Technology Data Exchange (ETDEWEB)

    Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.

    1984-01-01

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different Countries of the E.C.

  11. Information Systems Planning based on the BSP methodology: the DETRAN/AL case

    Directory of Open Access Journals (Sweden)

    Adiel Teixeira de Almeida

    2008-12-01

    Full Text Available The ability to generate, process and transmit information is the first step of a production process that finishes with its application in the process of value aggregation to products and services. However, in order to provide access to the necessary information, organizations' investments in technology are not enough: they must also invest in the information infrastructure. In this context, Information Systems planning should ensure that investments in Information Systems are aligned with the organizational strategy. This article presents the results of the application of the Business System Planning methodology to the Information System planning of the administrative and financial area of the Transit Department of the Alagoas State. The choice of the BSP methodology was motivated by its emphasis on business processes, and it provided the identification of the strategic vision of the organization, the business processes of the administrative and financial area and the groups of information necessary for the procedures. Additionally, through the use of prioritization software it was possible to establish groups of information services modules to be developed in a first horizon.

  12. Depleting depletion: Polymer swelling in poor solvent mixtures

    Science.gov (United States)

    Mukherji, Debashish; Marques, Carlos; Stuehn, Torsten; Kremer, Kurt

    A polymer collapses in a solvent when the solvent particles dislike monomers more than the repulsion between monomers. This leads to an effective attraction between monomers, also referred to as depletion induced attraction. This attraction is the key factor behind standard polymer collapse in poor solvents. Strikingly, even if a polymer exhibits poor solvent condition in two different solvents, it can also swell in mixtures of these two poor solvents. This collapse-swelling-collapse scenario is displayed by poly(methyl methacrylate) (PMMA) in aqueous alcohol. Using molecular dynamics simulations of a thermodynamically consistent generic model and theoretical arguments, we unveil the microscopic origin of this phenomenon. Our analysis suggests that a subtle interplay of the bulk solution properties and the local depletion forces reduces depletion effects, thus dictating polymer swelling in poor solvent mixtures.

  13. Estimating initial contaminant mass based on fitting mass-depletion functions to contaminant mass discharge data: Testing method efficacy with SVE operations data

    Science.gov (United States)

    Mainhagu, J.; Brusseau, M. L.

    2016-09-01

    The mass of contaminant present at a site, particularly in the source zones, is one of the key parameters for assessing the risk posed by contaminated sites, and for setting and evaluating remediation goals and objectives. This quantity is rarely known and is challenging to estimate accurately. This work investigated the efficacy of fitting mass-depletion functions to temporal contaminant mass discharge (CMD) data as a means of estimating initial mass. Two common mass-depletion functions, exponential and power functions, were applied to historic soil vapor extraction (SVE) CMD data collected from 11 contaminated sites for which the SVE operations are considered to be at or close to essentially complete mass removal. The functions were applied to the entire available data set for each site, as well as to the early-time data (the initial 1/3 of the data available). Additionally, a complete differential-time analysis was conducted. The latter two analyses were conducted to investigate the impact of limited data on method performance, given that the primary mode of application would be to use the method during the early stages of a remediation effort. The estimated initial masses were compared to the total masses removed for the SVE operations. The mass estimates obtained from application to the full data sets were reasonably similar to the measured masses removed for both functions (13 and 15% mean error). The use of the early-time data resulted in a minimally higher variation for the exponential function (17%) but a much higher error (51%) for the power function. These results suggest that the method can produce reasonable estimates of initial mass useful for planning and assessing remediation efforts.
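
    The fitting step can be sketched with synthetic mass-removal-rate data, as below. The exponential form follows M(t) = M0·exp(-kt), so its rate is M0·k·exp(-kt); the power form used here is a generic stand-in, and none of the numbers correspond to the SVE sites in the study.

    # Fit exponential and power mass-depletion functions to synthetic rate data.
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.array([0.5, 1, 2, 4, 8, 16, 32])        # time, arbitrary units
    rate = 120.0 * np.exp(-0.15 * t)               # synthetic mass removal rate

    def exp_rate(t, m0, k):
        """Rate implied by exponential depletion M(t) = m0 * exp(-k t)."""
        return m0 * k * np.exp(-k * t)

    def power_rate(t, a, p):
        """Rate implied by a generic power-law depletion model."""
        return a * t**(-p)

    (m0, k), _ = curve_fit(exp_rate, t, rate, p0=[1000.0, 0.1])
    (a, p), _ = curve_fit(power_rate, t, rate, p0=[100.0, 0.5])
    print(f"exponential model: initial mass ~ {m0:.0f}, rate constant ~ {k:.3f}")
    print(f"power model parameters: a = {a:.1f}, p = {p:.2f}")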

  14. A new EBSD based methodology for the quantitative characterisation of microstructures formed by displacive fcc-bcc transformations.

    Science.gov (United States)

    Zachrisson, J; Börjesson, J; Karlsson, L

    2013-02-01

    This work is concerned with a new methodology that can be used to quantify the degree to which grains in the microstructure are aligned in the form of packets. The methodology is based on a crystallographic definition of the term packet which is used to deduce the theoretically ideal misorientations of intra-packet grain boundaries. A misorientation distribution obtained from extensive EBSD mapping can thus be split into intra- and inter-packet misorientations and the corresponding fractions can be determined by integration. The theoretical framework of the methodology is explained and a step-by-step description of the procedure is given. Results from a trace analysis are provided to justify the assumptions made regarding habit plane and examples are included showing how the grain boundary network can be split into two separate parts, one for lath boundaries and the other for packet boundaries. Moreover, example weld metal microstructures along with the corresponding misorientation distributions as well as quantitative values of the microstructures are presented.

  15. SYNTHESIS METHODOLOGY FOR ACTIVE ELEMENT USING IMPEDANCES –BASED BOND GRAPH METHODS

    Directory of Open Access Journals (Sweden)

    Nasta TANASESCU

    2004-12-01

    Full Text Available This paper introduces a synthesis methodology for active elements within systems that uses the frequency response function as a basis for describing required behavior. The method is applicable in the design of a new system or in the retrofit of an existing system in which an active element is required or desired. The two basic principles of bond graph modeling are the "reticulation principle", which decomposes a physical system into elemental physical laws represented as network elements interacting through ports, and the "power postulate", which assembles the complete model through a network of power flows representing the exchange of energy between the elements. Moreover, the bond graph model leads to a rigorous definition of the structure of the associated dynamical equations.

  16. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud

    2010-10-04

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 pC/N, respectively. The d33 value is in agreement with that obtained from the Berlincourt method, which gave a d33 value of 500 pC/N. In addition, the d31 value is in agreement with the value obtained from the optical method, which gave a d31 value of 223 pC/V. These results suggest that the proposed method is a viable way to quickly estimate piezoelectric coefficients of bulk unclamped samples. © 2010 The Electrochemical Society.

  17. [Customer satisfaction in home care: methodological issues based on a survey carried out in Lazio].

    Science.gov (United States)

    Pasquarella, A; Marceca, M; Casagrande, S; Gentile, D; Zeppilli, D; Buonaiuto, N; Cozzolino, M; Guasticchi, G

    2007-01-01

    Home care customer satisfaction has been, until now, rarely evaluated. After illustrating the main Italian regional surveys on this issue, the article presents a customer satisfaction survey carried out in the district of Civitavecchia (Local Health Unit 'Rome F'), Lazio, regarding 30 home care beneficiaries. The methodological aspects emerging from the survey focus on: the advantages and disadvantages of quantitative and qualitative approaches (possibly combined with each other); the main eligibility criteria for the people selected for interviewing, whether patients or caregivers; and the conditions that maximize the reliability of answers, including the training of interviewers. The authors highlight the opportunity of using this kind of survey, integrated with other tools within a systemic vision, to promote management changes that address the problems identified, aimed at total quality management.

  18. The Case of Value Based Communication—Epistemological and Methodological Reflections from a System Theoretical Perspective

    Directory of Open Access Journals (Sweden)

    Victoria von Groddeck

    2010-09-01

    Full Text Available The aim of this paper is to reflect on the epistemological and methodological aspects of an empirical research study which analyzes the phenomenon of increased value communication within business organizations from a system theoretical perspective in the tradition of Niklas LUHMANN. Drawing on the theoretical term of observation, it shows how a research perspective can be developed which opens up the scope for an empirical analysis of communication practices. This analysis focuses on the reconstruction of these practices, first by understanding how these practices stabilize themselves and second by contrasting different practices to educe an understanding of different forms of observation of the relevant phenomenon and of the functions of these forms. Thus, this approach combines system theoretical epistemology, analytical research strategies, such as form and functional analysis, and qualitative research methods, such as narrative interviews, participant observation and document analysis. URN: urn:nbn:de:0114-fqs1003177

  19. Towards a Tool-based Development Methodology for Pervasive Computing Applications

    CERN Document Server

    Cassou, Damien; Consel, Charles; Balland, Emilie; 10.1109/TSE.2011.107

    2012-01-01

    Despite much progress, developing a pervasive computing application remains a challenge because of a lack of conceptual frameworks and supporting tools. This challenge involves coping with heterogeneous devices, overcoming the intricacies of distributed systems technologies, working out an architecture for the application, encoding it in a program, writing specific code to test the application, and finally deploying it. This paper presents a design language and a tool suite covering the development life-cycle of a pervasive computing application. The design language allows a taxonomy of area-specific building blocks to be defined, abstracting over their heterogeneity. This language also includes a layer to define the architecture of an application, following an architectural pattern commonly used in the pervasive computing domain. Our underlying methodology assigns roles to the stakeholders, providing separation of concerns. Our tool suite includes a compiler that takes design artifacts written in our language as...

  20. Quantifying price risk of electricity retailer based on CAPM and RAROC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Karandikar, R.G.; Khaparde, S.A.; Kulkarni, S.V. [Electrical Engineering Department, Indian Institute of Technology Bombay, Mumbai 400 076 (India)

    2007-12-15

    In restructured electricity markets, electricity retailers set up contracts with generation companies (GENCOs) and with end users to meet their load requirements at an agreed-upon tariff. The retailers invest consumer payments as capital in the volatile competitive market. In this paper, a model for quantifying the price risk of an electricity retailer is proposed. An IEEE 30-bus test system is used to demonstrate the model. The Capital Asset Pricing Model (CAPM) is used to determine the retail electricity price for the end users. The factor Risk Adjusted Return on Capital (RAROC) is used to quantify the price risk involved. The methodology proposed in this paper can be used by a retailer while submitting a proposal for the electricity tariff to the regulatory authority. (author)
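
    A back-of-the-envelope version of the two ingredients is sketched below. The CAPM line gives a required return on the retailer's capital and the RAROC line divides an expected profit by a capital-at-risk figure; every number, and the simple way the required return is folded into the tariff, is a hypothetical assumption rather than the paper's IEEE 30-bus calculation.

    # Illustrative CAPM-based tariff and RAROC computation (all figures hypothetical).
    risk_free_rate = 0.06    # assumed
    market_return  = 0.14    # assumed
    beta           = 0.9     # assumed sensitivity of the retail business to the market

    # CAPM: required rate of return on the retailer's invested capital
    required_return = risk_free_rate + beta * (market_return - risk_free_rate)

    purchase_cost = 42.0     # assumed average energy purchase cost ($/MWh)
    other_costs   = 5.0      # assumed network and operating costs ($/MWh)
    retail_price  = (purchase_cost + other_costs) * (1.0 + required_return)

    expected_profit = 4.0e6  # assumed expected annual profit ($)
    capital_at_risk = 2.5e7  # assumed worst-case loss from price volatility ($)
    raroc = expected_profit / capital_at_risk

    print(f"required return (CAPM): {required_return:.1%}")
    print(f"retail price: {retail_price:.1f} $/MWh,  RAROC: {raroc:.1%}")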

  1. [Nursing care systematization according to the nurses' view: a methodological approach based on grounded theory].

    Science.gov (United States)

    de Medeiros, Ana Lúcia; dos Santos, Sérgio Ribeiro; de Cabral, Rômulo Wanderley Lima

    2012-09-01

    This study was aimed at understanding, from the nurses' perspective, the experience of going through the Systematization of Nursing Care (SNC) in an obstetric service unit. We used grounded theory as the theoretical and methodological framework. The subjects of this study consisted of thirteen nurses from a public hospital in the city of João Pessoa, in the state of Paraíba. The data analysis resulted in the following phenomenon: "perceiving SNC as a working method that organizes, directs and improves the quality of care by bringing visibility and providing security for the nursing staff." The nurses expressed the extent of their knowledge about the SNC experienced in obstetrics, and considered the nursing process as a decision-making process which guides the reasoning of nurses in the planning of nursing care in obstetrics. It was concluded that nurses perceive the SNC as an instrument of theoretical-practical articulation leading to personalized assistance.

  2. DTC Based Induction Motor Speed Control Using 10-Sector Methodology for Torque Ripple Reduction

    Science.gov (United States)

    Pavithra, S.; Dinesh Krishna, A. S.; Shridharan, S.

    2014-09-01

    A direct torque control (DTC) drive allows direct and independent control of flux linkage and electromagnetic torque by the selection of optimum inverter switching modes. It is a simple method of signal processing which gives excellent dynamic performance; moreover, transformation of coordinates and voltage decoupling are not required. However, the possible discrete inverter switching vectors cannot always generate the exact stator voltage required to obtain the demanded electromagnetic torque and flux linkages. This results in the production of ripples in the torque as well as in the flux waveforms. In the present paper a torque ripple reduction methodology is proposed. In this method the circular locus of the flux phasor is divided into 10 sectors, as compared with the six-sector division in the conventional DTC method. The basic DTC scheme and the 10-sector method are simulated and their performance is compared. An analysis with increasing sector counts is also done, showing that the torque ripple changes only slightly as the number of sectors is increased further.
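    The abstract does not spell out how the flux plane is partitioned, but in a DTC scheme the sector index is normally obtained from the stator flux angle and then used, with the torque and flux hysteresis comparators, to select a voltage vector from a switching table. A minimal sketch of just that sector-selection step, assuming equally sized sectors (six in the conventional scheme, ten in the proposed one):

```python
import math

def flux_sector(psi_alpha, psi_beta, n_sectors=10):
    """Return the sector index (0..n_sectors-1) of the stator flux vector.

    Assumes equally sized sectors with the first sector centred on the alpha
    axis; a real controller would use this index together with hysteresis
    comparators to pick the voltage vector from a switching table.
    """
    angle = math.atan2(psi_beta, psi_alpha) % (2 * math.pi)
    width = 2 * math.pi / n_sectors
    return int(((angle + width / 2) % (2 * math.pi)) // width)

# Example: locate the same flux vector with 6 and with 10 sector divisions.
print(flux_sector(0.4, 0.3, n_sectors=6), flux_sector(0.4, 0.3, n_sectors=10))
```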

  3. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimizing model of concurrent parameter and tolerance design is proposed for the nominal-the-best case, in which the response mean equals its target. The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is illustrated in order to compare the differences between the developed model and parameter design with minimum variance. The results show that the proposed method not only achieves robustness, but also greatly reduces cost. The objectives of high quality and low cost of product and process can be achieved simultaneously by the application of six sigma concurrent parameter and tolerance design.
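    The paper's model is not reproduced in the record; as an illustration of the general idea only (minimise quality loss plus tolerance cost subject to a variance constraint), a toy formulation with entirely invented cost functions and coefficients can be set up with scipy:

```python
from scipy.optimize import minimize

# Toy concurrent parameter/tolerance design: decision variables are the
# parameter setting x and the tolerance t. All coefficients are invented.
A, k_loss, c0 = 5.0, 80.0, 12.0

def total_cost(v):
    x, t = v
    quality_loss = k_loss * ((x - 1.0) ** 2 + (t / 3.0) ** 2)  # Taguchi-style loss
    tolerance_cost = c0 + A / t                                # tighter tolerance costs more
    return quality_loss + tolerance_cost

# Six-sigma style constraint: predicted std dev (assumed ~ t/3) must not exceed sigma_max.
sigma_max = 0.05
cons = {"type": "ineq", "fun": lambda v: sigma_max - v[1] / 3.0}

res = minimize(total_cost, x0=[0.8, 0.12],
               bounds=[(0.0, 2.0), (1e-3, 1.0)], constraints=cons)
print(res.x, res.fun)
```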

  4. Voltage regulation in MV networks with dispersed generations by a neural-based multiobjective methodology

    Energy Technology Data Exchange (ETDEWEB)

    Galdi, Vincenzo [Dipartimento di Ingegneria dell' Informazione e Ingegneria Elettrica, Universita degli studi di Salerno, Via Ponte Don Melillo 1, 84084 Fisciano (Italy); Vaccaro, Alfredo; Villacci, Domenico [Dipartimento di Ingegneria, Universita degli Studi del Sannio, Piazza Roma 21, 82100 Benevento (Italy)

    2008-05-15

    This paper puts forward the role of learning techniques in addressing the problem of an efficient and optimal centralized voltage control in distribution networks equipped with dispersed generation systems (DGSs). The proposed methodology employs a radial basis function network (RBFN) to identify the multidimensional nonlinear mapping between a vector of observable variables describing the network operating point and the optimal set points of the voltage regulating devices. The RBFN is trained by numerical data generated by solving the voltage regulation problem for a set of network operating points by a rigorous multiobjective solution methodology. The RBFN performance is continuously monitored by a supervisor process that signals the need for a more accurate solution of the voltage regulation problem if nonoptimal network operating conditions (ex post monitoring) or excessive distances between the actual network state and the neurons' centres (ex ante monitoring) are detected. A more rigorous problem solution, if required, can be obtained by solving the voltage regulation problem by a conventional multiobjective optimization technique. This new solution, in conjunction with the corresponding input vector, is then adopted as a new training sample to adapt the RBFN. This online training process allows the RBFN to (i) adaptively learn the more representative domain space regions of the input/output mapping without needing prior knowledge of a complete and representative training set, and (ii) manage effectively any time-varying phenomena affecting this mapping. The results obtained by simulating the regulation policy in the case of a medium-voltage network are very promising. (author)
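    The authors' RBFN and supervisor are not specified in detail here; the sketch below, with made-up training data, only illustrates the two ingredients the abstract describes: an RBF interpolator from network operating points to regulator set points, and an ex ante check on the distance between a new state and the training points that would trigger a rigorous re-optimisation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical training data: rows are network operating points (loads, DG outputs);
# targets are optimal set points of voltage regulating devices (e.g. OLTC tap, DG reactive power).
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 4))
y_train = np.column_stack([X_train.sum(axis=1) / 4.0, X_train[:, 0] - X_train[:, 1]])

rbfn = RBFInterpolator(X_train, y_train, kernel="gaussian", epsilon=2.0)

def regulate(x_new, max_dist=0.2):
    """Return set points, or None if the state is too far from known points (ex ante monitoring)."""
    dist = np.linalg.norm(X_train - x_new, axis=1).min()
    if dist > max_dist:
        return None  # supervisor should trigger a rigorous multiobjective re-optimisation
    return rbfn(x_new[None, :])[0]

print(regulate(np.array([0.5, 0.4, 0.6, 0.3])))
```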

  5. If ego depletion cannot be studied using identical tasks, it is not ego depletion.

    Science.gov (United States)

    Lange, Florian

    2015-01-01

    The hypothesis that human self-control capacities are fueled by glucose has been challenged on multiple grounds. A recent study by Lange and Eggert adds to this criticism by presenting two powerful but unsuccessful attempts to replicate the effect of sugar drinks on ego depletion. The dual-task paradigms employed in these experiments have been criticized for involving identical self-control tasks, a methodology that has been argued to reduce participants' willingness to exert self-control. The present article addresses this criticism by demonstrating that there is no indication to believe that the study of glucose effects on ego depletion should be restricted to paradigms using dissimilar acts of self-control. Failures to observe such effects in paradigms involving identical tasks pose a serious problem to the proposal that self-control exhaustion might be reversed by rinsing or ingesting glucose. In combination with analyses of statistical credibility, the experiments by Lange and Eggert suggest that the influence of sugar on ego depletion has been systematically overestimated.

  6. Methodology for Developing Evidence-Based Clinical Imaging Guidelines: Joint Recommendations by Korean Society of Radiology and National Evidence-Based Healthcare Collaborating Agency

    Science.gov (United States)

    Choi, Sol Ji; Jeong, Woo Kyoung; Jo, Ae Jeong; Choi, Jin A; Kim, Min-Jeong; Lee, Min; Jung, Seung Eun; Do, Kyung Hyun; Yong, Hwan Seok; Sheen, Seungsoo

    2017-01-01

    This paper is a summary of the methodology, including the protocol, used to develop evidence-based clinical imaging guidelines (CIGs) in Korea, led by the Korean Society of Radiology and the National Evidence-based Healthcare Collaborating Agency. This is the first protocol to reflect the process of developing diagnostic guidelines in Korea. The development protocol is largely divided into the following sections: set-up, process of adaptation, and finalization. The working group is composed of clinical imaging experts, and the developmental committee is composed of multidisciplinary experts to validate the methodology. The Korean CIGs will continue to be developed based on this protocol, and these guidelines will serve as decision-support tools for clinicians as well as help reduce medical radiation exposure. PMID:28096730

  7. Prevalence of cluster headache in the Republic of Georgia: results of a population-based study and methodological considerations

    DEFF Research Database (Denmark)

    Katsarava, Z; Dzagnidze, A; Kukava, M

    2009-01-01

    We present a study of the general-population prevalence of cluster headache in the Republic of Georgia and discuss the advantages and challenges of different methodological approaches. In a community-based survey, specially trained medical residents visited 500 adjacent households in the capital...... with possible cluster headache, who were then personally interviewed by one of two headache-experienced neurologists. Cluster headache was confirmed in one subject. The prevalence of cluster headache was therefore estimated to be 87/100,000 (95% confidence interval

  8. Advancing absolute sustainability assessments of products with a new Planetary Boundaries based life-cycle impact assessment methodology

    DEFF Research Database (Denmark)

    Ryberg, Morten; Owsianiak, Mikolaj; Richardson, Katherine

    an operational life-cycle impact assessment (LCIA) methodology where the definition of the impact categories is based on the control variables as defined in the PB-framework by Steffen et al (2015). This included the development and calculation of characterization factors for the Earth System processes......The Planetary Boundaries (PB)-framework introduced quantitative boundaries for a set of biophysical Earth System processes. The PBs delimit a ‘safe operating space' for humanity to act within to keep Earth in a Holocene-like state (Rockström et al 2009). The concept has gained strong interest from...

  9. Rotational Mixing and Lithium Depletion

    CERN Document Server

    Pinsonneault, M H

    2010-01-01

    I review basic observational features in Population I stars which strongly implicate rotation as a mixing agent; these include dispersion at fixed temperature in coeval populations and main sequence lithium depletion for a range of masses at a rate which decays with time. New developments related to the possible suppression of mixing at late ages, close binary mergers and their lithium signature, and an alternate origin for dispersion in young cool stars tied to radius anomalies observed in active young stars are discussed. I highlight uncertainties in models of Population II lithium depletion and dispersion related to the treatment of angular momentum loss. Finally, the origins of rotation are tied to conditions in the pre-main sequence, and there is thus some evidence that environment and planet formation could impact stellar rotational properties. This may be related to recent observational evidence for cluster to cluster variations in lithium depletion and a connection between the presence of planets and s...

  10. Methodology for qualification of wood-based ash according to REACH - prestudy

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeblom, Rolf (Tekedo AB, Nykoeping (Sweden)); Tivegaard, Anna-Maria (SSAB Merox AB, Oxeloesund (Sweden))

    2010-02-15

    The new European Union framework directive on waste is to be implemented during the year 2010. According to this directive, much of what today is regarded as waste will instead be assessed as by-products and in many cases fall under the new European Union regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). REACH applies in conjunction with the new European Union regulation CLP (Classification, Labelling and Packaging of substances and mixtures). There are introductory periods for both of these regulations, and in the case of CLP this regards the transition from the present and previous rules under the dangerous substances and dangerous preparations directives (DSD and DPD, respectively). Similarly, the new framework directive on waste supersedes the previous directive and some other statements. There is a connection between the directives on waste and the rules for classification and labelling, in that the classification of waste (in the categories hazardous and non-hazardous) builds on (but is not identical to) the rules for labelling. Similarly, the national Swedish rules for acceptance of recycled material (waste) for use in geotechnical constructions relate to the provisions in REACH on assessment of chemical safety, in that both request that the risk be assessed to be small and that the same or similar methodologies can be applied to verify this. There is a 'reference alternative' in REACH that implies substantial testing prior to registration. Registration is the key to use of a substance, whether the substance is used as such, in a mixture, or to be released from an article. However, REACH as well as CLP contain a number of provisions for using literature data, data on similar chemicals etc. in order to avoid unnecessary testing. This especially applies to testing on humans and vertebrate animals. Vaermeforsk, through its Programme on Environmentally Friendly Use of Non-Coal Ashes, has developed methodologies and

  11. A Methodology for Assessing Technology Trade-Offs of Space-Based Radar Concepts.

    Science.gov (United States)

    1985-12-01

    objective reality and intuition as reflected in a decision maker's judgments. Especially in the case of future space systems, decisions must be based...data base based on their perceptions of the technology issues from their own field of expertise. Networking can provide a means for intradisciplinary

  12. Replacements For Ozone-Depleting Foaming Agents

    Science.gov (United States)

    Blevins, Elana; Sharpe, Jon B.

    1995-01-01

    Fluorinated ethers used in place of chlorofluorocarbons and hydrochlorofluorocarbons. Replacement necessary because CFCs and HCFCs found to contribute to depletion of ozone from upper atmosphere, and their manufacture and use are by law to be phased out in near future. Two fluorinated ethers do not have ozone-depletion potential and can be used in existing foam-producing equipment designed to handle liquid blowing agents soluble in the chemical ingredients that are mixed to make foam. Polyurethane-based foams and several cellular plastics blown with these fluorinated ethers are used in processes as diverse as small batch pours, large sprays, or double-band lamination to make insulation for private homes, commercial buildings, shipping containers, and storage tanks. Fluorinated ethers also proved useful as replacements for CFC refrigerants and solvents.

  13. OPTIMIZATION OF POTASSIUM NITRATE BASED SOLID PROPELLANT GRAINS FORMULATION USING RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Oladipupo Olaosebikan Ogunleye

    2015-08-01

    Full Text Available This study was designed to evaluate the effect of propellant formulation and geometry on the internal ballistic performance of solid propellant grains using core, BATES, rod-and-tubular and end-burn geometries. Response Surface Methodology (RSM) was used to analyze and optimize the effect of sucrose, potassium nitrate and carbon on the chamber pressure, temperature, thrust and specific impulse of the solid propellant grains through a Central Composite Design (CCD) of the experiment. An increase in potassium nitrate increased the specific impulse, while an increase in sucrose and carbon decreased the specific impulse. The coefficients of determination (R2) for the models of chamber pressure, temperature, thrust and specific impulse in terms of composition and geometry were 0.9737, 0.9984, 0.9745 and 0.9589, respectively. The optimum specific impulse of 127.89 s, pressure of 462,201 Pa, temperature of 1,618.3 K and thrust of 834.83 N were obtained using 0.584 kg of sucrose, 1.364 kg of potassium nitrate and 0.052 kg of carbon, together with the BATES geometry. There was no significant difference between the calculated and experimental ballistic properties at p < 0.05. The BATES grain geometry is more efficient for minimizing the oscillatory pressure in the combustion chamber.
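    Neither the design matrix nor the fitted coefficients are given in the record; purely as a sketch of the generic RSM step, fitting a full second-order response surface to CCD-style data and locating its optimum could look as follows, with synthetic data standing in for the sucrose/potassium nitrate/carbon measurements:

```python
import numpy as np

# Synthetic stand-in for CCD data: columns are coded factor levels
# (e.g. sucrose, KNO3, carbon); y is the measured response (e.g. specific impulse).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))
y = (120 + 8 * X[:, 1] - 5 * X[:, 0] - 3 * X[:, 2]
     - 4 * (X ** 2).sum(axis=1) + rng.normal(0, 0.5, 20))

def quadratic_terms(X):
    """Full second-order model: intercept, linear, square and two-factor interaction terms."""
    n = X.shape[1]
    cols = [np.ones(len(X))] + [X[:, i] for i in range(n)]
    cols += [X[:, i] ** 2 for i in range(n)]
    cols += [X[:, i] * X[:, j] for i in range(n) for j in range(i + 1, n)]
    return np.column_stack(cols)

A = quadratic_terms(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
ss_res = ((y - A @ beta) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
print("R^2 =", 1 - ss_res / ss_tot)

# Crude optimum search over the coded design space.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
best = grid[np.argmax(quadratic_terms(grid) @ beta)]
print("coded optimum:", best)
```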

  14. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, industrial use as well as use for public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We both visualize the results of the analysis and we perform cluster and outlier analysis using the Anselin local Moran's I statistic as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
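    The record names the Anselin local Moran's I and Getis-Ord Gi* statistics without reproducing them; as a reminder of what the cluster/outlier step computes, here is a bare-bones local Moran's I for a row-standardised weights matrix. In practice a dedicated library such as PySAL would be used and significance would be assessed by permutation; the data below are invented.

```python
import numpy as np

def local_morans_i(x, W):
    """Local Moran's I for each observation.

    x : 1-D array of the variable (e.g. electricity demand per prefecture).
    W : spatial weights matrix, rows assumed to be row-standardised.
    """
    z = x - x.mean()
    m2 = (z ** 2).mean()
    return (z / m2) * (W @ z)

# Tiny illustrative example: 4 regions on a line with binary contiguity weights.
x = np.array([10.0, 12.0, 11.0, 40.0])
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)  # row-standardise
print(local_morans_i(x, W))  # a large negative value flags the last region as a spatial outlier
```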

  15. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    Directory of Open Access Journals (Sweden)

    Gáll Zs. Sz.

    2015-08-01

    Full Text Available The scientific substrate of the study relies upon the concept of mirror neurons. Unlike other neurons, these are characterized by an imitation feature. They play an important role in learning processes – especially during childhood, enabling the imitation of motions and determining their primary acquirement. Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool education, resorting to demonstration procedures as the main teaching-learning method. Thus, we deem that mirror neuron reactivity will be engaged more thoroughly, with a view to enhancing the subject's psychomotor development in terms of body scheme, self-image and performance of basic postures and motions. For the research, an experimental group and a control group were set up and the children's psychomotor development level was assessed both before the application of the independent variable and after its effects upon the experimental group. As soon as the planned procedure was completed, the experimental group members showed a significant evolution in terms of the investigated psychomotor fields as compared to the control group.

  16. Theoretical, methodological and methodical bases of structural policy of territorial subjects of the russian federation

    Directory of Open Access Journals (Sweden)

    Valentina Sergeevna Antonyuk

    2013-03-01

    Full Text Available In this article, the content of various points of view on the category of «structural policy» is revealed. The authors' own view is reflected: structural policy is understood as a subsystem of the social and economic policy of the state, intended to carry out the function of managing the development of branches of the economy together with private business, the distribution of financial resources between sectors, and control over the use of the allocated funds for purposes relevant to a certain historical stage, by means of administrative, normative and financial tools of regulation. The methodological basis of structural policy is defined, its functions are revealed to that end, and the target system, subjects and objects are specified, as are the principles and the classification of the tools of structural policy. In the authors' view, regional branch shifts that promote progressive changes in the branch structure of a region towards the formation of the fifth and sixth technological paradigms, and that increase the diversification of manufacturing by stimulating innovative change, should become the target reference point of structural policy. Monospecialized regions are the most sensitive to tactical and technological fluctuations and the most vulnerable in economic terms. In this connection, a technique for carrying out structural policy in monospecialized subjects of the Russian Federation, taking into account shifts in the branches of their industrial specialization, is offered.

  17. 1H NMR based metabolic profiling in Crohn's disease by random forest methodology.

    Science.gov (United States)

    Fathi, Fariba; Majari-Kasmaee, Laleh; Mani-Varnosfaderani, Ahmad; Kyani, Anahita; Rostami-Nejad, Mohammad; Sohrabzadeh, Kaveh; Naderi, Nosratollah; Zali, Mohammad Reza; Rezaei-Tavirani, Mostafa; Tafazzoli, Mohsen; Arefi-Oskouie, Afsaneh

    2014-07-01

    The present study was designed to search for metabolic biomarkers and their correlation with serum zinc in Crohn's disease patients. Crohn's disease (CD) is a form of inflammatory bowel disease that may affect any part of the gastrointestinal tract and can be difficult to diagnose using the clinical tests. Thus, introduction of a novel diagnostic method would be a major step towards CD treatment. Proton nuclear magnetic resonance spectroscopy ((1)H NMR) was employed for metabolic profiling to find out which metabolites in the serum have meaningful significance in the diagnosis of CD. CD and healthy subjects were correctly classified using random forest methodology. The classification model for the external test set showed a 94% correct classification of CD and healthy subjects. The present study suggests Valine and Isoleucine as differentiating metabolites for CD diagnosis. These metabolites can be used for screening of risky samples at the early stages of CD diagnoses. Moreover, a robust random forest regression model with good prediction outcomes was developed for correlating serum zinc level and metabolite concentrations. The regression model showed the correlation (R(2)) and root mean square error values of 0.83 and 6.44, respectively. This model suggests valuable clues for understanding the mechanism of zinc deficiency in CD patients.
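    The published models cannot be reproduced from the abstract alone, but the two analysis steps it describes — a random forest classifier separating CD from healthy subjects and a random forest regression relating metabolite concentrations to serum zinc — map directly onto standard tooling. A schematic sketch with placeholder data in place of the 1H NMR measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import accuracy_score, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Placeholder data: rows are subjects, columns are binned 1H NMR metabolite intensities.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 30))
y_class = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=120) > 0).astype(int)  # CD vs healthy
y_zinc = 70 + 10 * X[:, 2] + rng.normal(scale=5, size=120)                       # serum zinc level

X_tr, X_te, yc_tr, yc_te, yz_tr, yz_te = train_test_split(X, y_class, y_zinc, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, yc_tr)
print("classification accuracy:", accuracy_score(yc_te, clf.predict(X_te)))

reg = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, yz_tr)
pred = reg.predict(X_te)
print("R2:", r2_score(yz_te, pred), "RMSE:", mean_squared_error(yz_te, pred) ** 0.5)
```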

  18. Vibrational Study and Force Field of the Citric Acid Dimer Based on the SQM Methodology

    Directory of Open Access Journals (Sweden)

    Laura Cecilia Bichara

    2011-01-01

    Full Text Available We have carried out a structural and vibrational theoretical study of the citric acid dimer. The Density Functional Theory (DFT) method with the B3LYP/6-31G∗ and B3LYP/6-311++G∗∗ basis sets has been used to study its structure and vibrational properties. Then, in order to obtain a good assignment of the IR and Raman spectra of the dimer in the solid phase, the best possible fit between the calculated and recorded frequencies was carried out and the force fields were scaled using the Scaled Quantum Mechanical Force Field (SQMFF) methodology. An assignment of the observed spectral features is proposed. A band of medium intensity at 1242 cm−1, together with a group of weak bands previously not assigned to the monomer, was in this case assigned to the dimer. Furthermore, an analysis of the Natural Bond Orbitals (NBOs) and of the topological properties of the electronic charge density, employing Bader's Atoms in Molecules (AIM) theory, was carried out for the dimer to study the charge transfer interactions of the compound.

  19. Towards a cognitive robotics methodology for reward-based decision-making: dynamical systems modelling of the Iowa Gambling Task

    Science.gov (United States)

    Lowe, Robert; Ziemke, Tom

    2010-09-01

    The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.

  20. Technology-Enhanced Problem-Based Learning Methodology in Geographically Dispersed Learners of Tshwane University of Technology

    Directory of Open Access Journals (Sweden)

    Sibitse M. Tlhapane

    2010-03-01

    Full Text Available Improving teaching and learning methodologies is not just a wish but rather a goal that most educational institutions strive for globally. To attain this, the Adelaide Tambo School of Nursing Science implemented a Technology-enhanced Problem-Based Learning methodology in the programme B Tech Occupational Nursing in 2006. This is a two-year post-basic nursing programme. The students are geographically dispersed and the curriculum design is the typical student-centred outcomes-based education. The research question posed by this paper is: How does technology-enhanced problem-based learning enhance student-centred learning, thinking skills, social skills and social space for learners? To answer the above question, a case study with both qualitative and quantitative data was utilised. The participants consisted of all students registered for the subject Occupational Health level 4. The sample group was chosen from willing participants from the Pretoria, eMalahleni and Polokwane learning sites, using the snowball method. This method was seen as appropriate due to the timing of the study. Data were collected using a questionnaire with both open and closed-ended questions. An analysis of the students' end-of-year examination results was also done, comparing the performance of students on technology-enhanced problem-based learning with that of students on problem-based learning only. The findings revealed that with Technology-enhanced Problem-Based Learning (PBL), students' critical thinking, problem solving and social skills improved and that social space was enhanced. This was supported by improved grades of students on technology-enhanced PBL as compared to those on PBL only.

  1. Improved base-collector depletion charge and capacitance model for SiGe HBT on thin-film SOI

    Institute of Scientific and Technical Information of China (English)

    徐小波; 张鹤鸣; 胡辉勇

    2011-01-01

    The SiGe heterojunction bipolar transistor (HBT) on thin-film SOI is successfully integrated with SOI CMOS by means of a "folded collector". This paper deals with the collector depletion charge and capacitance of this structure. Based on the actual operating conditions of the device and on our previous work, the depletion charge and capacitance models are extended and optimized. The results show that the charge model is smoother, and that the capacitance model, which accounts for the different current flow areas, consists of the vertical and lateral depletion capacitances in series. Compared with a bulk device, the SOI device operating in the fully depleted mode therefore exhibits a smaller collector depletion capacitance and hence a larger forward Early voltage. At the bias point where operation changes from the vertical to the lateral mode, the trend of the depletion charge and capacitance changes, and both vary with increasing reverse collector-base bias. The establishment and extension of this collector depletion charge and capacitance model for the vertical SiGe HBT on thin-film SOI provides a valuable reference for the design and simulation of core bipolar device parameters, such as the Early voltage and transit frequency, in the latest 0.13 μm millimeter-wave SOI BiCMOS technology.
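    The analytic model itself is given only in the paper, but its stated structure — a vertical and a lateral depletion capacitance acting in series, each scaled by its own current-flow area and following the usual junction-bias dependence — can be sketched as below. The parameter values and the abrupt-junction exponent are placeholders, not the paper's fitted values.

```python
def junction_cap(c0_per_area, area, v_r, v_bi=0.7, m=0.5):
    """Depletion capacitance C = A*C0 / (1 + V_R/V_bi)^m (placeholder abrupt-junction form)."""
    return area * c0_per_area / (1.0 + v_r / v_bi) ** m

def cbc_soi(v_r, area_vertical=1.0e-12, area_lateral=0.3e-12, c0=1.0e-3):
    """Series combination of vertical and lateral base-collector depletion capacitances."""
    cv = junction_cap(c0, area_vertical, v_r)
    ch = junction_cap(c0, area_lateral, v_r)
    return cv * ch / (cv + ch)  # series: always smaller than either component alone

for v in (0.0, 1.0, 3.0):
    print(f"V_CB = {v:.1f} V -> C_BC ~ {cbc_soi(v):.3e} F")
```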

  2. Random forest methodology for model-based recursive partitioning: the mobForest package for R

    OpenAIRE

    Garge, Nikhil R; Bobashev, Georgiy; Eggleston, Barry

    2013-01-01

    Background Recursive partitioning is a non-parametric modeling technique, widely used in regression and classification problems. Model-based recursive partitioning is used to identify groups of observations with similar values of parameters of the model of interest. The mob() function in the party package in R implements model-based recursive partitioning method. This method produces predictions based on single tree models. Predictions obtained through single tree models are very sensitive to...

  3. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  4. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  5. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as a waterfall or Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  6. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
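    The along-fault averaging is specific to the paper; purely to illustrate the elementary renewal-model step behind it — the conditional probability of a rupture in the next ΔT years given the time already elapsed since the last event — here is a sketch using a lognormal recurrence distribution. The fault parameters are hypothetical, and the paper's magnitude-dependent aperiodicity is not modelled.

```python
from math import exp, log, sqrt
from scipy.stats import lognorm

def conditional_prob(t_elapsed, dt, mean_ri, aperiodicity):
    """P(rupture in (t, t+dt] | no rupture by t) for a lognormal renewal model.

    mean_ri      : mean recurrence interval (years)
    aperiodicity : coefficient of variation of the recurrence interval
    """
    sigma = sqrt(log(1.0 + aperiodicity ** 2))   # lognormal shape parameter from the CoV
    mu = log(mean_ri) - 0.5 * sigma ** 2         # chosen so that the mean equals mean_ri
    dist = lognorm(s=sigma, scale=exp(mu))
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

# Hypothetical fault: 200-year mean recurrence, aperiodicity 0.5, 150 years since the last rupture.
print(conditional_prob(t_elapsed=150.0, dt=30.0, mean_ri=200.0, aperiodicity=0.5))
```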

  7. A stale challenge to the philosophy of science: commentary on "Is psychology based on a methodological error?" by Michael Schwarz.

    Science.gov (United States)

    Ruck, Nora; Slunecko, Thomas

    2010-06-01

    In his article "Is psychology based on a methodological error?", and on a quite convincing empirical basis, Michael Schwarz offers a methodological critique of one of mainstream psychology's key test-theoretical axioms, i.e., that of the in-principle normal distribution of personality variables. It is characteristic of this paper--and at first seems to be a strength of it--that the author positions his critique within a frame of philosophy of science, particularly placing himself in the tradition of Karl Popper's critical rationalism. When scrutinizing Schwarz's arguments, however, we find Schwarz's critique profound only as an immanent critique of test-theoretical axioms. We raise doubts, however, as to Schwarz's alleged 'challenge' to the philosophy of science, because the author seems not at all to be in touch with the state of the art of contemporary philosophy of science. Above all, we question the universalist undercurrent that Schwarz's 'bio-psycho-social model' of human judgment boils down to. In contrast to such a position, we close our commentary with a plea for a context- and culture-sensitive philosophy of science.

  8. Response surface methodology based optimization of β-glucosidase production from Pichia pastoris.

    Science.gov (United States)

    Batra, Jyoti; Beri, Dhananjay; Mishra, Saroj

    2014-01-01

    The thermotolerant yeast Pichia etchellsii produces multiple cell bound β-glucosidases that can be used for synthesis of important alkyl- and aryl-glucosides. Present work focuses on enhancement of β-glucosidase I (BGLI) production in Pichia pastoris. In the first step, one-factor-at-a-time experimentation was used to investigate the effect of aeration, antifoam addition, casamino acid addition, medium pH, methanol concentration, and mixed feed components on BGLI production. Among these, initial medium pH, methanol concentration, and mixed feed in the induction phase were found to affect BGLI production. A 3.3-fold improvement in β-glucosidase expression was obtained at pH 7.5 as compared to pH 6.0 on induction with 1 % methanol. Addition of sorbitol, a non-repressing substrate, led to further enhancement in β-glucosidase production by 1.4-fold at pH 7.5. These factors were optimized with response surface methodology using Box-Behnken design. Empirical model obtained was used to define the optimum "operating space" for fermentation which was a pH of 7.5, methanol concentration of 1.29 %, and sorbitol concentration of 1.28 %. Interaction of pH and sorbitol had maximum effect leading to the production of 4,400 IU/L. The conditions were validated in a 3-L bioreactor with accumulation of 88 g/L biomass and 2,560 IU/L β-glucosidase activity.
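    The record reports the three factors and their optimum levels but not the design matrix; as an illustration of the experimental-design step only, a three-factor Box-Behnken design (12 edge mid-points plus centre replicates, in coded units) can be written out directly. The mapping of coded levels to actual pH, methanol and sorbitol settings below uses assumed example ranges, not the authors' values.

```python
import itertools
import numpy as np

def box_behnken_3(center_runs=3):
    """Three-factor Box-Behnken design in coded units (-1, 0, +1)."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):      # each pair of factors
        for a, b in itertools.product((-1, 1), repeat=2):  # full 2^2 on that pair
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * center_runs                      # centre points
    return np.array(runs)

design = box_behnken_3()
# Assumed example mapping of coded levels to actual factor settings (pH, methanol %, sorbitol %).
lows = np.array([6.5, 0.5, 0.5])
highs = np.array([8.5, 1.5, 1.5])
actual = (lows + highs) / 2 + design * (highs - lows) / 2
print(design.shape)   # (15, 3): 12 edge runs + 3 centre runs
print(actual[:3])
```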

  9. An accelerometry-based methodology for assessment of real-world bilateral upper extremity activity.

    Directory of Open Access Journals (Sweden)

    Ryan R Bailey

    Full Text Available The use of both upper extremities (UE) is necessary for the completion of many everyday tasks. Few clinical assessments measure the abilities of the UEs to work together; rather, they assess unilateral function and compare it between the affected and unaffected UEs. Furthermore, clinical assessments are unable to measure function that occurs in the real world, outside the clinic. This study examines the validity of an innovative approach to assess real-world bilateral UE activity using accelerometry. Seventy-four neurologically intact adults completed ten tasks (donning/doffing shoes, grooming, stacking boxes, cutting playdough, folding towels, writing, unilateral sorting, bilateral sorting, unilateral typing, and bilateral typing) while wearing accelerometers on both wrists. Two variables, the Bilateral Magnitude and the Magnitude Ratio, were derived from the accelerometry data to distinguish between high- and low-intensity tasks, and between bilateral and unilateral tasks. Estimated energy expenditure and time spent in simultaneous UE activity for each task were also calculated. The Bilateral Magnitude distinguished between high- and low-intensity tasks, and the Magnitude Ratio distinguished between unilateral and bilateral UE tasks. The Bilateral Magnitude was strongly correlated with estimated energy expenditure (ρ = 0.74, p<0.02), and the Magnitude Ratio was strongly correlated with time spent in simultaneous UE activity (ρ = 0.93, p<0.01) across tasks. These results demonstrate the face validity and construct validity of this methodology to quantify bilateral UE activity during the performance of everyday tasks in a laboratory setting, and it can now be used to assess bilateral UE activity in real-world environments.
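    The exact definitions of the two variables are given in the paper itself; the sketch below assumes the commonly used formulation from related work (Bilateral Magnitude as the sum of the two wrists' acceleration magnitudes per epoch, Magnitude Ratio as the natural log of the non-dominant/dominant ratio) and should be read as an assumption rather than the authors' code.

```python
import numpy as np

def bilateral_metrics(acc_dom, acc_nondom, eps=1e-6):
    """Per-epoch Bilateral Magnitude and Magnitude Ratio from two wrist-worn accelerometers.

    acc_dom, acc_nondom : arrays of activity-count (or vector-magnitude) values per epoch.
    Assumed definitions: BM = sum of the two magnitudes; MR = ln(non-dominant / dominant).
    """
    acc_dom = np.asarray(acc_dom, dtype=float)
    acc_nondom = np.asarray(acc_nondom, dtype=float)
    bilateral_magnitude = acc_dom + acc_nondom
    magnitude_ratio = np.log((acc_nondom + eps) / (acc_dom + eps))
    return bilateral_magnitude, magnitude_ratio

# Toy epochs: two bilateral epochs (similar use of both arms) and one unilateral epoch.
bm, mr = bilateral_metrics([120, 130, 10], [115, 128, 90])
print(bm)  # higher values -> higher-intensity activity
print(mr)  # values near 0 -> bilateral use; strongly positive/negative -> unilateral use
```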

  10. Brain-Based Learning and Classroom Practice: A Study Investigating Instructional Methodologies of Urban School Teachers

    Science.gov (United States)

    Morris, Lajuana Trezette

    2010-01-01

    The purpose of this study was to examine the implementation of brain-based instructional strategies by teachers serving at Title I elementary, middle, and high schools within the Memphis City School District. This study was designed to determine: (a) the extent to which Title I teachers applied brain-based strategies, (b) the differences in…

  11. Global depletion of groundwater resources

    NARCIS (Netherlands)

    Wada, Y.; Beek, L.P.H. van; van Kempen, C.M.; Reckman, J.W.T.M.; Vasak, S.; Bierkens, M.F.P.

    2010-01-01

    In regions with frequent water stress and large aquifer systems groundwater is often used as an additional water source. If groundwater abstraction exceeds the natural groundwater recharge for extensive areas and long times, overexploitation or persistent groundwater depletion occurs. Here we provid

  12. Assessment of RANS Based CFD Methodology using JAEA Experiment with a Wire-wrapped 127-pin Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J. H.; Yoo, J.; Lee, K. L.; Ha, K. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, we assess the RANS-based CFD methodology with JAEA experimental data. The JAEA experimental study with the 127-pin wire-wrapped fuel assembly was carried out using water for validating the pressure drop formulas in the ASFRE code. Complicated and vortical flow phenomena in the wire-wrapped fuel bundles were captured by a vortex structure identification technique based on the critical point theory. The SFR system is one of the nuclear reactors in which recycling of transuranics (TRUs) by reusing spent nuclear fuel sustains the fission chain reaction. This situation strongly motivated the Korea Atomic Energy Research Institute (KAERI) to start a prototype Gen-4 Sodium-cooled Fast Reactor (PGSFR) design project under the national nuclear R and D program. Generally, the SFR system has a tightly packed fuel bundle and a high power density. Sodium has a higher thermal conductivity and boiling temperature than water, which allows the core design to be more compact than that of a Light Water Reactor (LWR) through narrower sub-channels. The fuel assembly of the SFR system consists of long, thin wire-wrapped fuel bundles and a hexagonal duct, in which the wire-wrapped fuel bundles are arranged in a loose triangular array. The main purpose of a wire spacer is to avoid collisions between adjacent rods. Furthermore, a wire spacer can mitigate vortex-induced vibration and enhance convective heat transfer due to the secondary flow induced by the helical wire spacers. Most numerical studies in the nuclear field have been conducted with simplified sub-channel analysis codes such as COBRA (Rowe), SABRE (Macdougall and Lillington), ASFRE (Ninokata), and MATRA-LMR (Kim et al.). The relationship between complex flow phenomena and helically wrapped wire spacers will be discussed. The RANS-based CFD methodology is evaluated with JAEA experimental data of the 127-pin wire-wrapped fuel assembly. Complicated and vortical flow phenomena in the wire-wrapped fuel

  13. Heterogeneous reactions important in atmospheric ozone depletion: a theoretical perspective.

    Science.gov (United States)

    Bianco, Roberto; Hynes, James T

    2006-02-01

    Theoretical studies of the mechanisms of several heterogeneous reactions involving ClONO(2), H(2)O, HCl, HBr, and H(2)SO(4) important in atmospheric ozone depletion are described, focused primarily on reactions on aqueous aerosol surfaces. Among the insights obtained is the active chemical participation of the surface water molecules in several of these reactions. The general methodology adopted allows reduction of these complex chemical problems to meaningful model systems amenable to quantum chemical calculations.

  14. Project Management Methodology for the Development of M-Learning Web Based Applications

    Directory of Open Access Journals (Sweden)

    Adrian VISOIU

    2010-01-01

    Full Text Available M-learning web based applications are a particular case of web applications designed to be operated from mobile devices. Also, their purpose is to implement learning aspects. Project management of such applications takes into account the identified peculiarities. M-learning web based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influences over the analysis, design, construction and testing phases. Activities building up a work breakdown structure for development of m-learning web based applications are presented. Project monitoring and control techniques are proposed. Resources required for projects are discussed.

  15. A New Genetic Algorithm Methodology for Design Optimization of Truss Structures: Bipopulation-Based Genetic Algorithm with Enhanced Interval Search

    Directory of Open Access Journals (Sweden)

    Tugrul Talaslioglu

    2009-01-01

    Full Text Available A new genetic algorithm (GA) methodology, Bipopulation-Based Genetic Algorithm with Enhanced Interval Search (BGAwEIS), is introduced and used to optimize the design of truss structures with various complexities. The results of BGAwEIS are compared with those obtained by the sequential genetic algorithm (SGA) utilizing a single population, by a multipopulation-based genetic algorithm (MPGA) proposed for this study, and by other existing approaches presented in the literature. This study has two goals: outlining BGAwEIS's fundamentals and evaluating the performance of BGAwEIS and MPGA. Consequently, it is demonstrated that MPGA shows a better performance than SGA by taking advantage of multiple populations, but BGAwEIS explores promising solution regions more efficiently than MPGA by exploiting the feasible solutions. The performance of BGAwEIS is confirmed by the better quality of its optimal designs compared to the algorithms proposed here and those described in the literature.

  16. Producing K indices by the interactive method based on the traditional hand-scaling methodology - preliminary results

    Science.gov (United States)

    Valach, Fridrich; Váczyová, Magdaléna; Revallo, Miloš

    2016-01-01

    This paper reports on an interactive computer method for producing K indices. The method is based on the traditional hand-scaling methodology that had been practised at the Hurbanovo Geomagnetic Observatory till the end of 1997. Here, the performance of the method was tested on data from the Kakioka Magnetic Observatory. We have found that in some ranges of the K-index values our method might be a beneficial supplement to the computer-based methods approved and endorsed by IAGA. This result was achieved for both very low (K=0) and high (K ≥ 5) levels of geomagnetic activity. The method incorporates an interactive procedure of selecting quiet days by a human operator (observer). This introduces a certain amount of subjectivity, similarly to the traditional hand-scaling method.

  17. Evolutionary programming-based methodology for economical output power from PEM fuel cell for micro-grid application

    Science.gov (United States)

    El-Sharkh, M. Y.; Rahman, A.; Alam, M. S.

    This paper presents a methodology for finding the optimal output power from a PEM fuel cell power plant (FCPP). The FCPP is used to supply power to a small micro-grid community. The technique used is based on evolutionary programming (EP) to find a near-optimal solution of the problem. The method incorporates the Hill-Climbing technique (HCT) to maintain feasibility during the solution process. An economic model of the FCPP is used. The model considers the production cost of energy and the possibility of selling and buying electrical energy from the local grid. In addition, the model takes into account the thermal energy output from the FCPP and the thermal energy requirement for the micro-grid community. The results obtained are compared against a solution based on genetic algorithms. Results are encouraging and indicate viability of the proposed technique.

  18. Mitochondrial DNA depletion analysis by pseudogene ratioing.

    Science.gov (United States)

    Swerdlow, Russell H; Redpath, Gerard T; Binder, Daniel R; Davis, John N; VandenBerg, Scott R

    2006-01-30

    The mitochondrial DNA (mtDNA) depletion status of rho(0) cell lines is typically assessed by hybridization or polymerase chain reaction (PCR) experiments, in which the failure to hybridize mtDNA or to amplify mtDNA using mtDNA-directed primers suggests thorough mitochondrial genome removal. Here, we report the use of an mtDNA pseudogene ratioing technique for additional confirmation of rho(0) status. Total genomic DNA from a U251 human glioma cell line treated with ethidium bromide was amplified using primers designed to anneal either mtDNA or a previously described nuclear DNA-embedded mtDNA pseudogene (mtDNApsi). The resultant PCR product was used to generate plasmid clones. Sixty-two plasmid clones were genotyped, and all arose from the mtDNApsi template. These data allowed us to determine with 95% confidence that the resultant mtDNA-depleted cell line contains less than one copy of mtDNA per 10 cells. Unlike previous hybridization or PCR-based analyses of mtDNA depletion, this mtDNApsi ratioing technique does not rely on interpretation of a negative result, and may prove useful as an adjunct for the determination of rho(0) status or mtDNA copy number.
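    The 95% confidence statement can be illustrated with a one-line binomial argument: if a fraction p of amplifiable templates were true mtDNA, the chance that all 62 genotyped clones derive from the pseudogene is (1 − p)^62, so the upper bound on p is the value at which this probability drops to 0.05. The per-cell copy-number conversion in the sketch below is a simplifying assumption for illustration, not necessarily the authors' exact calculation.

```python
n_clones = 62
alpha = 0.05

# Largest mtDNA template fraction p still consistent (at the 95% level) with
# observing zero mtDNA-derived clones out of n: solve (1 - p)^n = alpha.
p_upper = 1.0 - alpha ** (1.0 / n_clones)
print(f"95% upper bound on mtDNA template fraction: {p_upper:.3f}")

# Illustrative conversion to copies per cell, assuming (hypothetically) that the
# nuclear pseudogene contributes ~2 amplifiable templates per diploid cell.
copies_per_cell_upper = p_upper / (1.0 - p_upper) * 2
print(f"~{copies_per_cell_upper:.2f} mtDNA copies per cell, "
      f"i.e. < 1 copy per ~{1 / copies_per_cell_upper:.0f} cells")
```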

  19. Ecological modernization of socio-economic development of the region in the context of social transformations: theoretical and methodological bases

    Directory of Open Access Journals (Sweden)

    O.V. Shkarupa

    2015-09-01

    Full Text Available The aim of the article. The aim of the article is to study the theoretical and methodological bases of the ecological modernization of the socio-economic development of a region. The results of the analysis. This paper studies the scientific basis of transformation processes for sustainable development, which is important at this time. Considering the history of the sustainable development concept, the author emphasizes that if the basic guidelines for upgrading social and economic systems towards a «green» economy are followed, the results of ecological modernization can be expected. It will save funds by forestalling economic damage from pollution and by avoiding the environmental costs of compensating for the «rehabilitation» of resources and territories. Moreover, preventing anthropogenic pressure increases the chances of social systems to improve the quality of life and the health of the nation. From an economic point of view, clean production is more competitive. This study considers the theoretical and methodological bases of the ecological modernization of socio-economic systems in a region, which involves setting special reference points of development. Ecological modernization is a prerequisite for an ecological transformation based on high-quality eco-oriented reforms in social and economic systems. An ecologically safe transformation of socio-economic development means certain progressive changes (intrasystem and intersystem, synergistic and transformational) that are strategic in view of eco-focused goal-setting. Such an understanding provides for: (1) an understanding of transformation as a process which is already identified in the environmental trend of the socio-economic system; (2) the spatial certainty of eco-oriented reforms in connection with the certainty of the qualities of the future development of the system. Arguably, it can and should lead to structural changes in innovation for sustainable development. Conclusions and directions of further research. It is shown that

  20. The assessment of emotional intelligence: a comparison of performance-based and self-report methodologies.

    Science.gov (United States)

    Goldenberg, Irina; Matheson, Kimberly; Mantler, Janet

    2006-02-01

    We assessed the patterns of convergent validity for the Mayer-Salovey-Caruso Emotional Intelligence Test (Mayer, Salovey, & Caruso, 2002), a performance-based measure of emotional intelligence (EI) that entails presenting problems thought to have correct responses, and a self-report measure of EI (Schutte et al., 1998). The relations between EI and demographic characteristics of a diverse community sample (N = 223) concurred with previous research. However, the performance-based and self-report scales were not related to one another. Only self-reported EI scores showed a consistent pattern of relations with self-reported coping styles and depressive affect, whereas the performance-based measure demonstrated stronger relations with age, education, and receiving psychotherapy. We discuss implications for the validity of these measures and their utility.

  1. Novel 3-D Object Recognition Methodology Employing a Curvature-Based Histogram

    Directory of Open Access Journals (Sweden)

    Liang-Chia Chen

    2013-07-01

    Full Text Available In this paper, a new object recognition algorithm employing a curvature-based histogram is presented. Recognition of three-dimensional (3-D) objects using range images remains one of the most challenging problems in 3-D computer vision due to noisy and cluttered scene characteristics. The key breakthroughs for this problem mainly lie in defining unique features that distinguish among various 3-D objects. In our approach, an object detection scheme is developed to identify targets through an automated search in the range images, using an initial process of object segmentation to subdivide all possible objects in the scene and then applying a process of object recognition based on geometric constraints and a curvature-based histogram. The developed method has been verified through experimental tests confirming its feasibility.
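    The paper's segmentation and geometric constraints are not reproduced here; the sketch below only illustrates the histogram-descriptor idea: estimating a per-point curvature measure from local neighbourhoods of a point cloud (via the PCA "surface variation" proxy) and binning it into a fixed-length feature vector that can be compared across objects. The neighbourhood size, bin edges and test shapes are arbitrary choices.

```python
import numpy as np

def curvature_histogram(points, k=16, bins=16):
    """Histogram of a per-point curvature proxy for a point cloud of shape (N, 3).

    The proxy is the PCA 'surface variation' lambda_min / (lambda_1+lambda_2+lambda_3)
    of each point's k nearest neighbours; flat patches give values near 0.
    """
    points = np.asarray(points, dtype=float)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)   # brute-force kNN
    curv = np.empty(len(points))
    for i, row in enumerate(d2):
        nbrs = points[np.argsort(row)[:k]]
        eig = np.sort(np.linalg.eigvalsh(np.cov(nbrs.T)))
        curv[i] = eig[0] / max(eig.sum(), 1e-12)
    hist, _ = np.histogram(curv, bins=bins, range=(0.0, 1.0 / 3.0))
    return hist / max(hist.sum(), 1)                                 # normalised descriptor

# Toy comparison: a noisy plane and a noisy sphere yield clearly different histograms.
rng = np.random.default_rng(0)
plane = np.column_stack([rng.uniform(-1, 1, (400, 2)), rng.normal(0, 0.01, 400)])
sphere = rng.normal(size=(400, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
print(curvature_histogram(plane)[:4], curvature_histogram(sphere)[:4])
```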

  2. Quantification of rainfall prediction uncertainties using a cross-validation based technique. Methodology description and experimental validation.

    Science.gov (United States)

    Fraga, Ignacio; Cea, Luis; Puertas, Jerónimo; Salsón, Santiago; Petazzi, Alberto

    2016-04-01

    In this paper we present a new methodology to compute rainfall fields, including the quantification of prediction uncertainties, using raingauge network data. The proposed methodology comprises two steps. Firstly, the ordinary kriging technique is used to determine the estimated rainfall depth at every point of the study area. Then multiple equi-probable error fields, which comprise both interpolation and measuring uncertainties, are added to the kriged field, resulting in multiple rainfall predictions. To compute these error fields, the standard deviation of the kriging estimate is first determined following the cross-validation based procedure described in Delrieu et al. (2014). Then, the standard deviation field is sampled using non-conditioned Gaussian random fields. The proposed methodology was applied to study 7 rain events in a 60x60 km area of the west coast of Galicia, in the Northwest of Spain. Due to its location at the junction between tropical and polar regions, the study area suffers from frequent intense rainfalls characterized by a great variability in terms of both space and time. Rainfall data from the tipping bucket raingauge network operated by MeteoGalicia were used to estimate the rainfall fields using the proposed methodology. The obtained predictions were then validated using rainfall data from 3 additional rain gauges installed within the CAPRI project (Probabilistic flood prediction with high resolution hydrologic models from radar rainfall estimates, funded by the Spanish Ministry of Economy and Competitiveness, Reference CGL2013-46245-R). Results show that both the mean hyetographs and the peak intensities are correctly predicted. The computed hyetographs present a good fit to the experimental data and most of the measured values fall within the 95% confidence intervals. Also, most of the experimental values outside the confidence bounds correspond to time periods of low rainfall depths, where the inaccuracy of the measuring devices
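    The cross-validation procedure of Delrieu et al. (2014) is not reproduced in the record; the fragment below only sketches the second step the abstract describes — turning a kriged field and a standard-deviation field into an ensemble of equi-probable rainfall realisations by adding unconditioned Gaussian error fields. A properly correlated random field would normally be used; here the spatial correlation is a placeholder smoothing of white noise, and both input fields are invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rainfall_ensemble(kriged, krig_std, n_members=50, corr_cells=5.0, seed=0):
    """Generate equi-probable rainfall fields: kriged estimate + Gaussian error fields.

    kriged     : 2-D array, ordinary-kriging rainfall estimate.
    krig_std   : 2-D array, standard deviation of the kriging estimate (cross-validation based).
    corr_cells : placeholder spatial correlation length of the error, in grid cells.
    """
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        noise = gaussian_filter(rng.standard_normal(kriged.shape), corr_cells)
        noise /= noise.std()                                       # restore unit variance after smoothing
        members.append(np.clip(kriged + krig_std * noise, 0.0, None))  # no negative rainfall
    return np.stack(members)

# Toy 40x40 grid with made-up estimate and uncertainty fields.
x, y = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
ens = rainfall_ensemble(kriged=20 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.1),
                        krig_std=2.0 + 3.0 * x)
print(ens.shape, ens.mean(), ens.std(axis=0).mean())  # ensemble spread reflects the uncertainty
```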

  3. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    be beneficial. In this paper, the authors introduce a new decision making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for vacuum infusion process available in today’s market. An illustrative example—resin selection......The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable...

  4. DEVELOPMENT OF THE CONTROL METHODOLOGY OF THE GIANT MAGNETOSTRICTIVE ACTUATOR BASED ON MAGNETIC FLUX DENSITY

    Institute of Scientific and Technical Information of China (English)

    Jia Zhenyuan; Yang Xing; Shi Chun; Guo Dongming

    2003-01-01

    According to the principle of the magnetostriction generating mechanism, a control model of giant magnetostrictive material based on the magnetic field and a control method using magnetic flux density are developed. Furthermore, this control method is used to develop a giant magnetostrictive micro-displacement actuator (GMA) and its driving system. Two control methods, whose control variables are current intensity and magnetic flux density respectively, are compared with each other in experimental studies. Finally, effective methods for improving the linearity and control precision of the micro-displacement actuator and for reducing hysteresis, based on controlling the magnetic flux density, are obtained.

  5. An Event-Based Methodology to Generate Class Diagrams and its Empirical Evaluation

    Directory of Open Access Journals (Sweden)

    Sandeep K. Singh

    2010-01-01

    Full Text Available Problem statement: Event-based systems have importance in many application domains ranging from real time monitoring systems in production, logistics, medical devices and networking to complex event processing in finance and security. The increasing popularity of event-based systems has opened new challenging issues for them. One such issue is to carry out requirements analysis of event-based systems and build conceptual models. Currently, Object Oriented Analysis (OOA) using the Unified Modeling Language (UML) is the most popular requirement analysis approach, for which several OOA tools and techniques have been proposed. But none of the techniques and tools, to the best of our knowledge, have focused on event-based requirements analysis; rather, all are behavior-based approaches. Approach: This study described a requirement analysis approach specifically for event-based systems. The proposed approach starts from events occurring in the system and derives an importable class diagram specification in XML Metadata Interchange (XMI) format for the ArgoUML tool. Requirements of the problem domain are captured as events in restricted natural language using the proposed Event Templates in order to reduce ambiguity. Results: Rules were designed to extract a domain model specification (analysis-level class diagram) from Event Templates. A prototype tool 'EV-ClassGEN' is also developed to provide automation support to extract events from requirements, document the extracted events in Event Templates and implement rules to derive a specification for an analysis-level class diagram. The proposed approach is also validated through a controlled experiment by applying it on many cases from different application domains like real time systems, business applications, gaming. Conclusion: Results of the controlled experiment have shown that after studying and applying the Event-based approach, students' perception about ease of use and usefulness of the OOA technique has

  6. An optimum design on rollers containing the groove with changeable inner diameter based on response surface methodology

    Directory of Open Access Journals (Sweden)

    Xi Zhao

    2016-05-01

    Full Text Available In order to realize precision plastic forming of a revolving body component with changeable wall thickness, a kind of roller containing grooves with changeable inner diameter is put forward as the forming mould for the rolling-extrusion technology. Specifically, first, the arc length of the groove in the roller is designed according to the predicted forward slip value during forming, so as to accurately control the actual length of the forming segments; then, to obtain better roller structure parameters, a second-order response surface model combining finite element numerical simulation and response surface methodology is put forward, taking forming uniformity as the evaluation index. The result of the experiment shows that, for the formed component, not only does the size meet the requirements but each mechanical property index is also greatly improved, which verifies the rationality of the forward slip model and of the structural parameters obtained from the optimum model based on the response surface methodology.
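
    As a rough illustration of the second step, the sketch below fits a full second-order response surface to simulation results and searches it for an optimum. It is a minimal Python sketch under assumed names: X holds two hypothetical roller structure parameters per simulation run, y is the corresponding forming-uniformity index, and the bounds are placeholders rather than the paper's values.

# Minimal sketch: fit a second-order response surface to (parameter, response)
# pairs from finite-element runs and locate the optimal parameter setting.
import numpy as np
from scipy.optimize import minimize

def quadratic_design(X):
    x1, x2 = X[:, 0], X[:, 1]
    # Full second-order model: 1, x1, x2, x1*x2, x1^2, x2^2
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def fit_response_surface(X, y):
    beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
    return beta

def optimal_setting(beta, bounds):
    # Minimize the predicted non-uniformity index over the design bounds.
    predict = lambda x: (quadratic_design(np.atleast_2d(x)) @ beta).item()
    x0 = np.array([np.mean(b) for b in bounds])
    res = minimize(predict, x0, bounds=bounds)
    return res.x, res.fun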

  7. Methodology for sodium fire vulnerability assessment of sodium cooled fast reactor based on the Monte-Carlo principle

    Energy Technology Data Exchange (ETDEWEB)

    Song, Wei [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China); Wu, Yuanyu [ITER Organization, Route de Vinon-sur-Verdon, 13115 Saint-Paul-lès-Durance (France); Hu, Wenjun [China Institute of Atomic Energy, P. O. Box 275(34), Beijing (China); Zuo, Jiaxu, E-mail: zuojiaxu@chinansc.cn [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China)

    2015-11-15

    Highlights: • The Monte-Carlo principle coupled with a fire dynamics code is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors and is significant for nuclear safety. In this paper, a method of sodium fire vulnerability assessment based on the Monte-Carlo principle is introduced, which can be used to calculate the probabilities of every failure mode in sodium fire scenarios. After that, the sodium fire scenario vulnerability assessment of the primary cold trap room of the China Experimental Fast Reactor was performed to illustrate the feasibility of the methodology. The calculation result of the example shows that the conditional failure probability of the key cable is 23.6% in the sodium fire scenario caused by continuous sodium leakage due to isolation device failure, but the wall temperature, the room pressure and the aerosol discharge mass are all lower than the safety limits.
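
    The structure of such a Monte-Carlo vulnerability calculation can be sketched in a few lines of Python. This is purely illustrative: cable_peak_temperature is a stand-in for the coupled fire dynamics code, and the sampling distributions and the 450 °C damage limit are assumed values, not those of the paper.

# Minimal sketch: conditional failure probability estimated by Monte-Carlo
# sampling of scenario parameters fed to a surrogate consequence model.
import numpy as np

def cable_peak_temperature(leak_rate_kg_s, burning_area_m2, vent_flow_m3_s):
    # Placeholder for the fire dynamics code (e.g. a zone-model run).
    return 200.0 + 900.0 * leak_rate_kg_s * burning_area_m2 / (1.0 + vent_flow_m3_s)

def failure_probability(n_trials=10_000, temp_limit_c=450.0, seed=1):
    rng = np.random.default_rng(seed)
    failures = 0
    for _ in range(n_trials):
        leak = rng.lognormal(mean=-2.0, sigma=0.5)   # kg/s, assumed distribution
        area = rng.uniform(0.5, 3.0)                 # m^2, assumed distribution
        vent = max(rng.normal(1.0, 0.2), 0.0)        # m^3/s, assumed distribution
        if cable_peak_temperature(leak, area, vent) > temp_limit_c:
            failures += 1
    return failures / n_trials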

  8. Associative Interactions in Crowded Solutions of Biopolymers Counteract Depletion Effects.

    Science.gov (United States)

    Groen, Joost; Foschepoth, David; te Brinke, Esra; Boersma, Arnold J; Imamura, Hiromi; Rivas, Germán; Heus, Hans A; Huck, Wilhelm T S

    2015-10-14

    The cytosol of Escherichia coli is an extremely crowded environment, containing high concentrations of biopolymers which occupy 20-30% of the available volume. Such conditions are expected to yield depletion forces, which strongly promote macromolecular complexation. However, crowded macromolecule solutions, like the cytosol, are very prone to nonspecific associative interactions that can potentially counteract depletion. It remains unclear how the cytosol balances these opposing interactions. We used a FRET-based probe to systematically study depletion in vitro in different crowded environments, including a cytosolic mimic, E. coli lysate. We also studied bundle formation of FtsZ protofilaments under identical crowded conditions as a probe for depletion interactions at much larger overlap volumes of the probe molecule. The FRET probe showed a more compact conformation in synthetic crowding agents, suggesting strong depletion interactions. However, depletion was completely negated in cell lysate and other protein crowding agents, where the FRET probe even occupied slightly more volume. In contrast, bundle formation of FtsZ protofilaments proceeded as readily in E. coli lysate and other protein solutions as in synthetic crowding agents. Our experimental results and model suggest that, in crowded biopolymer solutions, associative interactions counterbalance depletion forces for small macromolecules. Furthermore, the net effects of macromolecular crowding will be dependent on both the size of the macromolecule and its associative interactions with the crowded background.

  9. SEGMENTING RETAIL MARKETS ON STORE IMAGE USING A CONSUMER-BASED METHODOLOGY

    NARCIS (Netherlands)

    STEENKAMP, JBEM; WEDEL, M

    1991-01-01

    Various approaches to segmenting retail markets based on store image are reviewed, including methods that have not yet been applied to retailing problems. It is argued that a recently developed segmentation technique, fuzzy clusterwise regression analysis (FCR), holds high potential for store-image

  10. How to create a methodology of conceptual visualization based on experiential cognitive science and diagrammatology

    DEFF Research Database (Denmark)

    Toft, Birthe

    2013-01-01

    Based on the insights of experiential cognitive science and of diagrammatology as defined by Charles S. Peirce and Frederik Stjernfelt, this article analyses the iconic links connecting visualizations of Stjernfelt diagrams with human perception and action and starts to lay the theoretical...

  11. DEVELOPMENT OF GENETIC ALGORITHM-BASED METHODOLOGY FOR SCHEDULING OF MOBILE ROBOTS

    DEFF Research Database (Denmark)

    Dang, Vinh Quang

    -time operations of production managers. Hence to deal with large-scale applications, each heuristic based on genetic algorithms is then developed to find near-optimal solutions within a reasonable computation time for each problem. The quality of these solutions is then compared and evaluated by using...

  12. Partition-based Low Power DFT Methodology for System-on-chips

    Institute of Scientific and Technical Information of China (English)

    LI Yu-fei; CHEN Jian; FU Yu-zhuo

    2007-01-01

    This paper presents a partition-based Design-for-Test (DFT) technique to reduce the power consumption during scan-based testing. This method is based on partitioning the chip into several independent scan domains. By enabling the scan domains alternately, only a fraction of the entire chip will be active at the same time, leading to low power consumption during test. Therefore, it will significantly reduce the possibility of electromigration and overheating. In order to prevent a drop in fault coverage, wrappers on the boundaries between scan domains are employed. This paper also presents a detailed design flow based on Electronic Design Automation (EDA) tools from Synopsys to implement the proposed test structure. The proposed DFT method is experimented on a state-of-the-art System-on-Chip (SOC). The simulation results show a significant reduction in both average and peak power dissipation without sacrificing fault coverage or test time. This SOC has been taped out at TSMC and has finished the final test on ADVANTEST.

  13. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    Science.gov (United States)

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  14. Methodology for Design of Web-Based Laparoscopy e-Training System

    Science.gov (United States)

    Borissova, Daniela; Mustakerov, Ivan

    2011-01-01

    The Web-based e-learning can benefit from the modern multimedia tools combined with network capabilities to overcome traditional education. The objective of this paper is focused on e-training system development to improve performance of theoretical knowledge and providing ample opportunity for practical attainment of manual skills in virtual…

  15. Application of Microprocessor-Based Equipment in Nuclear Power Plants - Technical Basis for a Qualification Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Korsah, K.

    2001-08-24

    This document (1) summarizes the most significant findings of the "Qualification of Advanced Instrumentation and Control (I&C) Systems" program initiated by the Nuclear Regulatory Commission (NRC); (2) documents a comparative analysis of U.S. and European qualification standards; and (3) provides recommendations for enhancing regulatory guidance for environmental qualification of microprocessor-based safety-related systems. Safety-related I&C system upgrades of present-day nuclear power plants, as well as I&C systems of Advanced Light-Water Reactors (ALWRs), are expected to make increasing use of microprocessor-based technology. The Nuclear Regulatory Commission (NRC) recognized that the use of such technology may pose environmental qualification challenges different from current, analog-based I&C systems. Hence, it initiated the "Qualification of Advanced Instrumentation and Control Systems" program. The objectives of this confirmatory research project are to (1) identify any unique environmental-stress-related failure modes posed by digital technologies and their potential impact on the safety systems and (2) develop the technical basis for regulatory guidance using these findings. Previous findings from this study have been documented in several technical reports. This final report in the series documents a comparative analysis of two environmental qualification standards--Institute of Electrical and Electronics Engineers (IEEE) Std 323-1983 and International Electrotechnical Commission (IEC) 60780 (1998)--and provides recommendations for environmental qualification of microprocessor-based systems based on this analysis as well as on the findings documented in the previous reports. The two standards were chosen for this analysis because IEEE 323 is the standard used in the U.S. for the qualification of safety-related equipment in nuclear power plants, and IEC 60780 is its European counterpart. In addition, the IEC

  16. Putting Order into Our Universe: The Concept of Blended Learning—A Methodology within the Concept-based Terminology Framework

    Directory of Open Access Journals (Sweden)

    Joana Fernandes

    2016-06-01

    Full Text Available This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: blended learning. Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting order into our universe (Nuopponen, 2011). Concepts, as elements of the structure of knowledge (Sager, 1990), emerge as a complex research object. Can they be found in language? A concept-based approach to Terminology implies a clear-cut view of the role of language in terminological work: though language is postulated as being a fundamental tool to grasp, describe and organize knowledge, an isomorphic relationship between language and knowledge cannot be taken for granted. In other words, the foundational premise of a concept-based approach is that there is no one-to-one correspondence between atomic elements of knowledge and atomic elements of linguistic expression. This is why a methodological approach to Terminology merely based upon specialized text research is regarded as biased (Costa, 2013). As a consequence, we argue that interactional strategies between terminologist and domain expert deserve particular research attention. To our mind, the key to concept-based terminological work is to carry out a concept analysis of data gathered from a specialised text corpus combined with an elicitation process of the tacit knowledge and concept-oriented discursive negotiation. Following such view, we put forward a methodology to answer the question: how is blended learning defined in the Post-Bologna scenario? Even though there are numerous high-quality models and practical descriptions for its implementation (similarly to other concepts related to distance learning), the need to understand, demarcate and harmonize the concept of blended learning against the current Higher Education background results from the premise that

  17. A grammar based methodology for structural motif finding in ncRNA database search.

    Science.gov (United States)

    Quest, Daniel; Tapprich, William; Ali, Hesham

    2007-01-01

    In recent years, sequence database searching has been conducted through local alignment heuristics, pattern-matching, and comparison of short statistically significant patterns. While these approaches have unlocked many clues as to sequence relationships, they are limited in that they do not provide context-sensitive searching capabilities (e.g. considering pseudoknots, protein binding positions, and complementary base pairs). Stochastic grammars (hidden Markov models HMMs and stochastic context-free grammars SCFG) do allow for flexibility in terms of local context, but the context comes at the cost of increased computational complexity. In this paper we introduce a new grammar based method for searching for RNA motifs that exist within a conserved RNA structure. Our method constrains computational complexity by using a chain of topology elements. Through the use of a case study we present the algorithmic approach and benchmark our approach against traditional methods.

  18. A methodology toward manufacturing grid-based virtual enterprise operation platform

    Science.gov (United States)

    Tan, Wenan; Xu, Yicheng; Xu, Wei; Xu, Lida; Zhao, Xianhua; Wang, Li; Fu, Liuliu

    2010-08-01

    Virtual enterprises (VEs) have become one of main types of organisations in the manufacturing sector through which the consortium companies organise their manufacturing activities. To be competitive, a VE relies on the complementary core competences among members through resource sharing and agile manufacturing capacity. Manufacturing grid (M-Grid) is a platform in which the production resources can be shared. In this article, an M-Grid-based VE operation platform (MGVEOP) is presented as it enables the sharing of production resources among geographically distributed enterprises. The performance management system of the MGVEOP is based on the balanced scorecard and has the capacity of self-learning. The study shows that a MGVEOP can make a semi-automated process possible for a VE, and the proposed MGVEOP is efficient and agile.

  19. A Trajectory-Oriented, Carriageway-Based Road Network Data Model, Part 2: Methodology

    Institute of Scientific and Technical Information of China (English)

    LI Xiang; LIN Hui

    2006-01-01

    This is the second of a three-part series of papers which presents the principle and architecture of the CRNM, a trajectory-oriented, carriageway-based road network data model. The first part of the series introduced the general background of building trajectory-oriented road network data models, including motivation, related works, and basic concepts. Building on it, this paper describes the CRNM in detail. First, the notion of a basic roadway entity is proposed and discussed. Second, the carriageway is selected as the basic roadway entity after comparison with other kinds of roadway, and approaches to representing other roadways with carriageways are introduced. Finally, an overall architecture of the CRNM is proposed.

  20. A Theory-Based Methodology for Analyzing Domain Suitability for Expert Systems Technology Applications

    Science.gov (United States)

    1989-06-01

    ...classifications of complex auditory stimuli on ambiguous or esoteric factors, such as describing a musical piece as being composed by Chopin, or classifying a specific submarine’s sonar return as an "Alfa" class ship...

  1. Neurorehabilitation using the virtual reality based Rehabilitation Gaming System: methodology, design, psychometrics, usability and validation

    Directory of Open Access Journals (Sweden)

    Verschure Paul FMJ

    2010-09-01

    Full Text Available Abstract Background Stroke is a frequent cause of adult disability that can lead to enduring impairments. However, given the life-long plasticity of the brain one could assume that recovery could be facilitated by the harnessing of mechanisms underlying neuronal reorganization. Currently it is not clear how this reorganization can be mobilized. Novel technology based neurorehabilitation techniques hold promise to address this issue. Here we describe a Virtual Reality (VR) based system, the Rehabilitation Gaming System (RGS), that is based on a number of hypotheses on the neuronal mechanisms underlying recovery, the structure of training and the role of individualization. We investigate the psychometrics of the RGS in stroke patients and healthy controls. Methods We describe the key components of the RGS and the psychometrics of one rehabilitation scenario called Spheroids. We performed trials with 21 acute/subacute stroke patients and 20 healthy controls to study the effect of the training parameters on task performance. This allowed us to develop a Personalized Training Module (PTM) for online adjustment of task difficulty. In addition, we studied task transfer between physical and virtual environments. Finally, we assessed the usability and acceptance of the RGS as a rehabilitation tool. Results We show that the PTM implemented in RGS allows us to effectively adjust the difficulty and the parameters of the task to the user by capturing specific features of the movements of the arms. The results reported here also show a consistent transfer of movement kinematics between physical and virtual tasks. Moreover, our usability assessment shows that the RGS is highly accepted by stroke patients as a rehabilitation tool. Conclusions We introduce a novel VR based paradigm for neurorehabilitation, RGS, which combines specific rehabilitative principles with a psychometric evaluation to provide a personalized and automated training. Our results show that the

  2. Report of the LSPI/NASA Workshop on Lunar Base Methodology Development

    Science.gov (United States)

    Nozette, Stewart; Roberts, Barney

    1985-01-01

    Groundwork was laid for computer models which will assist in the design of a manned lunar base. The models, herein described, will provide the following functions for the successful conclusion of that task: strategic planning; sensitivity analyses; impact analyses; and documentation. Topics addressed include: upper level model description; interrelationship matrix; user community; model features; model descriptions; system implementation; model management; and plans for future action.

  3. The Scientific and Methodological Principles of Modeling Consumer Behavior in the Value-Based Enterprise Management

    OpenAIRE

    Stadnyk Valentyna V; Zamazii Oksana V.

    2015-01-01

    The need to improve the conceptual framework for managing the activity of domestic industrial enterprises in view of perspectives of their entering the EU markets has been actualized. The problems of practical implementation in Ukraine of the concept of Value Based Management, which is an effective tool for managing capitalization of enterprises in the corporate sector, have been identified. It was emphasized that because of the high instability of the economic activity envi...

  4. Knowledge search for new product development: a multi-agent based methodology

    OpenAIRE

    2011-01-01

    Manufacturers are the leaders in developing new products to drive productivity. Higher productivity means more products based on the same materials, energy, labour, and capitals. New product development plays a critical role in the success of manufacturing firms. Activities in the product development process are dependent on the knowledge of new product development team members. Increasingly, many enterprises consider effective knowledge search to be a source of competitive advantage. Th...

  5. Effect of methodology, dilution, and exposure time on the tuberculocidal activity of glutaraldehyde-based disinfectants.

    OpenAIRE

    1990-01-01

    The Association of Official Analytical Chemists (AOAC) test for assessing the tuberculocidal activity of disinfectants has been shown to be variable. A modified AOAC test, which substituted Middlebrook 7H9 broth as the primary subculture medium and used neutralization by dilution, was compared with the standard AOAC method to assess the mycobactericidal activity of three glutaraldehyde-based disinfectants at 20 degrees C and various exposure times. These changes had a marked effect on results...

  6. Design-Based Research in Science Education: One Step Towards Methodology

    Directory of Open Access Journals (Sweden)

    Kalle Juuti

    2012-10-01

    Full Text Available Recently, there have been critiques of science education research, as the potential of this research has not been actualised in science teaching and learning praxis. The paper describes an analysis of the design-based research approach (DBR) that has been suggested as a solution for the discontinuity between science education research and praxis. We propose that a pragmatic frame helps to clarify the design-based research endeavour. We abstracted three aspects from the analysis that constitute design-based research: (a) a design process is essentially iterative, starting from the recognition of a change in the environment of praxis; (b) it generates a widely usable artefact; (c) it provides educational knowledge for more intelligible praxis. In the knowledge acquisition process, the pragmatic viewpoint emphasises the role of a teacher’s reflected actions as well as the researchers’ involvement in authentic teaching and learning settings.

  7. Forecasting Electricity Market Risk Using Empirical Mode Decomposition (EMD)—Based Multiscale Methodology

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2016-11-01

    Full Text Available The electricity market has experienced an increasing level of deregulation and reform over the years. There is an increasing level of electricity price fluctuation, uncertainty, and risk exposure in the marketplace. Traditional risk measurement models based on the homogeneous and efficient market assumption no longer suffice in the face of increasing accuracy and reliability requirements. In this paper, we propose a new Empirical Mode Decomposition (EMD)-based Value at Risk (VaR) model to estimate the downside risk measure in the electricity market. The proposed model investigates and models the inherent multiscale market risk structure. The EMD model is introduced to decompose the electricity time series into several Intrinsic Mode Functions (IMFs) with distinct multiscale characteristics. The Exponential Weighted Moving Average (EWMA) model is used to model the individual risk factors across different scales. Experimental results using different models in the Australian electricity markets show that the EMD-EWMA model based on Student’s t distribution achieves the best performance, and outperforms the benchmark EWMA model significantly in terms of model reliability and predictive accuracy.
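
    The multiscale risk estimate can be illustrated with a short sketch. The Python fragment below assumes the decomposition step has already been done (an EMD implementation such as the PyEMD package would supply the list imfs), estimates an EWMA variance per component, combines the scale variances under an independence assumption made here for simplicity, and reads the VaR from a Student's t quantile; the smoothing factor and degrees of freedom are assumed values, not those of the paper.

# Minimal sketch: multiscale EWMA variance plus Student's t quantile for VaR.
import numpy as np
from scipy import stats

def ewma_variance(series, lam=0.94):
    # Recursive EWMA variance of a (return) series, seeded with the sample variance.
    var = np.var(series)
    for r in series:
        var = lam * var + (1.0 - lam) * r**2
    return var

def emd_ewma_var(imfs, alpha=0.05, nu=5.0):
    # Combine per-scale variances (independence assumed here for simplicity).
    total_var = sum(ewma_variance(np.asarray(c)) for c in imfs)
    t_q = stats.t.ppf(alpha, df=nu)          # left-tail quantile (negative)
    # Rescale the standard t quantile so the combined series has variance total_var.
    return -t_q * np.sqrt(total_var * (nu - 2.0) / nu)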

  8. How can activity-based costing methodology be performed as a powerful tool to calculate costs and secure appropriate patient care?

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong

    2007-04-01

    Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential values of ABC methodology in health care are derived from the more accurate cost calculation compared to the traditional step-down costing, and the potentials to evaluate quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients with surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify the missing or inappropriate clinical procedures. We found that ABC methodology was able to accurately calculate costs and to identify several missing pre- and post-surgical nursing education activities in the course of treatment.

  9. Methodology for allocation of remotely controlled switches in distribution networks based on a fuzzy multi-criteria decision making algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Bernardon, D.P.; Sperandio, M.; Garcia, V.J.; Russi, J. [UNIPAMPA - Federal University of Pampa (Brazil); Canha, L.N.; Abaide, A.R. [UFSM - Federal University of Santa Maria (Brazil); Daza, E.F.B. [AES Sul (Brazil)

    2011-02-15

    Continuity of power supply to consumers is a permanent concern of the utilities, pursued through the development of technological solutions that improve network restoration performance. Using remotely controlled switches is one possible approach to such an improvement, providing convenient remote resources such as fault detection, isolation and load transfer. This paper presents a methodology, implemented in a computer programming language, for allocating these devices in electric distribution systems based on multi-criteria fuzzy analysis. The main contributions focus on considering the impact of installing remotely controlled switches on the reliability indexes and on the fuzzy multi-criteria decision making algorithm for switch allocation. The effectiveness of the proposed algorithm is demonstrated with case studies involving actual systems of the AES Sul utility located in the south of Brazil. (author)
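
    The fuzzy aggregation step can be illustrated with a small sketch. The Python fragment below ranks hypothetical candidate switch locations by a weighted combination of membership degrees; the criteria, membership functions and weights are invented for illustration and differ from those of the paper.

# Minimal sketch: fuzzy multi-criteria scoring of candidate switch locations.
import numpy as np

def ramp_up(x, lo, hi):
    """Membership that grows linearly from 0 at lo to 1 at hi."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def rank_switch_candidates(candidates, weights=(0.5, 0.3, 0.2)):
    scores = {}
    for name, c in candidates.items():
        mu_saidi = ramp_up(c["saidi_reduction_min"], 0.0, 60.0)    # reliability gain
        mu_load = ramp_up(c["transferable_load_kw"], 0.0, 2000.0)  # restorable load
        mu_cost = 1.0 - ramp_up(c["cost_usd"], 5_000.0, 50_000.0)  # cheaper is better
        scores[name] = float(np.dot(weights, [mu_saidi, mu_load, mu_cost]))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example usage with hypothetical candidate data:
ranking = rank_switch_candidates({
    "feeder_A_pole_12": {"saidi_reduction_min": 35, "transferable_load_kw": 1500, "cost_usd": 20_000},
    "feeder_B_pole_04": {"saidi_reduction_min": 50, "transferable_load_kw": 800, "cost_usd": 30_000},
})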

  10. GIS-based regionalized life cycle assessment: how big is small enough? Methodology and case study of electricity generation.

    Science.gov (United States)

    Mutel, Christopher L; Pfister, Stephan; Hellweg, Stefanie

    2012-01-17

    We describe a new methodology for performing regionalized life cycle assessment and systematically choosing the spatial scale of regionalized impact assessment methods. We extend standard matrix-based calculations to include matrices that describe the mapping from inventory to impact assessment spatial supports. Uncertainty in inventory spatial data is modeled using a discrete spatial distribution function, which in a case study is derived from empirical data. The minimization of global spatial autocorrelation is used to choose the optimal spatial scale of impact assessment methods. We demonstrate these techniques on electricity production in the United States, using regionalized impact assessment methods for air emissions and freshwater consumption. Case study results show important differences between site-generic and regionalized calculations, and provide specific guidance for future improvements of inventory data sets and impact assessment methods.
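
    The core idea of extending the matrix-based calculation can be sketched compactly. The toy Python example below only illustrates the general scheme, not the paper's implementation: a mapping matrix M distributes each process's elementary flow over impact-assessment regions, where region-specific characterization factors apply; all numbers are invented.

# Minimal sketch: regionalized LCA as an extension of the standard matrix calculation.
import numpy as np

A = np.array([[1.0, 0.0],       # technosphere matrix (2 processes)
              [-0.4, 1.0]])
f = np.array([1.0, 0.0])         # functional unit demand
B = np.array([[0.2, 0.05]])      # one elementary flow (e.g. water use) per process

# M[p, r]: share of process p's flow occurring in impact region r (rows sum to 1)
M = np.array([[0.7, 0.3],
              [0.1, 0.9]])
cf = np.array([2.0, 5.0])        # region-specific characterization factors

s = np.linalg.solve(A, f)                       # process scaling vector
g_per_process = B[0] * s                        # inventory flow, kept per process
impact = cf @ (M.T @ g_per_process)             # regionalize, then characterize
site_generic = cf.mean() * g_per_process.sum()  # comparison without regionalization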

  11. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK modeling was introduced years ago, but it has not been practiced significantly. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper demonstrates briefly the methodology, applications, and limitations of PBPK modeling with special attention given to discuss the use of PBPK models in pediatric drug development and some examples described in detail. Although PBPK models do have some limitations, the potential benefit from PBPK modeling technique is huge. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide, perhaps most important, data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, and potentially to help clinical trials become more “confirmatory” rather than “exploratory”.

  12. Automatic flow injection based methodologies for determination of scavenging capacity against biologically relevant reactive species of oxygen and nitrogen.

    Science.gov (United States)

    Magalhães, Luís M; Lúcio, Marlene; Segundo, Marcela A; Reis, Salette; Lima, José L F C

    2009-06-15

    Redox reactions are the heart of numerous biochemical pathways found in cellular chemistry, generating reactive oxygen species (ROS) and reactive nitrogen species (RNS), that includes superoxide anion radical (O2-), hydrogen peroxide (H2O2), hydroxyl radical (HO), singlet oxygen ((1)O2), hypochlorite anion (OCl-), peroxynitrite anion (ONOO-) and nitric oxide radical (NO). The measurement of scavenging capacity against these reactive species presents new challenges, which can be met by flow injection analysis (FIA). In the present review several methods based on FIA and also on its predecessors computer-controlled techniques (sequential injection analysis, multisyringe flow injection analysis, multicommutated and multipumping flow systems) are critically discussed. The selectivity and applicability of the methodology, the generation and detection of the target reactive species, the benefits and limitations of automation when compared to batch methods are some of the issues addressed.

  13. Assessment of a generalizable methodology to assess learning from manikin-based simulation technology.

    Science.gov (United States)

    Giuliano, Dominic A; McGregor, Marion

    2014-01-01

    Objective: This study combined a learning outcomes-based checklist and salient characteristics derived from wisdom-of-crowds theory to test whether differing groups of judges (diversity maximized versus expertise maximized) would be able to appropriately assess videotaped, manikin-based simulation scenarios. Methods: Two groups of 3 judges scored 9 videos of interns managing a simulated cardiac event. The first group had a diverse range of knowledge of simulation procedures, while the second group was more homogeneous in their knowledge and had greater simulation expertise. All judges viewed 3 types of videos (predebriefing, postdebriefing, and 6 month follow-up) in a blinded fashion and provided their scores independently. Intraclass correlation coefficients (ICCs) were used to assess the reliability of judges as related to group membership. Scores from each group of judges were averaged to determine the impact of group on scores. Results: Results revealed strong ICCs for both groups of judges (diverse, 0.89; expert, 0.97), with the diverse group of judges having a much wider 95% confidence interval for the ICC. Analysis of variance of the average checklist scores indicated no significant difference between the 2 groups of judges for any of the types of videotapes assessed (F = 0.72, p = .4094). There was, however, a statistically significant difference between the types of videos (F = 14.39, p = .0004), with higher scores at the postdebrief and 6-month follow-up time periods. Conclusions: Results obtained in this study provide optimism for assessment procedures in simulation using learning outcomes-based checklists and a small panel of judges.
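
    For readers unfamiliar with the reliability statistic used above, the following minimal Python sketch computes one common ICC variant, the two-way random, single-rater, absolute-agreement form ICC(2,1) of Shrout and Fleiss, from an (n videos x k judges) score array. The paper does not state which ICC variant it used, so this is illustrative only.

# Minimal sketch: ICC(2,1) from a two-way layout of checklist scores.
import numpy as np

def icc_2_1(scores):
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((scores - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-videos mean square
    msc = ss_cols / (k - 1)                 # between-judges mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)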

  14. Strain-Based Design Methodology of Large Diameter Grade X80 Linepipe

    Energy Technology Data Exchange (ETDEWEB)

    Lower, Mark D. [ORNL

    2014-04-01

    Continuous growth in energy demand is driving oil and natural gas production to areas that are often located far from major markets where the terrain is prone to earthquakes, landslides, and other types of ground motion. Transmission pipelines that cross this type of terrain can experience large longitudinal strains and plastic circumferential elongation as the pipeline experiences alignment changes resulting from differential ground movement. Such displacements can potentially impact pipeline safety by adversely affecting structural capacity and leak tight integrity of the linepipe steel. Planning for new long-distance transmission pipelines usually involves consideration of higher strength linepipe steels because their use allows pipeline operators to reduce the overall cost of pipeline construction and increase pipeline throughput by increasing the operating pressure. The design trend for new pipelines in areas prone to ground movement has evolved over the last 10 years from a stress-based design approach to a strain-based design (SBD) approach to further realize the cost benefits from using higher strength linepipe steels. This report presents an overview of SBD for pipelines subjected to large longitudinal strain and high internal pressure with emphasis on the tensile strain capacity of high-strength microalloyed linepipe steel. The technical basis for this report involved engineering analysis and examination of the mechanical behavior of Grade X80 linepipe steel in both the longitudinal and circumferential directions. Testing was conducted to assess effects on material processing including as-rolled, expanded, and heat treatment processing intended to simulate coating application. Elastic-plastic and low-cycle fatigue analyses were also performed with varying internal pressures. Proposed SBD models discussed in this report are based on classical plasticity theory and account for material anisotropy, triaxial strain, and microstructural damage effects

  15. Methodology and application for health risk classification of chemicals in foods based on risk matrix.

    Science.gov (United States)

    Zhou, Ping Ping; Liu, Zhao Ping; Zhang, Lei; Liu, Ai Dong; Song, Yan; Yong, Ling; Li, Ning

    2014-11-01

    The method has been developed to accurately identify the magnitude of health risks and provide scientific evidence for the implementation of risk management in food safety. It combines two parameters, the consequence and the likelihood of adverse effects, based on a risk matrix. Score definitions and classifications for the consequence and the likelihood of adverse effects are proposed. The risk score at the intersection of consequence and likelihood in the risk matrix represents the health risk level, shown with different colors: 'low', 'medium', 'high'. Its use in an actual case is shown.
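
    A risk matrix of the kind described reduces to a small lookup once the scores are defined. The sketch below is a hypothetical illustration: the 1-5 score scales and the low/medium/high cut-offs are invented, not the paper's definitions.

# Minimal sketch: classify a chemical hazard from consequence and likelihood scores.
def classify_risk(consequence, likelihood):
    """consequence and likelihood are integer scores from 1 (lowest) to 5 (highest)."""
    if not (1 <= consequence <= 5 and 1 <= likelihood <= 5):
        raise ValueError("scores must be between 1 and 5")
    score = consequence * likelihood      # cell value in the matrix
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# e.g. a severe but unlikely adverse effect:
level = classify_risk(consequence=5, likelihood=2)   # -> "medium"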

  16. Decision support systems for morphology-based diagnosis and prognosis of prostate neoplasms: a methodological approach.

    Science.gov (United States)

    Montironi, Rodolfo; Cheng, Liang; Lopez-Beltran, Antonio; Mazzucchelli, Roberta; Scarpelli, Marina; Bartels, Peter H

    2009-07-01

    Recent advances in computer and information technologies have allowed the integration of both numeric and non-numeric data, that is, descriptive, linguistic terms. This has led, at one end of the spectrum of technology development, to machine vision based on image understanding and, at the other, to decision support systems. This has had a significant impact on our capability to derive diagnostic and prognostic information from histopathological material with prostate neoplasms. Cancer 2009;115(13 suppl):3068-77. (c) 2009 American Cancer Society.

  17. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan;

    2012-01-01

    The wind speed represents the main exogenous signal applied to a Wind Energy Conversion System (WECS) and determines its behavior. The erratic variation of the wind speed, highly dependent on the given site and on the atmospheric conditions, makes the wind speed quite difficult to model. Moreover, wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided in a low-frequency component (describing long term variations ...

  18. Ozone Depletion from Nearby Supernovae

    Science.gov (United States)

    Gehrels, Neil; Laird, Claude M.; Jackman, Charles H.; Cannizzo, John K.; Mattson, Barbara J.; Chen, Wan; Bhartia, P. K. (Technical Monitor)

    2002-01-01

    Estimates made in the 1970's indicated that a supernova occurring within tens of parsecs of Earth could have significant effects on the ozone layer. Since that time improved tools for detailed modeling of atmospheric chemistry have been developed to calculate ozone depletion, and advances have also been made in theoretical modeling of supernovae and of the resultant gamma ray spectra. In addition, one now has better knowledge of the occurrence rate of supernovae in the galaxy, and of the spatial distribution of progenitors to core-collapse supernovae. We report here the results of two-dimensional atmospheric model calculations that take as input the spectral energy distribution of a supernova, adopting various distances from Earth and various latitude impact angles. In separate simulations we calculate the ozone depletion due to both gamma rays and cosmic rays. We find that for the combined ozone depletion from these effects roughly to double the 'biologically active' UV flux received at the surface of the Earth, the supernova must occur within approximately 8 parsecs.

  19. Ozone Depletion from Nearby Supernovae

    CERN Document Server

    Gehrels, N; Jackman, C H; Cannizzo, J K; Mattson, B J; Chen, W; Gehrels, Neil; Laird, Claude M.; Jackman, Charles H.; Cannizzo, John K.; Mattson, Barbara J.; Chen, Wan

    2003-01-01

    Estimates made in the 1970's indicated that a supernova occurring within tens of parsecs of Earth could have significant effects on the ozone layer. Since that time, improved tools for detailed modeling of atmospheric chemistry have been developed to calculate ozone depletion, and advances have been made in theoretical modeling of supernovae and of the resultant gamma-ray spectra. In addition, one now has better knowledge of the occurrence rate of supernovae in the galaxy, and of the spatial distribution of progenitors to core-collapse supernovae. We report here the results of two-dimensional atmospheric model calculations that take as input the spectral energy distribution of a supernova, adopting various distances from Earth and various latitude impact angles. In separate simulations we calculate the ozone depletion due to both gamma-rays and cosmic rays. We find that for the combined ozone depletion roughly to double the "biologically active" UV flux received at the surface of the Earth, the supernova mu...

  20. HD depletion in starless cores

    CERN Document Server

    Sipilä, O; Harju, J

    2013-01-01

    Aims: We aim to investigate the abundances of light deuterium-bearing species such as HD, H2D+ and D2H+ in a gas-grain chemical model including an extensive description of deuterium and spin state chemistry, in physical conditions appropriate to the very centers of starless cores. Methods: We combine a gas-grain chemical model with radiative transfer calculations to simulate density and temperature structure in starless cores. The chemical model includes deuterated forms of species with up to 4 atoms and the spin states of the light species H2, H2+ and H3+ and their deuterated forms. Results: We find that HD eventually depletes from the gas phase because deuterium is efficiently incorporated into grain-surface HDO, resulting in inefficient HD production on grains. HD depletion has consequences not only for the abundances of e.g. H2D+ and D2H+, whose production depends on the abundance of HD, but also for the spin state abundance ratios of the various light species, when compared with the complete depletion model ...

  1. Transnational Social Movements and the Globalization Agenda: A methodological Approach Based oh the Analysis of the World Social Forum

    Directory of Open Access Journals (Sweden)

    Carlos Milani

    2016-09-01

    Full Text Available Globalization is not merely a competition for market shares and well-timed economic growth initiatives; neither is it just a matter of trade opportunities and liberalization. It has also evolved into a social and political struggle for imposing cultural values and individual preferences. Based on this broader context, this paper adopts the following assumption: transnational networks of social movements are the expression of a new social subject and have shifted their scale of political intervention since the 1990s in order to make their fight for social justice a politically pertinent action. Global social justice has become the motto of transnational social movements in world politics, where political decisions no longer rely exclusively on nation-states. In pursuance of developing this assumption, this paper approaches the discussion in two general parts: firstly, it presents a theoretical and methodological approach for analysing transnational social movements; secondly, it looks into the World Social Forum as one of their key political expressions.

  2. Genetic algorithm-based fuzzy-PID control methodologies for enhancement of energy efficiency of a dynamic energy system

    Energy Technology Data Exchange (ETDEWEB)

    Jahedi, G. [Energy Research Center, Department of Electrical Engineering, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Ave, Tehran (Iran, Islamic Republic of); Ardehali, M.M., E-mail: ardehali@aut.ac.i [Energy Research Center, Department of Electrical Engineering, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Ave, Tehran (Iran, Islamic Republic of)

    2011-01-15

    The simplicity in coding the heuristic judgment of an experienced operator by means of fuzzy logic can be exploited for enhancement of energy efficiency. Fuzzy logic has been used as an effective tool for scheduling conventional PID controller gain coefficients (F-PID). However, to search for the most desirable fuzzy system characteristics that allow for best performance of the energy system with minimum energy input, optimization techniques such as the genetic algorithm (GA) can be utilized, and the control methodology is identified as GA-based F-PID (GA-F-PID). The objective of this study is to examine the performance of PID, F-PID, and GA-F-PID controllers for enhancement of energy efficiency of a dynamic energy system. The performance evaluation of the controllers is accomplished by means of two cost functions that are based on the quadratic forms of the energy input and deviation from a setpoint temperature, referred to as energy and comfort costs, respectively. The GA-F-PID controller is examined in two different forms, namely, global form and local form. For the global form, all possible combinations of fuzzy system characteristics in the search domain are explored by GA for finding the fittest chromosome for all discrete time intervals during the entire operation period. For the local form, however, GA is used in each discrete time interval to find the fittest chromosome for implementation. The results show that the global form GA-F-PID and local form GA-F-PID control methodologies, in comparison with the PID controller, achieve higher energy efficiency by lowering energy costs by 51.2% and 67.8%, respectively. Similarly, the comfort costs for deviation from the setpoint are improved by 54.4% and 62.4%, respectively. It is determined that GA-F-PID performs better in local form than in global form.
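
    The evaluation loop implied by the abstract, a candidate controller scored by quadratic energy and comfort costs and improved by an evolutionary search, can be sketched as follows. Everything here is assumed for illustration: simulate_plant is a toy stand-in for the dynamic energy system, the two candidate parameters stand in for the fuzzy system characteristics, and the weights and search settings are not the paper's values.

# Minimal sketch: quadratic energy/comfort costs evaluated over a toy plant model
# and a simple evolutionary (selection plus mutation) search over candidates.
import numpy as np

def simulate_plant(params, setpoint=22.0, steps=200):
    """Toy thermal model returning (energy inputs, temperatures)."""
    kp, ki = params          # stand-ins for fuzzy-system scaling characteristics
    temp, integ = 15.0, 0.0
    u_hist, t_hist = [], []
    for _ in range(steps):
        err = setpoint - temp
        integ += err
        u = float(np.clip(kp * err + ki * integ, 0.0, 10.0))  # heating power
        temp += 0.05 * u - 0.02 * (temp - 10.0)               # toy dynamics
        u_hist.append(u)
        t_hist.append(temp)
    return np.array(u_hist), np.array(t_hist)

def total_cost(params, setpoint=22.0, w_energy=1.0, w_comfort=4.0):
    u, temp = simulate_plant(params, setpoint)
    # Quadratic energy cost plus quadratic deviation (comfort) cost.
    return w_energy * np.sum(u**2) + w_comfort * np.sum((temp - setpoint)**2)

def evolutionary_search(pop_size=20, generations=30, seed=2):
    rng = np.random.default_rng(seed)
    pop = rng.uniform([0.0, 0.0], [5.0, 0.5], size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([total_cost(p) for p in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]        # selection
        children = parents + rng.normal(0.0, 0.05, parents.shape)  # mutation
        pop = np.vstack([parents, np.abs(children)])
    return pop[np.argmin([total_cost(p) for p in pop])]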

  3. Agro-designing: sustainability-driven, vision-oriented, problem preventing and knowledge-based methodology for improving farming systems sustainability

    OpenAIRE

    Znaor, Darko; Goewie, Eric

    1999-01-01

    ABSTRACT While classical research focuses on problem solving, design is a problem-prevention methodology, suitable for multi- and interdisciplinary research teams with a vision of how to improve agricultural sustainability. Since organic agriculture is based on a holistic approach and is also problem-prevention oriented in that it refrains from certain inputs and practices, design is an interesting methodology that could be applied more often in organic agriculture. ...

  4. A discourse analysis methodology based on semantic principles - an application to brands, journalists and consumers discourses

    Directory of Open Access Journals (Sweden)

    Luc Grivel

    2011-12-01

    Full Text Available This is an R&D paper. It describes an analysis coming from a research project about opinion measurement and monitoring on the Internet. This research is carried out within the "Paragraphe" laboratory, in partnership with the market research institute Harris Interactive (CIFRE grant beginning July 2010). The purpose of the study was to define CRM possibilities. The targets of the study were self-employed workers and very small businesses. The discourse analysis is linked to a qualitative study. It revolves around three types of discourses: brands', journalists' and clients' discourses. In the brand discourse analysis we benchmarked brand websites belonging to several businesses. In this first step, we tried to identify the words and promises most used by brands toward the target we were studying. For that benchmark, we downloaded the "Professionals" sections of the websites. The clients' discourse analysis is based on open-ended answers coming from satisfaction questionnaires. The questions we are studying were asked after a call to a hot line or after a technician intervention. The journalists' discourse analysis is based on articles published on information websites specialized in Harris Interactive's client sector. These websites were chosen because we considered them to be representative of information sources that the target could consult.

  5. Analysis of current and alternative phenol based RNA extraction methodologies for cyanobacteria

    Directory of Open Access Journals (Sweden)

    Lindblad Peter

    2009-08-01

    Full Text Available Abstract Background The validity and reproducibility of gene expression studies depend on the quality of extracted RNA and the degree of genomic DNA contamination. Cyanobacteria are gram-negative prokaryotes that synthesize chlorophyll a and carry out photosynthetic water oxidation. These organisms possess an extended array of secondary metabolites that impair cell lysis, presenting particular challenges when it comes to nucleic acid isolation. Therefore, we used the NHM5 strain of Nostoc punctiforme ATCC 29133 to compare and improve existing phenol based chemistry and procedures for RNA extraction. Results With this work we identify and explore strategies for improved and lower cost high quality RNA isolation from cyanobacteria. All the methods studied are suitable for RNA isolation and its use for downstream applications. We analyse different Trizol based protocols, introduce procedural changes and describe an alternative RNA extraction solution. Conclusion It was possible to improve purity of isolated RNA by modifying protocol procedures. Further improvements, both in RNA purity and experimental cost, were achieved by using a new extraction solution, PGTX.

  6. Effect of methodology, dilution, and exposure time on the tuberculocidal activity of glutaraldehyde-based disinfectants.

    Science.gov (United States)

    Cole, E C; Rutala, W A; Nessen, L; Wannamaker, N S; Weber, D J

    1990-01-01

    The Association of Official Analytical Chemists (AOAC) test for assessing the tuberculocidal activity of disinfectants has been shown to be variable. A modified AOAC test, which substituted Middlebrook 7H9 broth as the primary subculture medium and used neutralization by dilution, was compared with the standard AOAC method to assess the mycobactericidal activity of three glutaraldehyde-based disinfectants at 20 degrees C and various exposure times. These changes had a marked effect on results, with the modified AOAC test providing more positive penicylinders per 10 replicates in 12 of the 13 comparisons that provided positive results. These differences were observed with both Mycobacterium bovis (ATCC 35743) and a clinical isolate of Mycobacterium tuberculosis. The effects of various exposure times to and dilutions of the glutaraldehyde-based disinfectants were also examined. The minimum exposure time needed to inactivate reliably M. bovis or M. tuberculosis with 2% glutaraldehyde was 20 min at 20 degrees C. Diluting 2% glutaraldehyde caused a significant decline in mycobactericidal activity. Modification of the standard AOAC test to improve its sensitivity in detecting the failure of disinfectants to inactivate mycobacteria is indicated. PMID:2116760

  7. A rapid automatic analyzer and its methodology for effective bentonite content based on image recognition technology

    Directory of Open Access Journals (Sweden)

    Wei Long

    2016-09-01

    Full Text Available Fast and accurate determination of the effective bentonite content in used clay bonded sand is very important for selecting the correct mixing ratio and mixing process to obtain high-performance molding sand. Currently, the effective bentonite content is determined by testing the methylene blue absorbed in used clay bonded sand, which is usually a manual operation with some disadvantages including a complicated process, long testing time and low accuracy. A rapid automatic analyzer of the effective bentonite content in used clay bonded sand was developed based on image recognition technology. The instrument consists of auto stirring, auto liquid removal, auto titration, step-rotation and image acquisition components, and a processor. The principle of the image recognition method is first to decompose the color images into three-channel gray images, based on the difference in photosensitivity of the light blue and dark blue in the three channels of red, green and blue, then to perform gray value subtraction and gray level transformation of the gray images, and finally to extract the outer circle light blue halo and the inner circle blue spot and calculate their area ratio. The titration process can be judged to have reached the end-point when the area ratio is higher than the set value.
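
    The area-ratio criterion lends itself to a short sketch. The following Python fragment is a rough, assumed illustration of the principle (channel decomposition, gray-value subtraction, thresholding, area ratio); the thresholds and the ratio limit are invented, and img is assumed to be an HxWx3 uint8 array.

# Minimal sketch: halo-to-spot area ratio from RGB channel differences.
import numpy as np

def halo_spot_area_ratio(img, halo_thresh=60, spot_thresh=120):
    r = img[:, :, 0].astype(np.int16)
    g = img[:, :, 1].astype(np.int16)
    b = img[:, :, 2].astype(np.int16)
    # Gray-value subtraction: how strongly blue dominates the other channels.
    diff = b - np.maximum(r, g)
    halo = (diff > 0) & (diff < halo_thresh) & (b > spot_thresh)  # light blue halo
    spot = diff >= halo_thresh                                    # dark blue spot
    spot_area = max(int(spot.sum()), 1)   # avoid division by zero
    return halo.sum() / spot_area

def titration_finished(img, ratio_limit=1.5):
    # End-point is declared when the area ratio exceeds the set value.
    return halo_spot_area_ratio(img) > ratio_limit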

  8. The Narrative-Based Enhancement Methodology for the Cultural Property and the Cultural Institution

    Directory of Open Access Journals (Sweden)

    Kastytis Rudokas

    2013-10-01

    Full Text Available The paper addresses the contemporary conception of place marketing, image, and the impact of multiple identities on the cultural institution of a city. The first part of the paper is based on Clare A. Gunn's well-known theory of two possible perceptions of the postmodern place image. The author of the article points out that the cultural value of an object is conditional and depends on communicational strategies and community needs. As an example of introducing an identity to a place, the case of Berlin Platten is taken, where a creative society is creating a new public image of multi-dwelling buildings. Basketball Club Žalgiris is taken as a Lithuanian example of an image that emerges from deep in the past. In this case, the author of the article shows how the Club is constructing its narrative by manipulating historical facts in the present. In the last part of the paper, it is argued that a rapidly changing society causes the abstract valuation of culture, which is also based on a wider communicational context. As a conclusion, the author points out that processes of identity construction will provide the background for cultural and economic rivalry between cities of the World.

  9. Interparticle force based methodology for prediction of cohesive powder flow properties

    Science.gov (United States)

    Esayanur, Madhavan Sujatha Sarma

    The transport and handling of powders are key areas in the process industry that have a direct impact on the efficiency and/or the quality of the finished product. A lack of fundamental understanding of powder flow properties as a function of operating variables such as relative humidity and particle size, leading to problems such as arching, rat-holing and segregation, is one of the main causes for unscheduled down times in plant operation and loss of billions of dollars in revenues. Most of the current design strategies and characterization techniques for industrial powders are based on a continuum approach similar to the field of soil mechanics. Due to an increase in complexity of the synthesis process and reduction in size of powders to the nanoscale, the surface properties and inter particle forces play a significant role in determining the flow characteristics. The use of ensemble techniques such as direct shear testing to characterize powders is no longer adequate due to a lack of understanding of the changes in the properties of powders as a function of the major operating variables such as relative humidity, temperature etc. New instrumentation or techniques need to be developed to reliably characterize powder flow behavior. Simultaneously, the scalability of the current models to predict powder flow needs to be revisited. Specifically, this study focuses on the development of an inter particle force based model for predicting the unconfined yield strength of cohesive powders. To understand the role of interparticle forces in determining the strength of cohesive powders, the particle scale interactions were characterized using Atomic Force Microscopy (AFM), contact angle, surface tension, and coefficient of friction. The bulk scale properties such as unconfined yield strength, packing structure, and size of the shear zone were also investigated. It was determined that an interparticle force based model incorporating the effect of particle size and packing structure

  10. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    Full Text Available In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, a problem that is particularly relevant when implementing the requirements of ISO 9001, ISO 14001 and other standards at an enterprise. Studying a management system with a systematic approach focuses on revealing its integrative (i.e., systemic) qualities and on identifying the variety of relationships and mechanisms behind these qualities. This makes it possible to identify the causes of the actual state of affairs and to explain successes and failures. An important aspect of a systematic approach to analyzing the effectiveness and efficiency of production management is the multiplicity of "stakeholder" interests involved in the production process when operational goals and the ways to achieve them are formed.

  11. Methodology for the effective stabilization of tin-oxide-based oxidation/reduction catalysts

    Science.gov (United States)

    Jordan, Jeffrey D. (Inventor); Schryer, David R. (Inventor); Davis, Patricia P. (Inventor); Leighty, Bradley D. (Inventor); Watkins, Anthony N. (Inventor); Schryer, Jacqueline L. (Inventor); Oglesby, Donald M. (Inventor); Gulati, Suresh T. (Inventor); Summers, Jerry C. (Inventor)

    2011-01-01

    The invention described herein involves a novel approach to the production of oxidation/reduction catalytic systems. The present invention serves to stabilize the tin oxide reducible metal-oxide coating by co-incorporating at least another metal-oxide species, such as zirconium. In one embodiment, a third metal-oxide species is incorporated, selected from the group consisting of cerium, lanthanum, hafnium, and ruthenium. The incorporation of the additional metal oxide components serves to stabilize the active tin-oxide layer in the catalytic process during high-temperature operation in a reducing environment (e.g., automobile exhaust). Moreover, the additional metal oxides are active components due to their oxygen-retention capabilities. Together, these features provide a mechanism to extend the range of operation of the tin-oxide-based catalyst system for automotive applications, while maintaining the existing advantages.

  12. Design Novel Model Reference Artificial Intelligence Based Methodology to Optimized Fuel Ratio in IC Engine

    Directory of Open Access Journals (Sweden)

    Farzin Piltan

    2013-08-01

    Full Text Available In this research, model-reference fuzzy-based control is presented as a robust control for IC engines. The objective of the study is to design controls for IC engines without knowledge of the uncertainty bounds or of full dynamic information, by using a fuzzy model-reference PD plus mass-of-air controller, while improving the robustness of the PD plus mass-of-air control. A PD plus mass-of-air controller provides for eliminating the mass-of-air term and for ultimate accuracy in the presence of bounded disturbances/uncertainties, although this method also causes some oscillation. The fuzzy PD plus mass-of-air controller is proposed as a solution to the problems created by this instability. The method shows good performance in the presence of uncertainty.

  13. Comparing syndromic surveillance detection methods: EARS' versus a CUSUM-based methodology.

    Science.gov (United States)

    Fricker, Ronald D; Hegler, Benjamin L; Dunfee, David A

    2008-07-30

    This paper compares the performance of three detection methods, entitled C1, C2, and C3, that are implemented in the early aberration reporting system (EARS) and other syndromic surveillance systems versus the CUSUM applied to model-based prediction errors. The cumulative sum (CUSUM) performed significantly better than the EARS' methods across all of the scenarios we evaluated. These scenarios consisted of various combinations of large and small background disease incidence rates, seasonal cycles from large to small (as well as no cycle), daily effects, and various types and levels of random daily variation. This leads us to recommend replacing the C1, C2, and C3 methods in existing syndromic surveillance systems with an appropriately implemented CUSUM method.
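
    As a hedged illustration of the mechanics underlying this comparison, the sketch below applies a one-sided CUSUM to standardized model-based prediction errors from a seasonal baseline; the reference value k, threshold h and all counts are illustrative assumptions, not parameters from the study.

    ```python
    import numpy as np

    def cusum_alarm(residuals, k=0.5, h=4.0):
        """One-sided CUSUM on standardized prediction errors.

        k is the reference value (allowance) and h the decision threshold, both in
        standard-deviation units; an alarm is flagged whenever the statistic exceeds h.
        """
        s, alarms = 0.0, []
        for r in residuals:
            s = max(0.0, s + r - k)        # accumulate positive drift only
            alarms.append(s > h)
        return np.array(alarms)

    # Daily syndromic counts from a seasonal baseline with an injected outbreak.
    rng = np.random.default_rng(0)
    days = np.arange(365)
    baseline = 20 + 5 * np.sin(2 * np.pi * days / 365)
    counts = rng.poisson(baseline).astype(float)
    counts[200:210] += 15                                  # simulated outbreak
    residuals = (counts - baseline) / np.sqrt(baseline)    # standardized errors
    print(np.flatnonzero(cusum_alarm(residuals))[:5])      # first alarm days
    ```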

  14. Semi-automated Method for Failed Eruptions Search in SDO Data Base: Methodology and First Results

    Science.gov (United States)

    Mrozek, T.; Gronkiewicz, D.; Kołomański, S.; Chmielewska, E.; Chruślińska, M.

    It is well known that not all solar flares are connected with eruptions followed by a coronal mass ejection (CME). Even the strongest X-class flares may not be accompanied by eruptions or may be accompanied by failed eruptions. Several mechanisms responsible for this confinement have been proposed. Present observations from SDO/AIA offer the opportunity for a deep statistical analysis of the properties of an active region that may confine an eruption. We therefore developed an automated method that can recognize moving structures and confined eruptions in AIA images. We present the algorithm and its performance for the period 1 April 2012 - 1 July 2012. The algorithm found more than 600 dynamic events, more than 30% of which are failed eruptions. The developed algorithm is very effective and offers the prospect of a substantial increase in the failed-eruption database.

  15. Evaluation of social vulnerability to floods in Huaihe River basin: a methodology based on catastrophe theory

    Science.gov (United States)

    You, W. J.; Zhang, Y. L.

    2015-08-01

    The Huaihe River is one of the seven largest rivers in China, and floods occur frequently in its basin. These disasters cause heavy casualties and property losses in the basin and make it known for its high social vulnerability to floods. Based on the latest socio-economic data, an index system of social vulnerability to floods was constructed, and the catastrophe theory method was used in the assessment process. The conclusion is that social vulnerability is a basic attribute of the urban environment, with significant changes from city to city across the Huaihe River basin. Different distribution characteristics are present in population, economic and flood-prevention vulnerability. Further development of social vulnerability assessment is important and will play a positive role in disaster prevention and in improving the comprehensive ability to respond to disasters.

  16. A Methodological Review of Piezoelectric Based Acoustic Wave Generation and Detection Techniques for Structural Health Monitoring

    Directory of Open Access Journals (Sweden)

    Zhigang Sun

    2013-01-01

    Full Text Available Piezoelectric transducers have a long history of applications in nondestructive evaluation of material and structure integrity owing to their ability to transform mechanical energy to electrical energy and vice versa. As condition-based maintenance has emerged as a valuable approach to enhancing continued aircraft airworthiness while reducing the life cycle cost, its enabling structural health monitoring (SHM) technologies capable of providing on-demand diagnosis of the structure without interrupting the aircraft operation are attracting increasing R&D efforts. Piezoelectric transducers play an essential role in these endeavors. This paper reviews a variety of ingenious ways in which piezoelectric transducers are used in today's SHM technologies as a means of generation and/or detection of diagnostic acoustic waves.

  17. Repeating cytological preparations on liquid-based cytology samples: A methodological advantage?

    Science.gov (United States)

    Pinto, Alvaro P; Maia, Henrique Felde; di Loretto, Celso; Krunn, Patrícia; Túlio, Siumara; Collaço, Luis Martins

    2007-10-01

    This study investigates whether repeating cytological preparations on liquid-based cytology samples improves sample adequacy and diagnostic, microbiological and hormonal evaluations. We reviewed 156 cases of Pap-stained preparations of exfoliated cervical cells in two slides processed by the DNA-Cytoliq System. After sample repeat/dilution, limiting factors affecting sample adequacy were removed in nine cases and three unsatisfactory cases were reclassified as satisfactory. Diagnosis was altered in 24 cases. Of these, the original diagnosis in 15 was atypical squamous cells of undetermined significance; after examination of the second slide, the diagnosis in 5 of the 15 cases changed to low-grade squamous intraepithelial lesion, in 3 to high-grade squamous intraepithelial lesion, and in 7 to absence of lesion. The microbiological evaluation was also altered, with Candida sp. detected in two repeated slides. Repeat slide preparation or dilution of residual samples enhances cytological diagnosis and decreases the effects of limiting factors in manually processed DIGENE DCS LBC.

  18. The pathogen- and incidence-based DALY approach: an appropriate [corrected] methodology for estimating the burden of infectious diseases.

    Directory of Open Access Journals (Sweden)

    Marie-Josée J Mangen

    Full Text Available In 2009, the European Centre for Disease Prevention and Control initiated the 'Burden of Communicable Diseases in Europe' (BCoDE) project to generate evidence-based and comparable burden-of-disease estimates of infectious diseases in Europe. The burden-of-disease metric used was the Disability-Adjusted Life Year (DALY), composed of years of life lost due to premature death (YLL) and due to disability (YLD). To better represent infectious diseases, a pathogen-based approach was used linking incident cases to sequelae through outcome trees. Health outcomes were included if an evidence-based causal relationship between infection and outcome was established. Life expectancy and disability weights were taken from the Global Burden of Disease Study and alternative studies. Disease progression parameters were based on literature. Country-specific incidence was based on surveillance data corrected for underestimation. Non-typhoidal Salmonella spp. and Campylobacter spp. were used for illustration. Using the incidence- and pathogen-based DALY approach the total burden for Salmonella spp. and Campylobacter spp. was estimated at 730 DALYs and at 1,780 DALYs per year in the Netherlands (average of 2005-2007). Sequelae accounted for 56% and 82% of the total burden of Salmonella spp. and Campylobacter spp., respectively. The incidence- and pathogen-based DALY methodology allows, in the case of infectious diseases, a more comprehensive calculation of the disease burden, as subsequent sequelae are fully taken into account. Not considering subsequent sequelae would strongly underestimate the burden of infectious diseases. Estimates can be used to support prioritisation and comparison of infectious diseases and other health conditions, both within a country and between countries.
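
    A minimal numerical sketch of the incidence- and pathogen-based DALY logic (DALY = YLD + YLL accumulated over an outcome tree) is given below; the pathogen, probabilities, disability weights and durations are hypothetical placeholders, not the BCoDE values.

    ```python
    # Hypothetical outcome tree for one pathogen; DALY = YLD + YLL.
    incident_cases = 50_000          # surveillance-corrected incidence (illustrative)

    # (name, probability given infection, disability weight, duration in years,
    #  case fatality, years of life lost per death) -- all values are placeholders.
    outcomes = [
        ("acute gastroenteritis", 1.00, 0.10, 0.02, 0.0005, 15.0),
        ("reactive arthritis",    0.02, 0.21, 0.60, 0.0,    0.0),
    ]

    yld = sum(incident_cases * p * dw * dur for _, p, dw, dur, cfr, lle in outcomes)
    yll = sum(incident_cases * p * cfr * lle for _, p, dw, dur, cfr, lle in outcomes)
    print(f"YLD = {yld:.0f}, YLL = {yll:.0f}, DALY = {yld + yll:.0f}")
    ```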

  19. Application of hazard analysis and critical control point methodology and risk-based grading to consumer food safety surveys.

    Science.gov (United States)

    Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig

    2012-09-01

    Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong" and food handling practices that are associated with high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys has been developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results using the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of risks associated with domestic food handling practices. The method highlighted important violations and minor errors, which are performed by most people and are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.

  20. Uranium resource assessment by the Geological Survey; methodology and plan to update the national resource base

    Science.gov (United States)

    Finch, Warren Irvin; McCammon, Richard B.

    1987-01-01

    Based on the Memorandum of Understanding (MOU) of September 20, 1984, between the U.S. Geological Survey of the U.S. Department of Interior and the Energy Information Administration (EIA) of the U.S. Department of Energy (DOE), the U.S. Geological Survey began to make estimates of the undiscovered uranium endowment of selected areas of the United States in 1985. A modified NURE (National Uranium Resource Evaluation) method will be used in place of the standard NURE method of the DOE that was used for the national assessment reported in October 1980. The modified method, here named the 'deposit-size-frequency' (DSF) method, is presented for the first time, and calculations by the two methods are compared using an illustrative example based on preliminary estimates for the first area to be evaluated under the MOU. The results demonstrate that the estimate of the endowment using the DSF method is significantly larger and more uncertain than the estimate obtained by the NURE method. We believe that the DSF method produces a more realistic estimate because the principal factor estimated in the endowment equation is disaggregated into more parts and is more closely tied to specific geologic knowledge than by the NURE method. The DSF method consists of modifying the standard NURE estimation equation, U = A x F x T x G, by replacing the factors F x T by a single factor that represents the tonnage for the total number of deposits in all size classes. Use of the DSF method requires that the size frequency of deposits in a known or control area has been established and that the relation of the size-frequency distribution of deposits to probable controlling geologic factors has been determined. Using these relations, the principal scientist (PS) first estimates the number and range of size classes and then, for each size class, estimates the lower limit, most likely value, and upper limit of the numbers of deposits in the favorable area. Once these probable estimates have been refined
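
    To make the deposit-size-frequency idea concrete, the sketch below aggregates expert triangular estimates (lower limit, most likely value, upper limit) of the number of deposits in each size class into an endowment distribution; the size classes, tonnages and grade are hypothetical, and the actual DSF procedure ties these inputs to specific geologic factors.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical size classes: representative tonnage of mineralized rock per
    # deposit and the expert's (low, most likely, high) number of deposits.
    size_classes = [
        (1_000_000, (2, 5, 12)),
        (  200_000, (5, 15, 40)),
        (   20_000, (10, 40, 120)),
    ]
    grade = 0.0015                     # average grade, fraction U3O8 (illustrative)

    n_trials = 10_000
    endowment = np.zeros(n_trials)
    for tonnage, (low, mode, high) in size_classes:
        n_deposits = rng.triangular(low, mode, high, n_trials)
        endowment += n_deposits * tonnage * grade

    print("mean endowment (tonnes U3O8):", round(endowment.mean()))
    print("5th-95th percentile:", np.percentile(endowment, [5, 95]).round())
    ```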

  1. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    Science.gov (United States)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations, and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in seasonal variations of intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species in a Mediterranean site. Major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during the years 2012-2014. To retrieve phenological information from the digital images, a routine of commands to process the image files using the program MATLAB (R2013b, The MathWorks, Natick, Mass.) was specifically created. The images in the dataset were re-classified and the files renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the panoramas acquired, defined by the presence of the most representative species of
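
    The MATLAB routine itself is not reproduced in the record; the sketch below shows, under stated assumptions, one common way such ROI-based colour analysis is done: mean R, G, B intensities in a fixed region of interest and a relative greenness index whose seasonal inflection points mark phenological transitions. The function names and ROI coordinates are illustrative, not the authors' code.

    ```python
    import numpy as np
    from PIL import Image

    def roi_channel_means(image_path, roi):
        """Mean R, G, B intensities inside a rectangular region of interest.

        roi = (row_start, row_stop, col_start, col_stop); the ROI would be drawn
        around a single target species in the fixed camera view.
        """
        rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
        r0, r1, c0, c1 = roi
        patch = rgb[r0:r1, c0:c1, :]
        return patch.reshape(-1, 3).mean(axis=0)   # (mean_R, mean_G, mean_B)

    def relative_greenness(means):
        """Green chromatic coordinate G / (R + G + B); inflection points of its
        seasonal time series can be used to identify phenological events."""
        r, g, b = means
        return g / (r + g + b)

    # Usage sketch: build a dated time series, then look for inflection points.
    # series = [relative_greenness(roi_channel_means(p, (100, 300, 200, 500)))
    #           for p in sorted_image_paths]
    ```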

  2. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    uncertainty propagation. Non-linear behavior in technology interactions is captured through expert elicitation based technology synergy matrices (TSM). Proposed TSMs increase the fidelity of current technology forecasting methods by including higher order technology interactions. A test case for quantification of epistemic uncertainty on a large scale problem of combined cycle power generation system was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that evidence theory based technique provides more insight on the uncertainties arising from incomplete information or lack of knowledge as compared to deterministic or probability theory methods. Margin analysis was also carried out for both the techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher order interactions, which can be applied according to the expert opinion or historical data. The introduction of technology synergy matrix enabled capturing the higher order technology interactions, and improvement in predicted system performance.

  3. Experimental study of heat pump thermodynamic cycles using CO2 based mixtures - Methodology and first results

    Science.gov (United States)

    Bouteiller, Paul; Terrier, Marie-France; Tobaly, Pascal

    2017-02-01

    The aim of this work is to study heat pump cycles using CO2-based mixtures as working fluids. Since adding other chemicals to CO2 moves the critical point and, in general, the equilibrium lines, it is expected that lower operating pressures as well as higher global efficiencies may be reached. A single-stage pure CO2 cycle is used as the reference, with fixed external conditions. Two scenarios are considered: water is heated from 10 °C to 65 °C for the Domestic Hot Water scenario and from 30 °C to 35 °C for the Central Heating scenario. In both cases, water at the evaporator inlet is set at 7 °C to account for outdoor temperature conditions. In order to understand the dynamic behaviour of thermodynamic cycles with mixtures, it is essential to measure the circulating fluid composition. To this end, we have developed a non-intrusive method. Online optical flow cells allow the recording of infrared spectra by means of a Fourier Transform Infrared spectrometer. A careful calibration is performed by measuring a statistically significant number of spectra for samples of known composition. Then, a statistical model is constructed to relate spectra to compositions. After calibration, compositions are obtained by recording the spectrum in a few seconds, thus allowing for a dynamic analysis. This article will describe the experimental setup and the composition measurement techniques. Then a first account of results with pure CO2, and with the addition of propane or R-1234yf, will be given.

  4. An Efficient Methodology for Calibrating Traffic Flow Models Based on Bisection Analysis

    Directory of Open Access Journals (Sweden)

    Enzo C. Jia

    2014-01-01

    Full Text Available As urban planning becomes more sophisticated, the accurate detection and counting of pedestrians and cyclists become more important. Accurate counts can be used to determine the need for additional pedestrian walkways and intersection reorganization, among other planning initiatives. In this project, a camera-based approach is implemented to create a real-time pedestrian and cyclist counting system which is regularly accurate to 85% and often achieves higher accuracy. The approach retasks a state-of-the-art traffic camera, the Autoscope Solo Terra, for pedestrian and bicyclist counting. Object detection regions are sized to identify multiple pedestrians moving in either direction on an urban sidewalk and bicyclists in an adjacent bicycle lane. Collected results are processed in real time, eliminating the need for video storage and postprocessing. In this paper, results are presented for a pedestrian walkway for pedestrian flow up to 108 persons/min and the limitations of the implemented system are enumerated. Both pedestrian and cyclist counting accuracy of over 90% is achieved.

  5. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Erwin V. Zaretsky

    2003-01-01

    Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
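
    A minimal sketch of the two-parameter Weibull relations that underlie this kind of life and reliability analysis is given below; the characteristic lives and slopes are illustrative and are not the E3-Engine values.

    ```python
    import numpy as np

    def life_at_survival(eta, beta, survival):
        """Life L at a given probability of survival R for a two-parameter Weibull:
        R(L) = exp(-(L/eta)**beta)  =>  L = eta * (-ln R)**(1/beta)."""
        return eta * (-np.log(survival)) ** (1.0 / beta)

    def series_system_survival(L, etas, betas):
        """Survival at life L of a series system of independent Weibull components
        (the system fails when any component fails)."""
        return np.exp(-sum((L / eta) ** beta for eta, beta in zip(etas, betas)))

    # Illustration with hypothetical numbers: the L0.1 life (99.9% survival) of a
    # single component with characteristic life 60,000 hr, for three Weibull slopes.
    for beta in (3, 6, 9):
        print(beta, round(life_at_survival(60_000, beta, 0.999)))

    # Survival of a two-component series system at 9,000 hr (hypothetical parameters).
    print(round(series_system_survival(9_000, etas=[60_000, 80_000], betas=[3, 6]), 4))
    ```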

  6. Activity-based costing methodology as tool for costing in hematopathology laboratory

    Directory of Open Access Journals (Sweden)

    Gujral Sumeet

    2010-01-01

    Full Text Available Background: Cost analysis in laboratories represents a necessary phase in their scientific progression. Aim: To calculate the indirect cost and thus the total cost per sample of various tests at the Hematopathology laboratory (HPL). Settings and Design: The activity-based costing (ABC) method is used to calculate the cost per test of the hematopathology laboratory. Material and Methods: Information was collected from registers, purchase orders, annual maintenance contracts (AMCs), payrolls, account books, hospital bills and registers, along with informal interviews with hospital staff. Results: Cost per test decreases as the total number of samples increases. The maximum annual expense at the HPL is on reagents and consumables, followed by manpower. Cost per test is higher for specialized tests which interpret morphological or flow data and are done by a pathologist. Conclusions: Despite several limitations and assumptions, this was an attempt to understand how resources are consumed in a large government-run laboratory. The rate structure needs to be revised for most of the tests, mainly for complete blood counts (CBC), bone marrow examination, coagulation tests and immunophenotyping. This costing exercise is laboratory specific and each laboratory needs to do its own costing. Such an exercise may help a laboratory redesign its costing structure or at least understand the economics involved in laboratory management.
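
    As a hedged sketch of the activity-based costing arithmetic described here, the example below allocates annual resource costs to test types via activity-driver fractions and divides by test volumes; all figures are hypothetical.

    ```python
    # Hypothetical annual resource costs for a laboratory.
    resource_costs = {"reagents": 5_000_000, "manpower": 3_000_000,
                      "equipment_amc": 800_000, "overheads": 1_200_000}

    # Share of each resource consumed by each test type (activity driver fractions).
    driver_fractions = {
        "CBC":               {"reagents": 0.50, "manpower": 0.30, "equipment_amc": 0.40, "overheads": 0.40},
        "bone_marrow_exam":  {"reagents": 0.20, "manpower": 0.45, "equipment_amc": 0.20, "overheads": 0.30},
        "immunophenotyping": {"reagents": 0.30, "manpower": 0.25, "equipment_amc": 0.40, "overheads": 0.30},
    }
    annual_volumes = {"CBC": 120_000, "bone_marrow_exam": 4_000, "immunophenotyping": 2_500}

    for test, fractions in driver_fractions.items():
        allocated = sum(resource_costs[r] * f for r, f in fractions.items())
        print(test, "cost per test:", round(allocated / annual_volumes[test], 2))
    ```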

  7. DSM/DMM-based Methodology for Handling Changes in Product Development Project

    Directory of Open Access Journals (Sweden)

    Feng Xin

    2016-01-01

    Full Text Available With the growing complexity of the business context, companies are confronting more challenging Product Development (PD) projects because of the potential risks brought externally by partners' participation and accumulated internally through the execution of activities. In this paper, we formulate and decompose a PD project in a hierarchical way, and the resulting end elements, perceived as objects, are built into a number of Design Structure Matrices (DSMs) and Domain Mapping Matrices (DMMs). We also propose a conceptual model of change occurrence and change propagation, based on which potential change-propagating channels are discovered by observing and tracking change occurrence and propagation in the DSMs and DMMs. Using depth-first and breadth-first search methods executed on the matrices, the critical objects in change propagation are discovered and the implicit change-propagating channels across multiple fields are identified. As exploratory results, the critical objects with respect to change propagation are highlighted as a contribution to change management in PD projects.
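
    A minimal sketch of the matrix-based propagation search is shown below, assuming the DSMs and DMMs are merged into one binary dependency matrix over all objects; a breadth-first search then enumerates a shortest propagation channel from a changed object to every object it can reach. The matrix and object set are illustrative.

    ```python
    from collections import deque

    import numpy as np

    def propagation_channels(adjacency, source):
        """Breadth-first search over a combined DSM/DMM dependency matrix.

        adjacency[i, j] = 1 means a change in object i may propagate to object j
        (within one domain for a DSM entry, across domains for a DMM entry).
        Returns, for every reachable object, one shortest propagation channel.
        """
        parent = {source: None}
        queue = deque([source])
        while queue:
            i = queue.popleft()
            for j in np.flatnonzero(adjacency[i]):
                j = int(j)
                if j not in parent:
                    parent[j] = i
                    queue.append(j)

        channels = {}
        for target in parent:
            if target == source:
                continue
            path, node = [], target
            while node is not None:
                path.append(node)
                node = parent[node]
            channels[target] = path[::-1]
        return channels

    # Toy example with 5 objects (e.g. requirements, components, tasks).
    A = np.array([[0, 1, 0, 0, 0],
                  [0, 0, 1, 1, 0],
                  [0, 0, 0, 0, 1],
                  [0, 0, 0, 0, 0],
                  [0, 0, 0, 0, 0]])
    print(propagation_channels(A, source=0))
    ```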

  8. Looking for phase-space structures in star-forming regions: An MST-based methodology

    CERN Document Server

    Alfaro, Emilio J

    2015-01-01

    We present a method for analysing the phase space of star-forming regions. In particular we search for clumpy structures in the 3D subspace formed by two position coordinates and radial velocity. The aim of the method is the detection of kinematically segregated radial velocity groups, that is, radial velocity intervals whose associated stars are spatially concentrated. To this end we define a kinematic segregation index, $\tilde{\Lambda}$(RV), based on the Minimum Spanning Tree (MST) graph algorithm, which is estimated for a set of radial velocity intervals in the region. When $\tilde{\Lambda}$(RV) is significantly greater than 1 we consider that this bin represents a grouping in the phase space. We split a star-forming region into radial velocity bins and calculate the kinematic segregation index for each bin, and then we obtain the spectrum of kinematic groupings, which enables a quick visualization of the kinematic behaviour of the region under study. We carried out numerical models of different config...
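
    The sketch below gives one plausible reading of an MST-based segregation index: the mean MST length of random same-size samples divided by the MST length of the stars in a given radial-velocity bin, so values well above 1 flag a spatially concentrated RV group. This is an illustrative adaptation, not the authors' exact estimator.

    ```python
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree
    from scipy.spatial.distance import pdist, squareform

    def mst_length(xy):
        """Total edge length of the Euclidean minimum spanning tree of 2D positions."""
        dist = squareform(pdist(xy))
        return minimum_spanning_tree(dist).sum()

    def segregation_index(xy, rv, rv_lo, rv_hi, n_random=200, rng=None):
        """Lambda-style index for one radial-velocity bin: mean MST length of random
        samples of the same size divided by the MST length of the bin members.
        Values significantly above 1 indicate a spatially concentrated RV group."""
        rng = rng or np.random.default_rng()
        members = xy[(rv >= rv_lo) & (rv < rv_hi)]
        if len(members) < 3:
            return np.nan
        l_bin = mst_length(members)
        l_rand = np.mean([mst_length(xy[rng.choice(len(xy), len(members), replace=False)])
                          for _ in range(n_random)])
        return l_rand / l_bin

    # Usage sketch: scan RV bins to obtain the "spectrum of kinematic groupings".
    # spectrum = [segregation_index(xy, rv, lo, lo + drv) for lo in rv_bin_edges[:-1]]
    ```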

  9. A new methodology for energy-based seismic design of steel moment frames

    Science.gov (United States)

    Mezgebo, Mebrahtom Gebrekirstos; Lui, Eric M.

    2017-01-01

    A procedure is proposed whereby input and hysteretic energy spectra developed for single-degree-of-freedom (SDOF) systems are applied to multi-degree-of-freedom (MDOF) steel moment-resisting frames. The proposed procedure is verified using four frames with three, five, seven and nine stories, each of which is subjected to the fault-normal and fault-parallel components of three actual earthquakes. A very good estimate is obtained for the three- and five-story frames, and a reasonably acceptable estimate for the seven- and nine-story frames. A method for distributing the hysteretic energy over the frame height is also proposed. This distribution scheme allows the determination of the energy demand component of a proposed energy-based seismic design (EBSD) procedure for each story. To address the capacity component of EBSD, a story-wise optimization design procedure is developed by utilizing the energy-dissipating capacity from plastic hinge formation/rotation in these moment frames. The proposed EBSD procedure is demonstrated in the design of a three-story one-bay steel moment frame.

  10. Assessment of Pharmaceutical Powder Flowability using Shear Cell-Based Methods and Application of Jenike's Methodology.

    Science.gov (United States)

    Jager, Paul D; Bramante, Tommasina; Luner, Paul E

    2015-11-01

    Jenike's approach to hopper design for a large-scale (3150 L) conical hopper was applied to pharmaceutical powders to evaluate flow issues such as funnel flow and cohesive arching. Seven grades of microcrystalline cellulose (MCC) and six powder blends were tested. A Schulze Ring Shear Tester measured the flow function, wall friction (using stainless steel coupons with a #2B or #8 finish) and compressibility. The Hopper Index (HI, the maximum hopper angle required for mass flow) and the Arching Index (AI, the minimum hopper outlet size to prevent cohesive arch formation) were computed using Mathcad. For MCC, a linear relationship was observed between median particle size and the Jenike flow function coefficient. A curvilinear relationship was observed for the powder blends, indicating more complex flow behavior than expected from median particle size alone. Powder bulk density had a minimal effect on AI for the MCC grades. AI can be overestimated with this method for pharmaceutical powders because the true shape of the flow function is not defined at very low consolidation pressures and linear extrapolation becomes unrepresentative. This instrumental limitation underscores the need for a precise and accurate test method to determine powder flow functions at very low consolidation stresses for pharmaceutical applications.
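
    A short sketch of how ring-shear results are typically reduced is given below: the Jenike flow function coefficient ffc = sigma1/sigma_c at each consolidation state, and a linear flow-function fit whose extrapolation to low stresses is exactly the step the abstract flags as potentially unrepresentative. All numbers are hypothetical.

    ```python
    import numpy as np

    # Hypothetical Schulze ring-shear results: major principal consolidation stress
    # sigma1 and unconfined yield strength sigma_c, both in kPa.
    sigma1 = np.array([2.0, 5.0, 10.0, 20.0])
    sigma_c = np.array([0.9, 1.6, 2.6, 4.1])

    # ffc = sigma1 / sigma_c, the flowability coefficient at each consolidation state
    # (roughly: >4 easy-flowing, 2-4 cohesive, <2 very cohesive).
    ffc = sigma1 / sigma_c
    print("ffc:", np.round(ffc, 2))

    # Linear fit of the flow function, sigma_c = a * sigma1 + b.  Extrapolating this
    # line to the very low stresses relevant for arching at a hopper outlet is the
    # step that can overestimate the Arching Index.
    a, b = np.polyfit(sigma1, sigma_c, 1)
    print("extrapolated sigma_c at 0.5 kPa:", round(a * 0.5 + b, 2))
    ```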

  11. Foresight for commanders: a methodology to assist planning for effects-based operations

    Science.gov (United States)

    Davis, Paul K.; Kahan, James P.

    2006-05-01

    Looking at the battlespace as a system of systems is a cornerstone of Effects-Based Operations and a key element in the planning of such operations, and in developing the Commander's Predictive Environment. Instead of a physical battleground to be approached with weapons of force, the battlespace is an interrelated super-system of political, military, economic, social, information and infrastructure systems to be approached with diplomatic, informational, military and economic actions. A concept that has proved useful in policy arenas other than defense, such as research and development for information technology, addressing cybercrime, and providing appropriate and cost-effective health care, is foresight. In this paper, we provide an overview of how the foresight approach addresses the inherent uncertainties in planning courses of action, present a set of steps in the conduct of foresight, and then illustrate the application of foresight to a commander's decision problem. We conclude that the foresight approach we describe is consistent with current doctrinal intelligence preparation of the battlespace and operational planning, but represents an advance in that it explicitly addresses the uncertainties in the environment and in planning in a way that identifies strategies that are robust over different possible ground truths. It should supplement other planning methods.

  12. ELABORATION OF METHODOLOGICAL TOOLS FOR AGRICULTURAL RISK MANAGEMENT BASED ON INNOVATION

    Directory of Open Access Journals (Sweden)

    Voroshilova I. V.

    2015-03-01

    Full Text Available The article deals with the possibility of expanding the toolkit of agricultural risk management based on commodity financial instruments and weather derivatives. By summarizing the research results of domestic and foreign scholars and creatively interpreting those results, the authors supplement and refine the definitions of the categories "risk" and "risk of agricultural production". The article supplements the classification of risks in agricultural production and in the circulation of agricultural products, reviews proven techniques and methods of agricultural risk management, discusses current trends in the global and domestic derivatives markets, segments the market by type of derivative instrument and by the characteristics of the underlying assets, analyzes the reasons for the low level of development of derivatives markets at the meso level using the example of the Krasnodar Region, describes the potential of derivatives for managing agricultural risks on the basis of foreign sources, notes the insufficient level of financial literacy of potential participants and the lack of regulations and regulatory infrastructure, describes the problems of accounting and reporting the results of operations in this segment and the insufficient training of market operators, and reveals the possibility of expanding the agricultural risk management toolkit.

  13. THEORETICAL - METHODOLOGICAL BASES FOR THE FORMATION OF NATIONAL IDENTITY IN PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Irina Aleksandrovna Galkina

    2015-01-01

    Full Text Available The article highlights modern approaches to organizing the education of preschool children in a multicultural environment and states basic principles and conditions for the formation of national identity and the socialization of children in a modern multicultural society. Here multicultural education builds on the children's attitude towards their homeland, family and immediate environment and towards the cultures of neighboring nations, moving from the expression of interest in and sympathy towards people of other nationalities to learning their traditions and customs, developing knowledge about them, and building friendly relationships and respect towards them. The article presents scientific positions in the context of philosophical, ethnographic and sociological research that reconsider the understanding and interpretation of the concept of "multicultural education" in the context of such categories as "internationality", "nationality" and "humaneness". It defines and explains the most effective means of forming the spiritual and moral character of a growing person, namely literature and theatre. The article presents a project for enhancing the formation of national identity among preschool children that reflects the multicultural traditions of the Irkutsk region through the works of fiction writers of the Angara region and Siberia.

  14. A Solution Methodology of Bi-Level Linear Programming Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    M. S. Osman

    2009-01-01

    Full Text Available Problem statement: We deal with the bi-level linear programming problem. A bi-level programming problem is formulated for a problem in which two decision-makers (DMs) make decisions successively. Approach: In this research we study and design a Genetic Algorithm (GA) for Bi-Level Linear Programming Problems (BLPP) by constructing the fitness function of the upper-level programming problem based on the definition of the feasible degree. This GA avoids the use of a penalty function to deal with the constraints by changing the randomly generated initial population into an initial population satisfying the constraints, in order to improve the ability of the GA to deal with the constraints. We also designed software to solve this problem, and a comparative study between the proposed method and previous methods was carried out through numerical results for some examples. Finally, parametric information for the GA is introduced. Results: The results of the study showed that the proposed method is feasible and more efficient for solving the BLPP, and a software package now exists to solve the BLPP. Conclusion: This GA avoids the use of a penalty function to deal with the constraints by changing the randomly generated initial population into an initial population satisfying the constraints, in order to improve the ability of the GA to deal with the constraints.
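
    The one ingredient of the approach that can be sketched compactly is the constraint-satisfying initial population that replaces a penalty function; the rejection-sampling sketch below is an illustrative assumption about how that step could be implemented, with a toy constraint set, and leaves the bi-level fitness evaluation itself aside.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def feasible_initial_population(n_pop, n_vars, A, b, lower, upper, max_tries=100_000):
        """Rejection sampling of an initial population satisfying A @ x <= b and bounds,
        so the GA never needs a penalty function for constraint handling."""
        population, tries = [], 0
        while len(population) < n_pop and tries < max_tries:
            x = rng.uniform(lower, upper, size=n_vars)
            if np.all(A @ x <= b):
                population.append(x)
            tries += 1
        return np.array(population)

    # Toy shared constraint region for a small bi-level problem (illustrative only).
    A = np.array([[1.0, 1.0], [-1.0, 2.0]])
    b = np.array([8.0, 6.0])
    pop = feasible_initial_population(50, 2, A, b, lower=0.0, upper=8.0)
    print(pop.shape)           # each row is a feasible candidate for the GA
    ```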

  15. Attentional bias modification based on visual probe task: methodological issues, results and clinical relevance

    Directory of Open Access Journals (Sweden)

    Fernanda Machado Lopes

    2015-12-01

    Full Text Available Introduction: Attentional bias, the tendency of a person to direct or maintain attention towards a specific class of stimuli, may play an important role in the etiology and persistence of mental disorders. Attentional bias modification has been studied as a form of additional treatment related to automatic processing. Objectives: This systematic literature review compared and discussed methods, evidence of success and potential clinical applications of studies about attentional bias modification (ABM) using a visual probe task. Methods: The Web of Knowledge, PubMed and PsycInfo databases were searched using the keywords attentional bias modification, attentional bias manipulation and attentional bias training. We selected empirical studies about ABM training using a visual probe task, written in English and published between 2002 and 2014. Results: Fifty-seven studies met the inclusion criteria. Most (78%) succeeded in training attention in the predicted direction, and in 71% the results generalized to other measures correlated with the symptoms. Conclusions: ABM has potential clinical utility, but to standardize methods and maximize applicability, future studies should include clinical samples and be based on findings of studies about its effectiveness.

  16. Remote Sensing-based Methodologies for Snow Model Adjustments in Operational Streamflow Prediction

    Science.gov (United States)

    Bender, S.; Miller, W. P.; Bernard, B.; Stokes, M.; Oaida, C. M.; Painter, T. H.

    2015-12-01

    Water management agencies rely on hydrologic forecasts issued by operational agencies such as NOAA's Colorado Basin River Forecast Center (CBRFC). The CBRFC has partnered with the Jet Propulsion Laboratory (JPL) under funding from NASA to incorporate research-oriented, remotely-sensed snow data into CBRFC operations and to improve the accuracy of CBRFC forecasts. The partnership has yielded valuable analysis of snow surface albedo as represented in JPL's MODIS Dust Radiative Forcing in Snow (MODDRFS) data, across the CBRFC's area of responsibility. When dust layers within a snowpack emerge, reducing the snow surface albedo, the snowmelt rate may accelerate. The CBRFC operational snow model (SNOW17) is a temperature-index model that lacks explicit representation of snowpack surface albedo. CBRFC forecasters monitor MODDRFS data for emerging dust layers and may manually adjust SNOW17 melt rates. A technique was needed for efficient and objective incorporation of the MODDRFS data into SNOW17. Initial development focused in Colorado, where dust-on-snow events frequently occur. CBRFC forecasters used retrospective JPL-CBRFC analysis and developed a quantitative relationship between MODDRFS data and mean areal temperature (MAT) data. The relationship was used to generate adjusted, MODDRFS-informed input for SNOW17. Impacts of the MODDRFS-SNOW17 MAT adjustment method on snowmelt-driven streamflow prediction varied spatially and with characteristics of the dust deposition events. The largest improvements occurred in southwestern Colorado, in years with intense dust deposition events. Application of the method in other regions of Colorado and in "low dust" years resulted in minimal impact. The MODDRFS-SNOW17 MAT technique will be implemented in CBRFC operations in late 2015, prior to spring 2016 runoff. Collaborative investigation of remote sensing-based adjustment methods for the CBRFC operational hydrologic forecasting environment will continue over the next several years.

  17. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States); Wang, Yaqi [North Carolina State Univ., Raleigh, NC (United States)

    2013-12-20

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine- mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
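
    For the first verification phase, the convergence-rate computation can be sketched as follows: fit the slope of log(error) against log(mesh size) over a refinement sequence to obtain the observed order of accuracy. The error norms below are hypothetical, not results from the project.

    ```python
    import numpy as np

    def observed_order(h, error):
        """Observed convergence order from error norms on a sequence of mesh sizes h:
        fit log(error) = p*log(h) + c, so the slope p is the observed order."""
        p, _ = np.polyfit(np.log(h), np.log(error), 1)
        return p

    # Hypothetical L2 error norms (numerical vs reference solution) on refined meshes.
    h = np.array([0.4, 0.2, 0.1, 0.05])
    err = np.array([3.2e-2, 8.3e-3, 2.1e-3, 5.4e-4])
    print("observed order:", round(observed_order(h, err), 2))   # ~2 for a 2nd-order scheme
    ```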

  18. Effect of Quenching Parameters on Mechanical Property of Ultra High Strength Steel BR1500HS Based on Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    Jie Zhou; Zhi-Yong Huang; Lei Lin; Shi-Yun Li

    2014-01-01

    Hot stamping of high strength steels is defined as a process in which a blank is heated to a temperature in the austenite stabilization region for a definite time and then formed and quenched simultaneously in a mold with cooling channels. During this process, the processing parameters of austenite temperature and soaking time have strong effects on mechanical properties such as quenching hardness, tensile strength and elongation. Hence, it is necessary to investigate the relationship between the mechanical properties and these two processing parameters. In this paper, an orthogonal experiment with two factors and five levels was applied, and the experimental data based on the orthogonal experiment were acquired. Based on these data, regression models were set up, and the results of the analysis of variance (ANOVA) showed that it is reliable to predict the quenching hardness, tensile strength and elongation with the regression models. In addition, the optimal results for each single objective were obtained based on response surface methodology (RSM), and a global optimum was obtained by employing the ideal point method, in which quenching hardness, tensile strength and elongation were considered simultaneously.
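
    A minimal sketch of the kind of second-order response-surface fit described here is shown below, regressing quenching hardness on austenitizing temperature and soaking time; the design points and hardness values are invented for illustration only.

    ```python
    import numpy as np

    # Hypothetical design points: austenitizing temperature (deg C), soaking time (min),
    # and measured quenching hardness (HRC).
    T = np.array([880, 880, 930, 930, 905, 905, 905, 930, 880])
    t = np.array([3, 7, 3, 7, 5, 3, 7, 5, 5])
    H = np.array([46.1, 47.9, 48.8, 49.3, 49.0, 47.5, 48.6, 49.1, 47.2])

    # Second-order response surface: H = b0 + b1*T + b2*t + b11*T^2 + b22*t^2 + b12*T*t
    X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t]).astype(float)
    coef, *_ = np.linalg.lstsq(X, H, rcond=None)

    def predict(T_, t_):
        return coef @ np.array([1.0, T_, t_, T_**2, t_**2, T_ * t_])

    print("predicted hardness at 920 C, 6 min:", round(predict(920, 6), 2))
    ```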

  19. A randomized clinical trial of home-based telepsychiatric outpatient care via videoconferencing: design, methodology, and implementation

    Directory of Open Access Journals (Sweden)

    Ines Hungerbuehler

    2015-06-01

    Full Text Available Background Healthcare providers are continuously challenged to find innovative, cost-effective alternatives and to scale up existing services to meet the growing demand on mental health care delivery. Due to continuous advances in technology, telepsychiatry has become an effective tool for psychiatric care. In 2012, the Institute of Psychiatry of the University of São Paulo Medical School started a randomized clinical trial of home-based telepsychiatric outpatient care via videoconferencing. Objective The objective of this article is to describe the design, methodology and implementation of a pilot project, which aimed to verify the applicability and efficiency of psychiatric attendance via Internet-based videoconferencing in a resource-constrained environment. Methods The project consisted of a 12-month follow-up study with a randomized clinical trial, which compared various quality indicators between home-based telepsychiatric aftercare via videoconferencing and face-to-face aftercare. Results The final sample comprised 107 outpatients (53 in the telepsychiatry group and 54 in the control group). Of 1,227 consultations conducted, 489 were held by videoconferencing. Satisfaction with the aftercare by videoconferencing and the medication delivery was high among patients. Attending psychiatrists were satisfied with the assistance by videoconferencing. Discussion The experiences during this pilot project have overall been very positive and psychiatric outpatient care by videoconferencing seems viable to treat patients even in a resource-constrained environment.

  20. Capture zone delineation methodology based on the maximum concentration: Preventative groundwater well protection areas for heat exchange fluid mixtures

    Science.gov (United States)

    Okkonen, Jarkko; Neupauer, Roseanna M.

    2016-05-01

    Capture zones of water supply wells are most often delineated based on travel times of water or solute to the well, with the assumption that if the travel time is sufficiently large, the concentration of chemical at the well will not exceed the drinking water standards. In many situations, the likely source concentrations or release masses of contamination from the potential sources are unknown; therefore, the exact concentration at the well cannot be determined. In situations in which the source mass can be estimated with some accuracy, the delineation of the capture zone should be based on the maximum chemical concentration that can be expected at the well, rather than on an arbitrary travel time. We present a new capture zone delineation methodology that is based on this maximum chemical concentration. The method delineates capture zones by solving the adjoint of the advection-dispersion-reaction equation and relating the adjoint state and the known release mass to the expected chemical concentration at the well. We demonstrate the use of this method through a case study in which soil heat exchange systems are potential sources of contamination. The heat exchange fluid mixtures contain known fluid volumes and chemical concentrations; thus, in the event of a release, the release mass of the chemical is known. We also demonstrate the use of a concentration basis in quantifying other measures of well vulnerability including exposure time and time to exceed a predefined threshold concentration at the well.
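
    The final step of the methodology can be sketched as follows, assuming the adjoint simulation has already produced, for each grid cell, the peak sensitivity of well concentration to a unit-mass release there; cells whose expected peak well concentration exceeds the drinking-water threshold form the protection area. All numbers below are illustrative.

    ```python
    import numpy as np

    def concentration_based_capture_zone(adjoint_peak, release_mass, c_threshold):
        """adjoint_peak[i, j]: peak sensitivity of well concentration to a unit-mass
        release in cell (i, j), obtained from an adjoint advection-dispersion-reaction
        simulation (units: concentration per unit mass).  A cell belongs to the
        protection area if the expected peak well concentration exceeds c_threshold."""
        c_max = release_mass * adjoint_peak
        return c_max > c_threshold

    # Toy 4x4 grid of adjoint sensitivities (per kg released), hypothetical values.
    adjoint_peak = np.array([[1e-6, 2e-6, 5e-6, 1e-5],
                             [2e-6, 5e-6, 1e-5, 5e-5],
                             [1e-6, 2e-6, 5e-6, 1e-5],
                             [5e-7, 1e-6, 2e-6, 5e-6]])
    release_mass = 10.0          # kg of heat-exchange chemical in the fluid loop
    c_threshold = 1e-4           # drinking-water limit in the same concentration units
    print(concentration_based_capture_zone(adjoint_peak, release_mass, c_threshold))
    ```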

  1. A Methodology for the Management of Estuarine Restoration Plans at the Regional Scale based on Classification Techniques

    Science.gov (United States)

    Castanedo Bárcena, S.; Jiménez Tobío, M.; Medina, R.; Camus, P.

    2014-12-01

    A major challenge for environmental managers is the planning of the restoration of estuarine areas, especially when several estuaries covering a large region are involved. These restoration plans must be designed by finding a balance between two opposing factors: a complete restoration of the area, i.e. the return to its natural conditions, and the maintenance of its current socioeconomic uses. At the regional level this balance can be obtained by choosing to recover some areas and not others. This work introduces a methodology to help decision-makers in the planning phase of restoration plans by providing an objective tool that measures the level of restoration achievable by different actions. The new approach is based on a classification of the areas to be restored according to characteristics that represent the possible hydrodynamic and morphodynamic effects that the restoration action may induce on the rest of the estuary. The four parameters chosen to classify the restoration actions are: (1) the change in tidal prism induced by the restoration of the zone, (2) the distance between the area to be restored and the estuary mouth, (3) the tidal wave phase lag and (4) the flood potential of the area to be restored. The classification procedure combines self-organizing maps (SOM) and the K-means algorithm. This combination allows multi-dimensional maps to be represented easily in two-dimensional plots while obtaining a concise final classification. The methodology was applied to a total of ten estuaries along the entire coast of Cantabria (Northern Spain), totaling 139 potential restoration areas, where a Spanish Ministry of the Environment Recuperation Plan is underway. The methodology classifies the 139 areas of restoration into five clusters. Empirical relationships were used to estimate the effects the restoration of each cluster may induce on the estuary's various morphodynamic elements (cross-sectional area of the estuary mouth, area of tidal flats, volume of

  2. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  3. A novel image reconstruction methodology based on inverse Monte Carlo analysis for positron emission tomography

    Science.gov (United States)

    Kudrolli, Haris A.

    2001-04-01

    A three dimensional (3D) reconstruction procedure for Positron Emission Tomography (PET) based on inverse Monte Carlo analysis is presented. PET is a medical imaging modality which employs a positron emitting radio-tracer to give functional images of an organ's metabolic activity. This makes PET an invaluable tool in the detection of cancer and for in-vivo biochemical measurements. There are a number of analytical and iterative algorithms for image reconstruction of PET data. Analytical algorithms are computationally fast, but the assumptions intrinsic in the line integral model limit their accuracy. Iterative algorithms can apply accurate models for reconstruction and give improvements in image quality, but at an increased computational cost. These algorithms require the explicit calculation of the system response matrix, which may not be easy to calculate. This matrix gives the probability that a photon emitted from a certain source element will be detected in a particular detector line of response. The "Three Dimensional Stochastic Sampling" (SS3D) procedure implements iterative algorithms in a manner that does not require the explicit calculation of the system response matrix. It uses Monte Carlo techniques to simulate the process of photon emission from a source distribution and interaction with the detector. This technique has the advantage of being able to model complex detector systems and also take into account the physics of gamma ray interaction within the source and detector systems, which leads to an accurate image estimate. A series of simulation studies was conducted to validate the method using the Maximum Likelihood - Expectation Maximization (ML-EM) algorithm. The accuracy of the reconstructed images was improved by using an algorithm that required a priori knowledge of the source distribution. Means to reduce the computational time for reconstruction were explored by using parallel processors and algorithms that had faster convergence rates
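
    For reference, the standard ML-EM update used in such reconstructions can be written as below; the notation is generic, and in the SS3D approach the projections involving the system-matrix elements would be realized by Monte Carlo sampling rather than an explicitly stored matrix.

    ```latex
    \lambda_j^{(n+1)} \;=\; \frac{\lambda_j^{(n)}}{\sum_i a_{ij}}
      \sum_i a_{ij}\, \frac{y_i}{\sum_k a_{ik}\, \lambda_k^{(n)}}
    ```

    Here \(\lambda_j\) is the activity estimate in voxel \(j\), \(y_i\) the counts in line of response \(i\), and \(a_{ij}\) the probability that an emission in voxel \(j\) is detected in line \(i\).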

  4. A novel methodology based on contact angle hysteresis approach for surface changes monitoring in model PMMA-Corega Tabs system

    Science.gov (United States)

    Pogorzelski, Stanisław J.; Berezowski, Zdzisław; Rochowski, Paweł; Szurkowski, Janusz

    2012-02-01

    The aim of the paper is to propose a quantitative description of dental surface modifications, resulting from application of Corega and oral cavity liquids, with several surface parameters derived from liquid/solid contact angle measurements. In particular, to predict the long-term effectiveness of denture cleansers in prosthetics, it is necessary to determine surface wettability variations for model dental materials/probe liquid systems related to the contamination effect caused by substances found in the oral cavity. A novel simple low-cost methodology, based on liquid drop contact angle hysteresis CAH approach developed by Chibowski, was adopted to trace solid surface free energy changes in the model PMMA-Corega Tabs interfacial layer. Contact angle and its hysteresis were studied with a sessile drop-inclined plate method in contact with a cleanser (Corega Tabs) and model liquids found in the oral cavity. The apparent solid surface free energy, adsorptive film pressure, work of adhesion and spreading were derived from contact angle hysteresis data for both model solid surfaces (reference) and samples affected by different reactive liquids for a certain time. A time-dependent surface wettability changes of dentures were expressed quantitatively in terms of the corresponding variations of the surface energy parameters which turned out to be unequivocally related to the cleanser exposure time and polarity of the liquids applied to the dental material. The novel methodology appeared to be a useful tool for long term surface characterization of dental materials treated with surfactants-containing liquids capable of forming adhesive layers. The time of optimal use and effectiveness of cleansers are also reflected dynamically in the corresponding variations of the surface wettability parameters. Further studies on a large group of dental surface-probe liquid systems are required to specify the role played by other important factors (liquid polarity, pH and temperature).

  5. Templating gold surfaces with function: a self-assembled dendritic monolayer methodology based on monodisperse polyester scaffolds.

    Science.gov (United States)

    Öberg, Kim; Ropponen, Jarmo; Kelly, Jonathan; Löwenhielm, Peter; Berglin, Mattias; Malkoch, Michael

    2013-01-01

    The antibiotic resistance developed among several pathogenic bacterial strains has spurred interest in understanding bacterial adhesion down to a molecular level. Consequently, analytical methods that rely on bioactive and multivalent sensor surfaces are sought to detect and suppress infections. To deliver functional sensor surfaces with an optimized degree of molecular packaging, we explore a library of compact and monodisperse dendritic scaffolds based on the nontoxic 2,2-bis(methylol)propionic acid (bis-MPA). A self-assembled dendritic monolayer (SADM) methodology to gold surfaces capitalizes on the design of aqueous soluble dendritic structures that bear sulfur-containing core functionalities. The nature of sulfur (either disulfide or thiol), the size of the dendritic framework (generation 1-3), the distance between the sulfur and the dendritic wedge (4 or 14 Å), and the type of functional end group (hydroxyl or mannose) were key structural elements that were identified to affect the packaging densities assembled on the surfaces. Both surface plasmon resonance (SPR) and resonance-enhanced surface impedance (RESI) experiments revealed rapid formation of homogenously covered SADMs on gold surfaces. The array of dendritic structures enabled the fabrication of functional gold surfaces displaying molecular covering densities of 0.33-2.2 molecules·nm⁻² and functional availability of 0.95-5.5 groups·nm⁻². The cell scavenging ability of these sensor surfaces for Escherichia coli MS7fim+ bacteria revealed 2.5 times enhanced recognition for G3-mannosylated surfaces when compared to G3-hydroxylated SADM surfaces. This promising methodology delivers functional gold sensor surfaces and represents a facile route for probing surface interactions between multivalently presented motifs and cells in a controlled surface setting.

  6. The burden of headache disorders in India: methodology and questionnaire validation for a community-based survey in Karnataka State.

    Science.gov (United States)

    Rao, Girish N; Kulkarni, Girish B; Gururaj, Gopalkrishna; Rajesh, Kavita; Subbakrishna, D Kumaraswamy; Steiner, Timothy J; Stovner, Lars J

    2012-10-01

    Primary headache disorders are a major public-health problem globally and, possibly more so, in low- and middle-income countries. No methodologically sound studies of prevalence and burden of headache in the adult Indian population have been published previously. The present study was a door-to-door cold-calling survey in urban and rural areas in and around Bangalore, Karnataka State. From 2,714 households contacted, 2,514 biologically unrelated individuals were eligible for the survey and 2,329 (92.9 %) participated (1,103 [48 %] rural; 1,226 [52 %] urban; 1,141 [49 %] male; 1,188 [51 %] female; mean age 38.0 years). The focus was on primary headache (migraine and tension-type headache [TTH]) and medication-overuse headache. A structured questionnaire administered by trained lay interviewers was the instrument both for diagnosis (algorithmically determined from responses) and burden estimation. The screening question enquired into headache in the last year. The validation study compared questionnaire-based diagnoses with those obtained soon after through personal interview by a neurologist in a random sub-sample of participants (n = 381; 16 %). It showed high values (> 80 %) for sensitivity, specificity and predictive values for any headache, and for specificity and negative predictive value for migraine and TTH. Kappa values for diagnostic agreement were good for any headache (0.69 [95 % CI 0.61-0.76]), moderate (0.46 [0.35-0.56]) for migraine and fair (0.39 [0.29-0.49]) for TTH. The survey methodology, including identification of and access to participants, proved feasible. The questionnaire proved effective in the survey population. The study will give reliable estimates of the prevalence and burden of headache, and of migraine and TTH specifically, in urban and rural Karnataka.
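
    The agreement statistics reported above (sensitivity, specificity, predictive values and Cohen's kappa) all derive from a 2x2 table comparing questionnaire-based diagnoses with the neurologist's diagnoses. A minimal sketch with hypothetical counts (not the study's data) is given below.

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and Cohen's kappa from a
    2x2 table (questionnaire vs. neurologist diagnosis)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    observed = (tp + tn) / n      # observed agreement
    # chance agreement from the marginal totals
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, ppv, npv, kappa

# Hypothetical counts for illustration only:
print(diagnostic_agreement(tp=150, fp=20, fn=25, tn=186))
```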

  7. Percentage of Hypothetical Well Pumpage Causing Depletions to Simulated Base Flow, Evapotranspiration, and Groundwater Storage in the Elkhorn and Loup River Basins, 2006 through 2055

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data release includes a polygon shapefile of grid cells attributed with values representing the simulated base-flow, evapotranspiration, and groundwater-storage...

  8. Mesoscale modelling methodology based on nudging to increase accuracy in WRA

    Science.gov (United States)

    Mylonas Dirdiris, Markos; Barbouchi, Sami; Hermmann, Hugo

    2016-04-01

    applying observational nudging will be validated against the higher levels of the FINO1 met mast using statistical verification metrics such as root mean square error (RMSE), standard deviation of the mean error (ME Std), mean error (bias) and Pearson correlation coefficient (R). The same process will be followed for different atmospheric stratification regimes in order to evaluate the sensitivity of the method to atmospheric stability. Finally, since wind speed does not have an equally distributed impact on power yield, the uncertainty will be quantified in two ways: as a global uncertainty and as an uncertainty per wind speed bin based on a wind turbine power curve, in order to evaluate WRF for the purposes of wind power generation. Conclusion: This study shows the higher accuracy of the WRF model after nudging observational data. In a next step these results will be compared with traditional vertical extrapolation methods such as the power and log laws. The larger goal of this work is to nudge observations from a short offshore met mast so that WRF can accurately reconstruct the entire wind profile of the atmosphere up to hub height, an important step toward reducing the cost of offshore WRA. Learning objectives: 1. The audience will get a clear view of the added value of observational nudging; 2. An interesting way to calculate WRF uncertainty will be described, linking wind speed uncertainty to energy uncertainty.
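
    A minimal sketch of the verification metrics and of the per-bin, power-curve-weighted uncertainty idea described above is given below; the toy data and toy power curve are assumptions for illustration, not values from the FINO1 campaign.

```python
import numpy as np

def verification_metrics(model, obs):
    """Standard WRA verification statistics between modelled and observed
    wind speeds: bias (mean error), std of the error, RMSE and Pearson R."""
    err = np.asarray(model) - np.asarray(obs)
    bias = err.mean()
    me_std = err.std(ddof=1)
    rmse = np.sqrt((err ** 2).mean())
    r = np.corrcoef(model, obs)[0, 1]
    return bias, me_std, rmse, r

def per_bin_energy_weighted_error(model, obs, power_curve, bins):
    """Bin the error by observed wind speed and weight each bin's RMSE with a
    turbine power curve, so speeds that matter most for yield dominate."""
    model, obs = np.asarray(model), np.asarray(obs)
    results = {}
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (obs >= lo) & (obs < hi)
        if mask.any():
            rmse = np.sqrt(((model[mask] - obs[mask]) ** 2).mean())
            centre = 0.5 * (lo + hi)
            results[(lo, hi)] = rmse * power_curve(centre)
    return results

def toy_power_curve(v):
    """Very rough cubic power curve in kW, capped at rated power (toy only)."""
    return min(max(v - 3.0, 0.0) ** 3 * 5.0, 2000.0)

# Toy wind speeds (m/s) for illustration only:
model = [7.1, 8.4, 9.9, 11.2, 6.3]
obs = [6.8, 8.9, 9.5, 11.8, 6.0]
print(verification_metrics(model, obs))
print(per_bin_energy_weighted_error(model, obs, toy_power_curve, bins=[4, 8, 12]))
```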

  9. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund;

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which... the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation... pattern and gain of the entire antenna, including support and satellite structure, with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  10. Exploiting Behaviorist and Communicative Action-Based Methodologies in CALL Applications for the Teaching of Pronunciation in French as a Foreign Language

    Science.gov (United States)

    Burston, Jack; Georgiadou, Olga; Monville-Burston, Monique

    2016-01-01

    This article describes the use of instructional technology to promote, through a combination of behaviorist/structuralist and communicative/action-based methodologies, correct pronunciation of French as a Foreign Language (FFL) to native Greek-speaking false-beginner-level learners. It is based on the analysis of a teaching programme, complemented…

  11. The Case of Ozone Depletion

    Science.gov (United States)

    Lambright, W. Henry

    2005-01-01

    While the National Aeronautics and Space Administration (NASA) is widely perceived as a space agency, since its inception NASA has had a mission dedicated to the home planet. Initially, this mission involved using space to better observe and predict weather and to enable worldwide communication. Meteorological and communication satellites showed the value of space for earthly endeavors in the 1960s. In 1972, NASA launched Landsat, and the era of earth-resource monitoring began. At the same time, in the late 1960s and early 1970s, the environmental movement swept throughout the United States and most industrialized countries. The first Earth Day event took place in 1970, and the government generally began to pay much more attention to issues of environmental quality. Mitigating pollution became an overriding objective for many agencies. NASA's existing mission to observe planet Earth was augmented in these years and directed more toward environmental quality. In the 1980s, NASA sought to plan and establish a new environmental effort that eventuated in the 1990s with the Earth Observing System (EOS). The Agency was able to make its initial mark via atmospheric monitoring, specifically ozone depletion. An important policy stimulus in many respects, ozone depletion spawned the Montreal Protocol of 1987 (the most significant international environmental treaty then in existence). It also was an issue critical to NASA's history that served as a bridge linking NASA's weather and land-resource satellites to NASA's concern for the global changes affecting the home planet. Significantly, as a global environmental problem, ozone depletion underscored the importance of NASA's ability to observe Earth from space. Moreover, the NASA management team's ability to apply large-scale research efforts and mobilize the talents of other agencies and the private sector illuminated its role as a lead agency capable of crossing organizational boundaries as well as the science-policy divide.

  12. ECO-ENVIRONMENTAL ASSESSMENT AND ANALYSIS OF TONGLVSHAN MINING AREA IN DAYE CITY, HUBEI PROVINCE BASED ON SPATIOTEMPORAL METHODOLOGY

    Directory of Open Access Journals (Sweden)

    X. M. Zhang

    2015-07-01

    Mine exploitation has a significant impact on the ecological environment of the surroundings. To analyze the impact of the Tonglvshan mining area on its surroundings, this paper adopts a spatiotemporal methodology based on the extracted Eco-environmental Quality Index (EQI) to analyze the extent and degree of the effect. The spatiotemporal methodology operates on two scales: buffers and administrative units. The EQI comprises the Biological Abundance Index (BAI), Vegetation Index (VI), Water Network Density Index (WNDI), and Land Degradation Index (LDI). The weight of each index was determined by the analytic hierarchy process (AHP) and expert scoring. The calculation of the EQI followed the standard "Technical criterion for Eco-environment Status Evaluation" (HJ/T192-2006) and the "Standards for Classification and Gradation of Soil Erosion" (SL 190-96). Because it accounts for ecological and environmental characteristics relevant to China, this method has been widely used to study the environmental status of specific regions in China. The buffer-based assessment used radii of 300 m, 500 m, 700 m, 1000 m, 1500 m, 2000 m, 2500 m, 3000 m, 3500 m, and 4000 m around 3 typical mines. The calculated results indicate that the EQI increases with radius, with the rate of increase diminishing until the EQI stabilizes; that is, the effect of a mine weakens as the distance from the mine increases and is diminished when the distance is far enough. The analysis of the 3 typical mines shows that the extent and degree of a mine's effect relate not only to the area of the mine, but also to the type of mineral resource, the status of mining, and the ecological restoration. The assessment was also carried out by calculating the EQI in 14 administrative units of Daye city in 2000, 2005, and 2010. The study shows that the EQI decreased in all 14 units from 2000 to 2010. The spatiotemporal
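
    As an illustration of how a composite index of this kind can be assembled, the sketch below combines the four sub-indices into a single EQI using placeholder weights; the study derives its weights by AHP and expert scoring and follows HJ/T192-2006, so the weights and input values here are assumptions only.

```python
def eco_environmental_quality_index(bai, vi, wndi, ldi, weights=None):
    """Composite EQI as a weighted combination of the four sub-indices named in
    the abstract. Weights are placeholders, not those of the study or of
    HJ/T192-2006; LDI measures degradation, so it enters with a negative sense."""
    if weights is None:
        weights = {"BAI": 0.35, "VI": 0.30, "WNDI": 0.20, "LDI": 0.15}
    return (weights["BAI"] * bai
            + weights["VI"] * vi
            + weights["WNDI"] * wndi
            + weights["LDI"] * (100.0 - ldi))

# Hypothetical sub-index values on a 0-100 scale:
print(eco_environmental_quality_index(bai=62.0, vi=70.5, wndi=40.2, ldi=18.3))
```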

  13. ARC-VM: An architecture real options complexity-based valuation methodology for military systems-of-systems acquisitions

    Science.gov (United States)

    Domercant, Jean Charles

    The combination of today's national security environment and mandated acquisition policies makes it necessary for military systems to interoperate with each other to greater degrees. This growing interdependency results in complex Systems-of-Systems (SoS) that only continue to grow in complexity to meet evolving capability needs. Thus, timely and affordable acquisition becomes more difficult, especially in the face of mounting budgetary pressures. To counter this, architecting principles must be applied to SoS design. The research objective is to develop an Architecture Real Options Complexity-Based Valuation Methodology (ARC-VM) suitable for acquisition-level decision making, where there is a stated desire for more informed tradeoffs between cost, schedule, and performance during the early phases of design. First, a framework is introduced to measure architecture complexity as it directly relates to military SoS. Development of the framework draws upon a diverse set of disciplines, including Complexity Science, software architecting, measurement theory, and utility theory. Next, a Real Options based valuation strategy is developed using techniques established for financial stock options that have recently been adapted for use in business and engineering decisions. The derived complexity measure provides architects with an objective measure of complexity that focuses on relevant complex system attributes. These attributes are related to the organization and distribution of SoS functionality and the sharing and processing of resources. The use of Real Options provides the necessary conceptual and visual framework to quantifiably and traceably combine measured architecture complexity, time-valued performance levels, as well as programmatic risks and uncertainties. An example suppression of enemy air defenses (SEAD) capability demonstrates the development and usefulness of the resulting architecture complexity & Real Options based valuation methodology. Different
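
    The valuation strategy described above adapts techniques established for financial stock options. As a generic illustration of Real Options pricing (not the ARC-VM formulation itself), the sketch below values a simple option to invest with the standard Cox-Ross-Rubinstein binomial lattice; all inputs are hypothetical.

```python
import math

def crr_real_option_value(v0, strike, sigma, t, r, steps=200):
    """Cox-Ross-Rubinstein binomial valuation of a call-like option to invest:
    present value of the underlying v0, exercise (investment) cost 'strike',
    volatility sigma, time to maturity t (years), risk-free rate r."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))     # up factor
    d = 1.0 / u                             # down factor
    p = (math.exp(r * dt) - d) / (u - d)    # risk-neutral up probability
    disc = math.exp(-r * dt)
    # payoffs at maturity, indexed by the number of up-moves j
    values = [max(v0 * u ** j * d ** (steps - j) - strike, 0.0)
              for j in range(steps + 1)]
    # backward induction through the lattice
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Hypothetical inputs for illustration only:
print(crr_real_option_value(v0=100.0, strike=90.0, sigma=0.35, t=2.0, r=0.03))
```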

  14. A Systematic Methodology for Uncertainty Analysis of Group Contribution Based and Atom Connectivity Index Based Models for Estimation of Properties of Pure Components

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Sin, Gürkan

    One of the most widely employed group contribution methods for estimation of properties of pure components is the Marrero and Gani (MG) method. For a given component whose molecular structure is not completely described by any of the available groups, the group contribution+ method (combined MG method... and atomic connectivity index method) has been employed to create the missing groups and predict their contributions through the regressed contributions of connectivity indices. The objective of this work is to develop a systematic methodology to carry out uncertainty analysis of group contribution based... and atom connectivity index based property prediction models. This includes: (i) parameter estimation using available MG based property prediction models and large training sets to determine improved group and atom contributions; and (ii) uncertainty analysis to establish statistical information...
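
    A minimal sketch of the two steps named above, parameter estimation followed by covariance-based uncertainty analysis, is given below for a linear group-contribution model; the synthetic group-occurrence matrix and property values are assumptions for illustration, not the MG training data.

```python
import numpy as np
from scipy import stats

def fit_group_contributions(X, y, alpha=0.05):
    """Least-squares estimation of group/atom contributions theta for a linear
    model y = X theta, with covariance-based confidence intervals on theta."""
    theta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    n, p = X.shape
    dof = n - p
    residuals = y - X @ theta
    s2 = residuals @ residuals / dof              # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)             # parameter covariance
    t_val = stats.t.ppf(1 - alpha / 2, dof)
    ci = t_val * np.sqrt(np.diag(cov))            # +/- half-widths of the CIs
    return theta, cov, ci

def predict_with_uncertainty(x_new, theta, cov):
    """Property prediction and the standard error propagated from the
    parameter covariance for a new group-occurrence vector x_new."""
    y_hat = x_new @ theta
    se = np.sqrt(x_new @ cov @ x_new)
    return y_hat, se

# Toy example with synthetic data: 30 "components", 3 "groups"
rng = np.random.default_rng(1)
X = rng.integers(0, 4, size=(30, 3)).astype(float)
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, 30)
theta, cov, ci = fit_group_contributions(X, y)
print(theta, ci)
print(predict_with_uncertainty(np.array([1.0, 2.0, 0.0]), theta, cov))
```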

  15. Mindfulness-based intervention for prodromal sleep disturbances in older adults: design and methodology of a randomized controlled trial.

    Science.gov (United States)

    Black, David S; O'Reilly, Gillian A; Olmstead, Richard; Breen, Elizabeth C; Irwin, Michael R

    2014-09-01

    Sleep problems are prevalent among older adults, often persist untreated, and are predictive of health detriments. Given the limitations of conventional treatments, non-pharmacological treatments such as mindfulness-based interventions (MBIs) are gaining popularity for sleep ailments. However, nothing is yet known about the impact of MBIs on sleep in older adults with prodromal sleep disturbances. This article details the design and methodology of a 6-week parallel-group RCT calibrated to test the treatment effect of the Mindful Awareness Practices (MAPs) program versus sleep hygiene education for improving sleep quality, as the main outcome, in older adults with prodromal sleep disturbances. Older adults with current sleep disturbances will be recruited from the urban Los Angeles community. Participants will be randomized into two standardized treatment conditions, MAPs and sleep hygiene education. Each condition will consist of weekly 2-hour group-based classes over the course of the 6-week intervention. The primary objective of this study is to determine if mindfulness meditation practice as engaged through the MAPs program leads to improved sleep quality relative to sleep hygiene education in older adults with prodromal sleep disturbances.

  16. Methodology and a preliminary data base for examining the health risks of electricity generation from uranium and coal fuels

    Energy Technology Data Exchange (ETDEWEB)

    El-Bassioni, A.A.

    1980-08-01

    An analytical model was developed to assess and examine the health effects associated with the production of electricity from uranium and coal fuels. The model is based on a systematic methodology that is both simple and easy to check, and provides details about the various components of health risk. A preliminary set of data that is needed to calculate the health risks was gathered, normalized to the model facilities, and presented in a concise manner. Additional data will become available as a result of other evaluations of both fuel cycles, and they should be included in the data base. An iterative approach involving only a few steps is recommended for validating the model. After each validation step, the model is improved in the areas where new information or increased interest justifies such upgrading. Sensitivity analysis is proposed as the best method of using the model to its full potential. Detailed quantification of the risks associated with the two fuel cycles is not presented in this report. The evaluation of risks from producing electricity by these two methods can be completed only after several steps that address difficult social and technical questions. Preliminary quantitative assessment showed that several factors not considered in detail in previous studies are potentially important. 255 refs., 21 figs., 179 tabs.

  17. Response Surface Methodology for the Optimization of Preparation of Biocomposites Based on Poly(lactic acid) and Durian Peel Cellulose

    Directory of Open Access Journals (Sweden)

    Patpen Penjumras

    2015-01-01

    Response surface methodology was used to optimize the preparation of biocomposites based on poly(lactic acid) and durian peel cellulose. The effects of cellulose loading, mixing temperature, and mixing time on tensile strength and impact strength were investigated. A central composite design was employed to determine the optimum preparation condition of the biocomposites to obtain the highest tensile strength and impact strength. A second-order polynomial model was developed for predicting the tensile strength and impact strength based on the composite design. It was found that the composites were best fitted by a quadratic regression model with a high coefficient of determination (R²). The selected optimum condition was 35 wt.% cellulose loading at 165°C and 15 min of mixing, leading to a desirability of 94.6%. Under the optimum condition, the tensile strength and impact strength of the biocomposites were 46.207 MPa and 2.931 kJ/m2, respectively.
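
    As an illustration of the workflow described above, the sketch below fits a second-order polynomial to synthetic central-composite-style data for the three factors and locates the settings that maximise the predicted tensile strength; the data, factor bounds and response surface are assumptions for illustration, not the study's measurements.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_design_matrix(X):
    """Second-order RSM terms for three factors: intercept, linear terms,
    two-way interactions and squared terms."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

def fit_and_optimise(X, y, bounds):
    """Fit the quadratic model by least squares, then search within the design
    bounds for the factor settings that maximise the predicted response."""
    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

    def neg_response(x):
        return -(quadratic_design_matrix(x.reshape(1, -1)) @ beta)[0]

    x0 = np.array([(lo + hi) / 2 for lo, hi in bounds])
    result = minimize(neg_response, x0, bounds=bounds)
    return beta, result.x, -result.fun

# Synthetic runs: cellulose loading (wt.%), temperature (degC), time (min);
# the "true" optimum is placed near (35, 165, 15) purely for illustration.
rng = np.random.default_rng(0)
X = rng.uniform([25, 150, 5], [45, 180, 25], size=(20, 3))
y = (46.0 - 0.02 * (X[:, 0] - 35) ** 2
          - 0.01 * (X[:, 1] - 165) ** 2
          - 0.05 * (X[:, 2] - 15) ** 2)
beta, x_opt, y_opt = fit_and_optimise(X, y, bounds=[(25, 45), (150, 180), (5, 25)])
print(x_opt, y_opt)
```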

  18. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much-needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations... are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest...

  19. Case-Based Learning as Pedagogy for Teaching Information Ethics Based on the Dervin Sense-Making Methodology

    Science.gov (United States)

    Dow, Mirah J.; Boettcher, Carrie A.; Diego, Juana F.; Karch, Marziah E.; Todd-Diaz, Ashley; Woods, Kristine M.

    2015-01-01

    The purpose of this mixed-methods study is to determine the effectiveness of case-based pedagogy in teaching basic principles of information ethics and ethical decision making. The study reports results of pre- and post-assessments completed by 49 library and information science (LIS) graduate students at a Midwestern university. Using Creswell's…

  20. "When the going gets tough, who keeps going?" : Depletion sensitivity moderates the ego-depletion effect

    NARCIS (Netherlands)

    Salmon, Stefanie J.; Adriaanse, Marieke A.; De Vet, Emely; Fennis, Bob M.; De Ridder, Denise T. D.

    2014-01-01

    Self-control relies on a limited resource that can get depleted, a phenomenon that has been labeled ego-depletion. We argue that individuals may differ in their sensitivity to depleting tasks, and that consequently some people deplete their self-control resource at a faster rate than others. In thre