WorldWideScience

Sample records for grounded case-based methodology

  1. GROUNDED THEORY METHODOLOGY and GROUNDED THEORY RESEARCH in TURKEY

    OpenAIRE

    ARIK, Ferhat; ARIK, Işıl Avşar

    2016-01-01

This research discusses the historical development of the Grounded Theory Methodology, which is one of the qualitative research methods, its transformation over time, and how it is used as a methodology in Turkey. Grounded Theory, which was founded by Glaser and Strauss, is a qualitative methodology based on inductive logic to discover theories, in contrast with the deductive approach in sociology, which is based on testing an existing theory. It is possible to examine the Grounded Theory...

  2. An ontological case base engineering methodology for diabetes management.

    Science.gov (United States)

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

Ontology engineering covers issues related to ontology development and use. In a Case-Based Reasoning (CBR) system, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It focuses mainly on case representation in the form of an ontology to support semantic case retrieval and enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  3. Essential methodological considerations when using grounded theory.

    Science.gov (United States)

    Achora, Susan; Matua, Gerald Amandu

    2016-07-01

    To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.

  4. A Java-Web-Based-Learning Methodology, Case Study ...

    African Journals Online (AJOL)

    A Java-Web-Based-Learning Methodology, Case Study : Waterborne diseases. The recent advances in web technologies have opened new opportunities for computer-based-education. One can learn independently of time and place constraints, and have instantaneous access to relevant updated material at minimal cost.

  5. Grounded Theory Methodology: Positivism, Hermeneutics, and Pragmatism

    Science.gov (United States)

    Age, Lars-Johan

    2011-01-01

    Glaserian grounded theory methodology, which has been widely adopted as a scientific methodology in recent decades, has been variously characterised as "hermeneutic" and "positivist." This commentary therefore takes a different approach to characterising grounded theory by undertaking a comprehensive analysis of: (a) the philosophical paradigms of…

  6. Case Study Research Methodology in Nursing Research.

    Science.gov (United States)

    Cope, Diane G

    2015-11-01

    Through data collection methods using a holistic approach that focuses on variables in a natural setting, qualitative research methods seek to understand participants' perceptions and interpretations. Common qualitative research methods include ethnography, phenomenology, grounded theory, and historic research. Another type of methodology that has a similar qualitative approach is case study research, which seeks to understand a phenomenon or case from multiple perspectives within a given real-world context.

  7. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Beyond that, the industry is characterized by high flexibility, strong competition, and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  8. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  9. Surface Signature Characterization at SPE through Ground-Proximal Methods: Methodology Change and Technical Justification

    Energy Technology Data Exchange (ETDEWEB)

    Schultz-Fellenz, Emily S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-09

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  10. Grounded theory methodology - has it become a movement?

    OpenAIRE

    Berterö, Carina

    2012-01-01

    There is an ongoing debate regarding the nature of grounded theory, and an examination of many studies claiming to follow grounded theory indicates a wide range of approaches. In 1967 Glaser and Strauss’s ‘‘The Discovery of Grounded Theory; Strategies for Qualitative Research’’ was published and represented a breakthrough in qualitative research; it offered methodological consensus and systematic strategies for qualitative research practice. The defining characteristics of grounded theory inc...

  11. Selection of Grounded Theory as an Appropriate Research Methodology for a Dissertation: One Student’s Perspective

    Directory of Open Access Journals (Sweden)

    James W. Jones, Ed.D.

    2009-06-01

Full Text Available Doctoral students wanting to use grounded theory as a methodological approach for their dissertation often face multiple challenges in gaining acceptance of their approach by their committee. This paper presents the case that the author used to overcome these challenges through the process of eliminating other methodologies, leaving grounded theory as the preferred method for the desired research issue. By examining the approach used successfully by the author, other doctoral students will be able to frame similar arguments justifying the use of grounded theory in their dissertations and see the use of the method continue to spread into new fields and applications. This paper examines the case built for selecting grounded theory as a defensible dissertation approach. The basic research issue that I wanted to investigate was how practitioners in an applied field sought information in their work; in other words, how they researched. I further narrowed the investigation down to a more specific field, but the paper presented here is left in broader form so that other students can see the approach in more general terms.

  12. Methodological Grounds of Managing Innovation Development of Restaurants

    OpenAIRE

    Naidiuk V. S.

    2013-01-01

The goal of the article lies in the identification and further development of the methodological grounds of managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the "managing innovation development of an enterprise" notion, the article clarifies this definition. As a result of the study, the article builds up a cause-effect diagram of the solution of the problem of ensuring efficien...

  13. [The grounded theory as a methodological alternative for nursing research].

    Science.gov (United States)

    dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam

    2002-01-01

This study presents a method of interpretative and systematic research applicable to the development of studies in nursing, called "the grounded theory", whose theoretical support is symbolic interactionism. The purpose of the paper is to describe the grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method and the process of analysis of the data. We conclude that the systematization of data and its interpretation, based on social actors' experience, constitute strong subsidies to generate theories through this research tool.

  14. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  15. Are There Two Methods of Grounded Theory? Demystifying the Methodological Debate

    Directory of Open Access Journals (Sweden)

    Cheri Ann Hernandez, RN, Ph.D., CDE

    2008-06-01

Full Text Available Grounded theory is an inductive research method for the generation of substantive or formal theory, using qualitative or quantitative data generated from research interviews, observation, or written sources, or some combination thereof (Glaser & Strauss, 1967). In recent years there has been much controversy over the etiology of its discovery, as well as the exact way in which grounded theory research is to be operationalized. Unfortunately, this situation has resulted in much confusion, particularly among novice researchers who wish to utilize this research method. In this article, the historical, methodological and philosophical roots of grounded theory are delineated in a beginning effort to demystify this methodological debate. Grounded theory variants such as feminist grounded theory (Wuest, 1995) or constructivist grounded theory (Charmaz, 1990) are beyond the scope of this discussion.

  16. Grounded theory: a methodological spiral from positivism to postmodernism.

    Science.gov (United States)

    Mills, Jane; Chapman, Ysanne; Bonner, Ann; Francis, Karen

    2007-04-01

Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Our example of using this package of situational and social world mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice.

  17. A case study for INPRO methodology based on Indian advanced heavy water reactor

    International Nuclear Information System (INIS)

    Anantharaman, K.; Saha, D.; Sinha, R.K.

    2004-01-01

Under Phase 1A of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), a methodology (the INPRO methodology) has been developed which can be used to evaluate a given energy system, or a component of such a system, on a national and/or global basis. The INPRO study can be used for assessing the potential of an innovative reactor in terms of economics, sustainability and environment, safety, waste management, proliferation resistance and cross-cutting issues. India, a participant in the INPRO program, is engaged in a case study applying the INPRO methodology to the Advanced Heavy Water Reactor (AHWR). The AHWR is a 300 MWe, boiling light water cooled, heavy water moderated, vertical pressure tube type reactor. Thorium utilization is essential for the Indian nuclear power program, considering indigenous resource availability. The AHWR is designed to produce most of its power from thorium, aided by a small input of plutonium-based fuel. The features of the AHWR are described in the paper. The case study covers the fuel cycle to be followed in the near future for the AHWR. The paper deals with initial observations of the case study with regard to fuel cycle issues. (authors)

  18. Where Are the Grounds for Grounded Theory? A Troubled Empirical Methodology Meets Wittgenstein

    Science.gov (United States)

    James, Fiona

    2018-01-01

    This article provides a critical exposition of the epistemological underpinnings of a recent redevelopment of Grounded Theory (GT) methodology, "Constructivist" GT. Although proffered as freed from the "objectivist" tenets of the original version, critical examination exposes the essentialism threaded through its integral…

  19. Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences.

    Science.gov (United States)

    DeForge, Ryan; Shaw, Jay

    2012-03-01

As two doctoral candidates in a health and rehabilitation sciences program, we describe in this paper our respective paradigmatic locations along a quite nonlinear ontological-epistemological-axiological-methodological chain. In a turn-taking fashion, we unpack the tenets of critical realism and pragmatism, and then trace the linkages from these paradigmatic locations through to the methodological choices that address a community-based research problem. Beyond serving as an answer to calls for academics in training to demonstrate philosophical-theoretical-methodological integrity and coherence in their scholarship, this paper represents critical realism and its fore-grounding of a deeply stratified ontology in reflexive relation to pragmatism and its back-grounding of ontology. We conclude by considering the merits and challenges of conducting research from within singular versus proliferate paradigmatic perspectives. © 2011 Blackwell Publishing Ltd.

  20. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises drill core data collection, karst cave stochastic model generation, SLIDE simulation and bisection-method optimization. Borehole investigations are performed, and the statistical results show that the length of the karst caves fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the lengths of the karst caves and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst caves on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
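The two sampling techniques named in this abstract can be sketched briefly. This is not the paper's KCSMG code, only a minimal Python illustration under stated assumptions: inverse transform sampling for the negative-exponential cave lengths, and acceptance-rejection sampling for a bounded pdf with no convenient closed-form inverse (the triangular pdf in the example below is purely a placeholder).

```python
import math
import random

def sample_karst_cave_length(mean_length, rng=random):
    """Inverse transform sampling from a negative exponential distribution.

    The exponential CDF is F(x) = 1 - exp(-x / mean_length); inverting F
    at a uniform random u in [0, 1) gives x = -mean_length * ln(1 - u).
    """
    u = rng.random()
    return -mean_length * math.log(1.0 - u)

def sample_acceptance_rejection(pdf, x_max, pdf_max, rng=random):
    """Acceptance-rejection sampling from a bounded pdf on [0, x_max].

    Draws a candidate x uniformly, accepts it with probability
    pdf(x) / pdf_max, and repeats until a candidate is accepted.
    """
    while True:
        x = rng.uniform(0.0, x_max)
        if rng.uniform(0.0, pdf_max) <= pdf(x):
            return x
```

For example, sampling the triangular pdf p(x) = 2x on [0, 1] with `sample_acceptance_rejection(lambda x: 2.0 * x, 1.0, 2.0)` yields values whose sample mean approaches 2/3.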

  1. Monitoring Strategies of Earth Dams by Ground-Based Radar Interferometry: How to Extract Useful Information for Seismic Risk Assessment.

    Science.gov (United States)

    Di Pasquale, Andrea; Nico, Giovanni; Pitullo, Alfredo; Prezioso, Giuseppina

    2018-01-16

The aim of this paper is to describe how ground-based radar interferometry can provide displacement measurements of earth dam surfaces and of the vibration frequencies of their main concrete structures. Many dams were built decades ago and were not equipped with in situ sensors embedded in the structure at the time of construction. Earth dams have scattering properties similar to landslides, for which the Ground-Based Synthetic Aperture Radar (GBSAR) technique has so far been extensively applied to study ground displacements. In this work, SAR and Real Aperture Radar (RAR) configurations are used for the measurement of earth dam surface displacements and of the vibration frequencies of concrete structures, respectively. A methodology for the acquisition of SAR data and the rendering of results is described. The geometrical correction factor, needed to transform the Line-of-Sight (LoS) displacement measurements of GBSAR into an estimate of the horizontal displacement vector of the dam surface, is derived. Furthermore, a methodology for the acquisition of RAR data and the representation of displacement temporal profiles and vibration frequency spectra of dam concrete structures is presented. For this study a Ku-band ground-based radar, equipped with horn antennas having different radiation patterns, has been used. Four case studies, using different radar acquisition strategies specifically developed for the monitoring of earth dams, are examined. The results of this work show the information that a Ku-band ground-based radar can provide to structural engineers for a non-destructive seismic assessment of earth dams.
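In the simplest case, the geometrical correction factor mentioned in this abstract reduces to a projection. The sketch below is a hypothetical illustration, not the paper's derivation: it assumes the dam surface moves purely horizontally along a known direction, so the LoS measurement equals the horizontal displacement scaled by the cosine of the angle between the LoS and that direction.

```python
import math

def los_to_horizontal(d_los, angle_deg):
    """Convert a Line-of-Sight (LoS) displacement into a horizontal displacement.

    Assumes the true displacement vector is horizontal and its direction is
    known, so that d_los = d_h * cos(angle), where angle is the angle between
    the radar LoS and the horizontal displacement direction.
    """
    c = math.cos(math.radians(angle_deg))
    if abs(c) < 1e-3:
        # LoS nearly perpendicular to the motion: the projection carries
        # almost no information and the correction factor blows up.
        raise ValueError("LoS nearly orthogonal to displacement direction")
    return d_los / c
```

For example, a 5 mm LoS displacement measured at 60 degrees to the motion direction corresponds to a 10 mm horizontal displacement.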

  2. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  3. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data.

    Science.gov (United States)

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-17

Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote-sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.

  5. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. The use of the grounded theory method and of PO as a data collection tool is discussed. The following methodological tools are examined: an observational protocol, the jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has a specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods, yet it is not commonly discussed in nursing research. Therefore, this paper can provide a useful tool for those who intend to use PO and grounded theory in their nursing research.

  6. Agile methodology selection criteria: IT start-up case study

    Science.gov (United States)

    Micic, Lj

    2017-05-01

Project management in modern IT companies is often based on agile methodologies, which have several advantages compared to traditional methodologies such as waterfall. Bearing in mind that clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology it is going to implement and whether it will be based mostly on one methodology or on a combination of several. There are several modern and often used methodologies, but among those Scrum, Kanban and Extreme Programming (XP) are usually the most common. Sometimes companies mostly use the tools and procedures of one, but quite often they use some combination of these methodologies. Since these methodologies are just a framework, they allow companies to adapt them to their specific projects as well as to other limitations. Agile methodologies are in limited usage in Bosnia, but more and more IT companies are starting to use them, because this is common practice for their clients abroad and is also becoming the only option for delivering a quality product on time. However, it is always challenging to decide which methodology, or combination of several, a company should implement, and how to connect it to its own projects, organizational framework and HR management. This paper presents a case study based on a local IT start-up and delivers a solution based on a theoretical framework and the practical limitations of the case company.

  7. Ground-truth aerosol lidar observations: can the Klett solutions obtained from ground and space be equal for the same aerosol case?

    International Nuclear Information System (INIS)

    Ansmann, Albert

    2006-01-01

Upcoming multiyear satellite lidar aerosol observations need strong support from a worldwide ground-truth lidar network. In this context the question arises as to whether the ground stations can deliver the same results as obtained from space when the Klett formalism is applied to elastic backscatter lidar data for the same aerosol case. This question is investigated based on simulations of observed cases of simple and complex aerosol layering. The results show that the differences between spaceborne and ground-based observations can be as large as 20% for the backscatter and extinction coefficients and the optimum estimates of the column lidar ratios. In cases with complex aerosol layering, the application of the two-layer approach can lead to similar results (space, ground) and accurate products provided that horizontally homogeneous aerosol conditions are given
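For readers unfamiliar with the Klett formalism referenced above, a minimal numerical sketch follows. It implements the standard backward (far-end) Klett solution under a constant lidar ratio; the signal model and boundary value in the sanity check are illustrative assumptions, not data from the paper.

```python
import numpy as np

def klett_backward(r, P, sigma_ref):
    """Backward (far-end) Klett inversion of an elastic backscatter lidar signal.

    r         : 1-D array of ranges (m), strictly increasing
    P         : measured signal at each range
    sigma_ref : boundary-value extinction (1/m) assumed at the far end r[-1]

    Assumes a constant extinction-to-backscatter (lidar) ratio, so that
    sigma(r) = exp(S - S_m) / (1/sigma_ref + 2 * integral_r^rm exp(S - S_m) dr'),
    where S = ln(P * r^2) is the range-corrected log signal and r_m = r[-1].
    """
    S = np.log(P * r**2)
    e = np.exp(S - S[-1])
    # Reversed cumulative trapezoid: I[i] = integral of e from r[i] to r[-1]
    seg = 0.5 * (e[:-1] + e[1:]) * np.diff(r)
    I = np.concatenate([np.cumsum(seg[::-1])[::-1], [0.0]])
    return e / (1.0 / sigma_ref + 2.0 * I)
```

A convenient sanity check: for a synthetic homogeneous atmosphere with P(r) proportional to exp(-2*sigma0*r)/r^2, the inversion returns sigma0 at every range bin.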

  8. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  9. Methodology for evaluating the grounding system in electrical substations; Metodologia para la evaluacion del sistema de puesta a tierra en subestaciones electricas

    Energy Technology Data Exchange (ETDEWEB)

    Torrelles Rivas, L.F [Universidad Nacional Experimental Politecnica: Antonio Jose de Sucre (UNEXPO), Guayana, Bolivar (Venezuela)]. E-mail: torrellesluis@gmail.com; Alvarez, P. [Petroleos de Venezuela S.A (PDVSA), Maturin, Monagas (Venezuela)]. E-mail: alvarezph@pdvsa.com

    2013-03-15

The present work proposes a methodology for evaluating grounding systems in medium- and high-voltage electrical substations, in order to diagnose the state of the elements of the grounding system and the corresponding electrical variables. The assessment methodology developed includes a visual inspection phase covering the elements of the substation. Then, through measurements and data analysis, the electrical continuity between the components of the substation and the ground mesh is verified, and the soil resistivity and mesh resistance are determined. The methodology also includes the calculation of the step and touch voltages of the substation, based on the criteria of the international IEEE standards. The case of the 115 kV Pirital Substation, belonging to the PDVSA Oriente Transmission Network, is studied.
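The step- and touch-voltage criteria mentioned here come from IEEE Std 80. As a rough illustration of those limits for a 70 kg body, the sketch below uses hypothetical soil and surface-layer values, not data from the Pirital substation:

```python
import math

def surface_derating(rho, rho_s, h_s):
    """Cs, the surface-layer derating factor (IEEE Std 80 empirical expression).

    rho: native soil resistivity (ohm-m), rho_s: surface layer resistivity
    (ohm-m, e.g. crushed rock), h_s: surface layer thickness (m).
    """
    return 1.0 - 0.09 * (1.0 - rho / rho_s) / (2.0 * h_s + 0.09)

def tolerable_voltages(rho, rho_s, h_s, t_s):
    """Tolerable touch and step voltages (V) for a 70 kg body per IEEE Std 80,
    for a fault cleared in t_s seconds."""
    c_s = surface_derating(rho, rho_s, h_s)
    e_touch = (1000.0 + 1.5 * c_s * rho_s) * 0.157 / math.sqrt(t_s)
    e_step = (1000.0 + 6.0 * c_s * rho_s) * 0.157 / math.sqrt(t_s)
    return e_touch, e_step
```

Computed mesh and step voltages from the grid design are then compared against these limits; the step limit is higher because the foot-to-foot body path tolerates more current than the hand-to-feet path.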

  10. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.

  11. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, A.

    2016-12-01

The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For the frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all earthquake magnitudes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazards for all studied sites in the Marmara region.

  12. Demonstration of a performance assessment methodology for high-level radioactive waste disposal in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.; Shipers, L.R.; Brinster, K.F.; Beyler, W.E.; Updegraff, C.D.; Shepherd, E.R.; Tilton, L.M.; Wahi, K.K.

    1989-06-01

This document describes a performance assessment methodology developed for a high-level radioactive waste repository mined in deep basalt formations. This methodology is an extension of an earlier one applicable to bedded salt. The differences between the two methodologies arise primarily in the modeling of ground-water flow and radionuclide transport. Bedded salt was assumed to be a porous medium, whereas basalt formations contain fractured zones. Therefore, mathematical models and associated computer codes were developed to simulate the aforementioned phenomena in fractured media. The use of the methodology is demonstrated at a hypothetical basalt site by analyzing seven scenarios: (1) thermohydrological effects caused by heat released from the repository, (2) mechanohydrological effects caused by an advancing and receding glacier, (3) normal ground-water flow, (4) pumping of ground water from a confined aquifer, (5) rerouting of a river near the repository, (6) drilling of a borehole through the repository, and (7) formation of a new fault intersecting the repository. The normal ground-water flow was considered the base-case scenario. This scenario was used to perform uncertainty and sensitivity analyses and to demonstrate the existing capabilities for assessing compliance with the ground-water travel time criterion and the containment requirements. Most of the other scenarios were considered perturbations of the base case, and a few were studied in terms of changes with respect to initial conditions. The potential impact of these scenarios on the long-term performance of the disposal system was ascertained through comparison with the base-case scenario or the undisturbed initial conditions. 66 refs., 106 figs., 27 tabs.

  13. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  14. Park Planning for Ageing Adults Using Grounded Theory Methodology

    Directory of Open Access Journals (Sweden)

    Bernie Dahl

    2004-06-01

The importance of understanding park planning issues and implementing planning strategies for ageing adults was the driving force for this study. Literature reviews have identified a variety of scholarly work from fields such as gerontology, psychology, sociology and economics, all of which provide valuable information regarding the special needs of ageing adults. Very few researchers, however, have investigated the leisure behaviours of older adults in outdoor recreation (Croskeys, Tinsley and Tinsley, 2002), and the use of grounded theory methodology has essentially been unexplored in this area. Ageing adults are projected to live more than 20 percent of their life in retirement (MRP, 1998, cited in Croskeys, Tinsley and Tinsley, 2002), allowing for an increased amount of discretionary time. This offers opportunities for ageing adults to participate in outdoor recreational activities and will undoubtedly increase their leisure time. However, with limited research on recreational needs and inclusion for older adults, it is difficult for park planners and administrators to meet the growing needs of this population. Therefore, this research is necessary in order to determine whether ageing adults are being accounted for in park and outdoor recreational planning. The objective of this study was to use grounded theory research methodology to identify and examine ageing adults' needs in relation to outdoor leisure activities in a regional park setting. Ten Midwestern regional park visitors (aged 65-75 years) and four park employees were interviewed. Our research attempts to fill in the gaps between the perceptions of ageing park users and those of park planners, using a methodology that relies primarily on direct contact with park visitors.

  15. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

The authors address topical scientific problems in this article: 1) the research base for constructing the theoretical and methodological foundations of sports training in taekwondo is analysed; 2) the organizational and methodological requirements for taekwondo training sessions are researched; 3) the necessity of interaction between the processes of natural development and adaptation to physical activity of young taekwondo sportsmen is grounded; 4) the necessity of scientifically grounded construction of young fighters' training loads in microcycles, based on their individualization, is proved.

  16. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  17. Methodological proposal of grounding in commercial and industrial installations

    International Nuclear Information System (INIS)

    Rodriguez Araya, Michael Eduardo

    2013-01-01

A methodology is elaborated for the design of commercial and industrial grounding systems. International standards and technical documents related to grounding in electrical design for commercial and industrial installations are studied, and the design techniques of earthing systems are investigated. The topics covered to develop the design proposal include the analysis of resistivity, soil types, and the calculation of step, touch and mesh voltages. A field visit in the vicinity of the Escuela de Ingenieria Electrica at the Universidad de Costa Rica was arranged to perform the resistivity measurements needed for the design of a hypothetical grounding mesh for a future installation. A tellurometer (model GP-1, Amprobe Instrument) was used to collect the ground resistivity data; the equipment uses four electrodes and implements the Wenner method for the calculations. An earthing design is then carried out for a company in the industrial or commercial sector of Costa Rica. Earthing designs protect the equipment found on site from conditions such as atmospheric discharges, transients, sags, interruptions or any other event that may affect power quality. The resistivity of a ground depends largely on its moisture content. A correct earthing system should cover as much of the total area of the building as possible and comply with the mesh voltage required for an optimal design. The design of any earthing system depends on the unique characteristics of the industrial site.
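The Wenner four-electrode method mentioned above converts the instrument's resistance reading into an apparent soil resistivity. A minimal sketch, assuming the probes are driven shallow relative to their spacing so the simplified formula applies:

```python
import math

def wenner_resistivity(a_m, r_ohm):
    """Apparent soil resistivity (ohm-m) from a Wenner four-electrode reading.

    a_m: equal probe spacing (m); r_ohm: instrument resistance reading (ohm).
    Uses the simplified formula rho = 2*pi*a*R, valid when the electrode
    burial depth is small compared with the spacing a.
    """
    return 2.0 * math.pi * a_m * r_ohm
```

Repeating the measurement at several spacings probes progressively deeper soil layers, which is why resistivity surveys for grounding meshes record readings over a range of `a` values.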

  18. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  19. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report of ''The Ground Based Plan'' of the United Kingdom Science and Engineering Research Council. The ground based plan is a plan for research in astronomy and planetary science by ground based techniques. The contents of the report contains a description of:- the scientific objectives and technical requirements (the basis for the Plan), the present organisation and funding for the ground based programme, the Plan, the main scientific features and the further objectives of the Plan. (U.K.)

  20. A neural network based methodology to predict site-specific spectral acceleration values

    Science.gov (United States)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
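The abstract describes a feed-forward network with one hidden layer trained by back-propagation to map site parameters to spectral acceleration. The sketch below shows the general shape of such a model; the input features, synthetic target function and network sizes are invented for illustration and are not the Delhi parameters used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented stand-in features (e.g. magnitude, distance, period) and a smooth
# synthetic target standing in for a spectral quantity; none of this is real data.
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 - X[:, 2]).reshape(-1, 1)

# One hidden layer (tanh), trained by plain full-batch gradient descent,
# i.e. the feed-forward back-propagation setup the abstract describes.
W1 = rng.normal(0.0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)          # hidden layer activations
    return h, h @ W2 + b2             # linear output layer

lr = 0.1
for _ in range(5000):
    h, pred = forward(X)
    err = pred - y                    # gradient of 0.5*MSE w.r.t. the output
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # back-propagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((forward(X)[1] - y) ** 2).mean())
```

In a real study the training targets would come from the site-specific wave propagation analyses, so the trained network replaces the expensive simulation at prediction time.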

  1. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station must be able to track several space vehicles, even simultaneously. In this regard the design of the communication system has to take carefully into account the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more relevant to establishing a trusted communication link when the ground segment is sited in urban areas and/or low orbits are selected for the space segment. In addition, updated cartography with high-resolution data of the location and its surroundings helps to develop recommendations for the design of the site for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of the cartographic information, modelling of the obstacles that hinder communication between the ground and space segments, and representation, in the generated 3D scene, of the degree of signal/noise impairment caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, allows the antenna elevation mask to be optimized at the AOS and LOS azimuths along the visible horizon, maximizing visibility time with space vehicles. Furthermore, from the captured three-dimensional point cloud, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and its surroundings is generated. The resulting 3D model evidences nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth).
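One way to use such an elevation mask operationally is to test each sample of a satellite pass against the minimum usable elevation surveyed per azimuth sector. This is a hypothetical sketch; the coarse azimuth sampling and mask values are invented, and a real system would interpolate a much denser mask derived from the 3D scene:

```python
def visible(sat_az_deg, sat_el_deg, mask):
    """Test one satellite pass sample against a site elevation mask.

    mask: {azimuth_deg: minimum elevation_deg} from the terrain/obstacle survey.
    Picks the nearest surveyed azimuth (handling the 360-degree wrap) and
    requires the satellite elevation to clear its minimum.
    """
    nearest_az = min(mask, key=lambda az: min(abs(az - sat_az_deg),
                                              360.0 - abs(az - sat_az_deg)))
    return sat_el_deg > mask[nearest_az]
```

Accumulating `visible` over a propagated pass gives the usable contact time per pass, which is the quantity the site optimization tries to maximize.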

  2. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

Estimating the production of ground-based timber harvesting systems with an object-oriented methodology was investigated. The estimation model developed generates stands of trees; simulates chain saw, drive-to-tree feller-buncher, and swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities; and analyzes costs and productivity. It also...

  3. Case-based reasoning a concise introduction

    CERN Document Server

    López, Beatriz

    2013-01-01

    Case-based reasoning is a methodology with a long tradition in artificial intelligence that brings together reasoning and machine learning techniques to solve problems based on past experiences or cases. Given a problem to be solved, reasoning involves the use of methods to retrieve similar past cases in order to reuse their solution for the problem at hand. Once the problem has been solved, learning methods can be applied to improve the knowledge based on past experiences. In spite of being a broad methodology applied in industry and services, case-based reasoning has often been forgotten in
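The retrieve-reuse-retain cycle described in this abstract can be sketched with a nearest-neighbour retrieval over feature-vector cases; the case base, features and labels below are invented for illustration:

```python
import math

# Invented case base: (feature vector, solution label) pairs.
case_base = [((2.0, 1.0), "A"), ((0.0, 0.0), "B"), ((5.0, 4.0), "A")]

def retrieve(problem, k=1):
    """Retrieve the k most similar past cases (Euclidean similarity)."""
    return sorted(case_base, key=lambda c: math.dist(problem, c[0]))[:k]

def solve(problem):
    """Reuse the nearest case's solution, then retain the newly solved case."""
    solution = retrieve(problem)[0][1]
    case_base.append((problem, solution))   # learning step: the base grows
    return solution
```

Real CBR systems add a revision step (adapting the retrieved solution before retaining it) and index the case base for scale; this sketch shows only the bare cycle.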

  4. Case Study Methodology and Homelessness Research

    Directory of Open Access Journals (Sweden)

    Jill Pable

    2013-10-01

This paper describes the potential suitability of case study methodology for inquiry with the homeless population. It references a research study that uses case study research method to build theory. This study's topic is the lived experience of destitute individuals who reside in homeless shelters, and explores the homeless shelter built environment's potential influence on resident satisfaction and recovery. Case study methodology may be appropriate because it explores real-life contextual issues that characterize homelessness and can also accommodate the wide range of homeless person demographics that make this group difficult to study in a generalized fashion. Further, case study method accommodates the need within research in this area to understand individualized treatments as a potential solution for homelessness.

  5. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

This document is about the methodology and selection of the case studies. It is meant as a guideline for the case studies and, together with the other reports in this work package, can be a source of information for policy officers, interest groups and researchers evaluating or performing impact

  6. Navigating the grounded theory terrain. Part 1.

    Science.gov (United States)

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.

  7. Methodology for locale-scale monitoring for the PROTHEGO project: the Choirokoitia case study

    Science.gov (United States)

    Themistocleous, Kyriacos; Agapiou, Athos; Cuca, Branka; Danezis, Chris; Cigna, Francesca; Margottini, Claudio; Spizzichino, Daniele

    2016-10-01

PROTHEGO (PROTection of European Cultural HEritage from GeO-hazards) is a collaborative research project funded in the framework of the Joint Programming Initiative on Cultural Heritage and Global Change (JPICH) - Heritage Plus in 2015-2018 (www.prothego.eu). PROTHEGO aims to make an innovative contribution towards the analysis of geohazards in areas of cultural heritage, and uses novel space technology based on radar interferometry (InSAR) to retrieve information on ground stability and motion in the 400+ monuments and sites of Europe on UNESCO's World Heritage List. InSAR can measure micro-movements and thereby identify geo-hazards. In order to verify the InSAR image data, field and close-range measurements are necessary. This paper presents the methodology for local-scale monitoring of the Choirokoitia study site in Cyprus, inscribed in the UNESCO World Heritage List and part of the demonstration sites of PROTHEGO. Various field and remote sensing methods will be exploited for the local-scale monitoring: static GNSS, total station, levelling, laser scanning and UAV surveys, compared with the Persistent Scatterer Interferometry results. The in-situ measurements will be taken systematically in order to document any changes and geo-hazards that affect standing archaeological remains. In addition, ground truth from in-situ visits will provide feedback on the classification results of urban expansion and land-use change maps. Available archival and current optical satellite images will be used to calibrate and identify the level of risk at the Cyprus case study site. The ground-based geotechnical monitoring will be compared and validated with InSAR data to evaluate the deformation trend of cultural heritage sites and to understand their behaviour over the last two decades.
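For context on the InSAR measurements referenced above: an unwrapped differential interferometric phase maps to line-of-sight displacement via d = λΔφ/(4π), where the sign convention (toward or away from the sensor) varies by processor. A minimal sketch, assuming a C-band sensor such as those behind the PSI archives:

```python
import math

def los_displacement_mm(dphi_rad, wavelength_m=0.056):
    """Line-of-sight displacement (mm) from unwrapped differential phase (rad).

    d = wavelength * dphi / (4*pi); the default wavelength is C-band (~5.6 cm).
    The factor 4*pi (not 2*pi) accounts for the two-way radar path.
    """
    return wavelength_m * dphi_rad / (4.0 * math.pi) * 1000.0
```

One full 2π fringe therefore corresponds to half a wavelength (28 mm at C-band) of line-of-sight motion, which sets the sensitivity scale of the velocity maps discussed in the abstract.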

  8. A Personal Journey with Grounded Theory Methodology. Kathy Charmaz in Conversation With Reiner Keller

    Directory of Open Access Journals (Sweden)

    Kathy Charmaz

    2016-01-01

Kathy CHARMAZ is one of the most important thinkers in grounded theory methodology today. Her trailblazing work on constructivist grounded theory continues to inspire research across many disciplines and around the world. In this interview, she reflects on the aura surrounding qualitative inquiry that existed in California in the late 1960s to early 1970s and the lessons she learned from her first forays into empirical research. She comments on the trajectory that grounded theory research has followed since then and gives an account of her own perspective on constructivist grounded theory. In doing so, she underlines the importance of the Chicago School and symbolic interactionist tradition for grounded theory research work today and shows where the latter is positioned in the current field of qualitative fieldwork. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1601165

  9. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

This paper presents a performance-based methodology for the assessment of the seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and a new vulnerability function is then expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess the seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and seismic action. The estimated results are compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
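A vulnerability function expressed in terms of PGA is commonly modelled, HAZUS-style, as a lognormal fragility curve. The sketch below illustrates that general form; the median capacity and dispersion values are hypothetical, not the fitted parameters from this paper:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def damage_probability(pga_g, median_g, beta):
    """Lognormal fragility curve: P(reaching a given damage state | PGA).

    pga_g: demand in g; median_g: hypothetical median capacity in g;
    beta: hypothetical lognormal dispersion of the capacity.
    """
    return norm_cdf(math.log(pga_g / median_g) / beta)
```

Evaluating one such curve per damage state yields the probability distribution over damage levels at a given PGA, from which an expected capacity index like SCev can be aggregated.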

  10. Integration between ground based and satellite SAR data in landslide mapping: The San Fratello case study

    Science.gov (United States)

    Bardi, Federica; Frodella, William; Ciampalini, Andrea; Bianchini, Silvia; Del Ventisette, Chiara; Gigli, Giovanni; Fanti, Riccardo; Moretti, Sandro; Basile, Giuseppe; Casagli, Nicola

    2014-10-01

The potential use of the integration of PSI (Persistent Scatterer Interferometry) and GB-InSAR (Ground-based Synthetic Aperture Radar Interferometry) for landslide hazard mitigation was evaluated through mapping and monitoring activities of the San Fratello landslide (Sicily, Italy). Intense and exceptional rainfall events are the main factors that triggered several slope movements in the study area, which is susceptible to landslides because of its steep slopes and silty-clayey sedimentary cover. In the last three centuries, the town of San Fratello was affected by three large landslides, which developed in different periods: the oldest occurred in 1754, damaging the northeastern sector of the town, and in 1922 a large landslide completely destroyed a wide area on the western hillside of the town. In this paper, the attention is focused on the most recent landslide, which occurred on 14 February 2010: in this case, the phenomenon produced the failure of a large sector of the eastern hillside, causing severe damage to buildings and infrastructure. In particular, several slow-moving rotational and translational slides occurred in the area, making it suitable for monitoring ground instability through different InSAR techniques. PS-InSAR™ (permanent scatterers SAR interferometry) techniques, using ERS-1/ERS-2, ENVISAT, RADARSAT-1, and COSMO-SkyMed SAR images, were applied to analyze ground displacements during the pre- and post-event phases. Moreover, during the post-event phase in March 2010, a GB-InSAR system, able to acquire data continuously every 14 min, was installed, collecting ground displacement maps for a period of about three years, until March 2013. Through the integration of space-borne and ground-based data sets, ground deformation velocity maps were obtained, providing a more accurate delimitation of the February 2010 landslide boundary with respect to the traditional geomorphological field survey carried out. The integration of GB-InSAR and PSI techniques proved to

  11. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

    Full Text Available Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  12. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  13. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available Creating the target software of computerization as a component system has been a strong requirement of software development for the last 20 years. Architectural components are self-contained units, presenting not only partial and overall system behavior, but also cooperating with each other on the basis of their interfaces. Among other things, components have allowed flexible modification of the processes whose behavior is the foundation of component behavior, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus create modified system behaviors. This all enables company management to perform, at design time, required behavioral changes of processes in accordance with the requirements of changing production and markets. Software development in general, referred to as SDP (Software Development Process), contains two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems, CBS (Component-Based System); the second is the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own distinct development methodologies. The subject of this paper is only the first direction and the application of component-based system development in its object-oriented methodologies. The requirement of today is to carry out the development of component-based systems within established object-oriented methodologies as a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  14. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

    Process intensification (PI) has the potential to improve existing as well as conceptual processes, in order to achieve a more sustainable production. PI can be achieved at different levels, that is, the unit operations, functional and/or phenomena level. The highest impact is expected by looking at processes at the lowest level of aggregation, which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well as generation and screening of phenomena based flowsheet options are presented using a decomposition based solution approach. The developed methodology as well as necessary tools and supporting methods are highlighted through a case study involving the production of isopropyl-acetate.

  15. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

    Full Text Available The goal of the article lies in the identification and further development of methodological grounds for managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the “managing innovation development of an enterprise” notion, the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficient management of the innovation development of a restaurant. The article develops a conceptual scheme for the development and realisation of a strategy of innovation development in a restaurant. It experimentally confirms the hypothesis of a very strong feedback relationship between resistance to innovation changes and the share of qualified personnel capable of permanent development (learning and generation of new ideas) in restaurants, and builds a model of the dependency between them. Prospects for further studies in this direction include the development of methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used to develop new or improve existing tools of strategic management of innovation development at the micro-level.

  16. Qualitative case study methodology in nursing research: an integrative review.

    Science.gov (United States)

    Anthony, Susan; Jack, Susan

    2009-06-01

    This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.

  17. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping rate operation, with test results over-predicted within a reasonable margin of less than two. Some limitations of the methodology have also been identified for sources operating at a fixed rate or coming with a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  18. Evaluating airborne and ground based gamma spectrometry methods for detecting particulate radioactivity in the environment: a case study of Irish Sea beaches.

    Science.gov (United States)

    Cresswell, A J; Sanderson, D C W

    2012-10-15

    In several places, programmes are in place to locate and recover radioactive particles that have the potential to cause detrimental health effects in any member of the public who may encounter them. A model has been developed to evaluate the use of mobile gamma spectrometry systems within such programmes, with particular emphasis on large volume (16 l) NaI(Tl) detectors mounted in low flying helicopters. This model uses a validated Monte Carlo code with assessment of local geochemistry and natural and anthropogenic background radiation concentrations and distributions. The results of the model, applied to the example of particles recovered from beaches in the vicinity of Sellafield, clearly show the ability of rapid airborne surveys conducted at 75 m ground clearance and 120 kph speeds to demonstrate the absence of sources greater than 5 MBq (137)Cs within large areas (10-20 km(2) h(-1)), and identify areas requiring further ground based investigation. Lowering ground clearance for airborne surveys to 15 m whilst maintaining speeds covering 1-2 km(2) h(-1) can detect buried (137)Cs sources of 0.5 MBq or greater activity. A survey design to detect 100 kBq (137)Cs sources at 10 cm depth has also been defined, requiring surveys at <15 m ground clearance and <2 m s(-1) ground speed. The response of airborne systems to the Sellafield particles recovered to date has also been simulated, and the proportion of the existing radiocaesium background in the vicinity of the nuclear site has been established. Finally the rates of area coverage and sensitivities of both airborne and ground based approaches are compared, demonstrating the ability of airborne systems to increase the rate of particle recovery in a cost effective manner. The potential for equipment and methodological developments to improve performance are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
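The altitude and activity trade-offs described in this abstract follow from inverse-square and attenuation scaling, which can be sketched as a toy model. Every coefficient, the 85% gamma yield, and the function itself are illustrative assumptions, not the validated Monte Carlo model used in the study:

```python
import math

# Toy estimate of the 662 keV photon fluence rate at an airborne detector
# from a buried 137Cs point source (no scattered build-up, no detector
# response). All numeric values are rough illustrative assumptions.
def fluence_rate(activity_bq, clearance_m, depth_m=0.0,
                 yield_662=0.85,
                 mu_air=9.3e-3,    # ~662 keV linear attenuation in air, 1/m
                 mu_soil=12.0):    # ~662 keV in soil, 1/m (density dependent)
    """Photons per m^2 per s reaching the detector."""
    r = clearance_m + depth_m                       # straight-line distance
    emission = activity_bq * yield_662              # photons emitted per s
    attenuation = math.exp(-mu_air * clearance_m - mu_soil * depth_m)
    return emission * attenuation / (4.0 * math.pi * r * r)

# Compare the two survey geometries mentioned in the abstract:
high_alt = fluence_rate(5e6, 75.0)    # 5 MBq surface source, 75 m clearance
low_alt = fluence_rate(0.5e6, 15.0)   # 0.5 MBq surface source, 15 m clearance
```

Under these assumptions the 0.5 MBq source viewed from 15 m delivers a larger fluence rate at the detector than the 5 MBq source viewed from 75 m, which is qualitatively consistent with the survey designs described above.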

  19. Case Study Research: Foundations and Methodological Orientations

    Directory of Open Access Journals (Sweden)

    Helena Harrison

    2017-01-01

    Full Text Available Over the last forty years, case study research has undergone substantial methodological development. This evolution has resulted in a pragmatic, flexible research approach, capable of providing comprehensive in-depth understanding of a diverse range of issues across a number of disciplines. Change and progress have stemmed from parallel influences of historical transformations in approaches to research and individual researcher's preferences, perspectives, and interpretations of this design. Researchers who have contributed to the development of case study research come from diverse disciplines with different philosophical perspectives, resulting in a variety of definitions and approaches. For the researcher new to using case study, such variety can create a confusing platform for its application. In this article, we explore the evolution of case study research, discuss methodological variations, and summarize key elements with the aim of providing guidance on the available options for researchers wanting to use case study in their work. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701195

  20. Development of a meta-algorithm for guiding primary care encounters for patients with multimorbidity using evidence-based and case-based guideline development methodology.

    Science.gov (United States)

    Muche-Borowski, Cathleen; Lühmann, Dagmar; Schäfer, Ingmar; Mundt, Rebekka; Wagner, Hans-Otto; Scherer, Martin

    2017-06-22

    The study aimed to develop a comprehensive algorithm (meta-algorithm) for primary care encounters of patients with multimorbidity. We used a novel, case-based and evidence-based procedure to overcome methodological difficulties in guideline development for patients with complex care needs. Systematic guideline development methodology including systematic evidence retrieval (guideline synopses), expert opinions and informal and formal consensus procedures. Primary care. The meta-algorithm was developed in six steps: 1. Designing 10 case vignettes of patients with multimorbidity (common, epidemiologically confirmed disease patterns and/or particularly challenging health care needs) in a multidisciplinary workshop. 2. Based on the main diagnoses, a systematic guideline synopsis of evidence-based and consensus-based clinical practice guidelines was prepared. The recommendations were prioritised according to the clinical and psychosocial characteristics of the case vignettes. 3. Case vignettes along with the respective guideline recommendations were validated and specifically commented on by an external panel of practicing general practitioners (GPs). 4. Guideline recommendations and experts' opinions were summarised as case specific management recommendations (N-of-one guidelines). 5. Healthcare preferences of patients with multimorbidity were elicited from a systematic literature review and supplemented with information from qualitative interviews. 6. All N-of-one guidelines were analysed using pattern recognition to identify common decision nodes and care elements. These elements were put together to form a generic meta-algorithm. The resulting meta-algorithm reflects the logic of a GP's encounter of a patient with multimorbidity regarding decision-making situations, communication needs and priorities. It can be filled with the complex problems of individual patients and hereby offer guidance to the practitioner. Contrary to simple, symptom-oriented algorithms, the meta

  1. A methodological review of qualitative case study methodology in midwifery research.

    Science.gov (United States)

    Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn

    2016-10-01

    To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library Health Source: Nursing/Academic Edition, Wiley online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore if case study research is suitable for their investigation. The raised profile will demonstrate further applicability; encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.

  2. Ground-based photo monitoring

    Science.gov (United States)

    Frederick C. Hall

    2000-01-01

    Ground-based photo monitoring is repeat photography using ground-based cameras to document change in vegetation or soil. Assume those installing the photo location will not be the ones re-photographing it. This requires a protocol that includes: (1) a map to locate the monitoring area, (2) another map diagramming the photographic layout, (3) type and make of film such...

  3. Qualitative methodology in a psychoanalytic single case study

    DEFF Research Database (Denmark)

    Grünbaum, Liselotte

    features and breaks in psychotherapy investigated. One aim of the study was to contribute to the development of a transparent and systematic methodology for the psychoanalytic case study by application of rigorous qualitative research methodology. To this end, inductive-deductive principles in line...

  4. Effects of energy development on ground water quality: an overview and preliminary assessment

    International Nuclear Information System (INIS)

    Parker, W.M. III; Yin, S.C.L.; Davis, M.J.; Kutz, W.J.

    1981-07-01

    This report provides a preliminary national overview of the various effects on ground water quality likely to result from energy development. Based on estimates of present and projected energy-development activities, those regions of the country are identified where ground water quality has the potential for being adversely affected. The general causes of change in ground water quality are reviewed. Specific effects on ground water quality of selected energy technologies are discussed, and some case-history material is provided. A brief overview of pertinent legislation relating to the protection and management of ground water quality is presented. Six methodologies that have some value for assessing the potential effects on ground water quality of energy development activities are reviewed. A method of identifying regions in the 48 contiguous states where there is a potential for ground water quality problems is described and then applied.

  5. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  6. Demonstration of a performance assessment methodology for nuclear waste isolation in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.

    1988-01-01

    This paper summarizes the results of the demonstration of a performance assessment methodology developed by Sandia National Laboratories, Albuquerque for the US Nuclear Regulatory Commission for use in the analysis of high-level radioactive waste disposal in deep basalts. Seven scenarios that could affect the performance of a repository in basalts were analyzed. One of these scenarios, normal ground-water flow, was called the base-case scenario. This was used to demonstrate the modeling capabilities in the methodology necessary to assess compliance with the ground-water travel time criterion. The scenario analysis consisted of both scenario screening and consequence modeling. Preliminary analyses of scenarios considering heat released from the waste and the alteration of the hydraulic properties of the rock mass due to loads created by a glacier suggested that these effects would not be significant. The analysis of other scenarios indicated that those changing the flow field in the vicinity of the repository would have an impact on radionuclide discharges, while changes far from the repository may not be significant. The analysis of the base-case scenario was used to show the importance of matrix diffusion as a radionuclide retardation mechanism in fractured media. The demonstration of the methodology also included an overall sensitivity analysis to identify important parameters and/or processes. 15 refs., 13 figs., 2 tabs

  7. Initial building investigations at Aberdeen Proving Ground, Maryland: Objectives and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Dougherty, J.M.; McGinnis, L.D.

    1994-12-01

    As part of an environmental-contamination source-definition program at Aberdeen Proving Ground, detailed internal and external inspections of 23 potentially contaminated buildings are being conducted to describe and characterize the state of each building as it currently exists and to identify areas potentially contaminated with toxic or other hazardous substances. In addition, a detailed geophysical investigation is being conducted in the vicinity of each target building to locate and identify subsurface structures, associated with former building operations, that are potential sources of contamination. This report describes the objectives of the initial building inspections, including the geophysical investigations, and discusses the methodology that has been developed to achieve these objectives.

  8. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will be introduced under any loading mode, even a deterministic loading history. A non-conservative evaluation might be given in practice if the scatter is not considered. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random fatigue strain applied and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently reduced to the case of a deterministic CSS relation, as assumed by existing methods. The non-conservatism of evaluations based on a deterministic CSS relation, and the applicability of the present methodology, are indicated by an analysis of the material test results.
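The interference idea in this abstract — scatter in the cyclic stress-strain response makes the applied strain random even under deterministic loading — can be sketched with a small Monte Carlo example. The Ramberg-Osgood form is standard, but every parameter value and distribution below is an invented assumption, not 1Cr18Ni9Ti data:

```python
import numpy as np

# Stress-strength interference sketch for strain-based fatigue reliability.
# All numeric values and distributions are illustrative assumptions.
rng = np.random.default_rng(42)
N = 100_000

E = 193e3          # Young's modulus, MPa (assumed)
sigma_a = 300.0    # deterministic applied stress amplitude, MPa

# Scatter of the cyclic stress-strain response: random Ramberg-Osgood
# parameters K' (MPa) and n' make the applied strain amplitude random
# even though the stress history is deterministic.
Kp = rng.lognormal(mean=np.log(1200.0), sigma=0.05, size=N)
n_p = rng.lognormal(mean=np.log(0.17), sigma=0.05, size=N)
eps_applied = sigma_a / E + (sigma_a / Kp) ** (1.0 / n_p)

# Strain capacity at the design life, taken from a probability-based
# strain-life (Coffin-Manson) curve and modelled here as lognormal
# scatter about an assumed median capacity.
eps_capacity = rng.lognormal(mean=np.log(0.004), sigma=0.15, size=N)

# Reliability = P(capacity > applied), estimated by Monte Carlo.
reliability = np.mean(eps_capacity > eps_applied)
print(f"estimated reliability: {reliability:.4f}")
```

Collapsing K' and n' to fixed values recovers the deterministic CSS case the abstract mentions; comparing the two reliabilities illustrates how ignoring the scatter can be non-conservative.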

  9. A case-based assistant for clinical psychiatry expertise.

    OpenAIRE

    Bichindaritz, I.

    1994-01-01

    Case-based reasoning is an artificial intelligence methodology for the processing of empirical knowledge. Recent case-based reasoning systems also use theoretical knowledge about the domain to constrain the case-based reasoning. The organization of the memory is the key issue in case-based reasoning. The case-based assistant presented here has two structures in memory: cases and concepts. These memory structures permit it to be as skilled in problem-solving tasks, such as diagnosis and treatment...

  10. Toward Paradigmatic Change in TESOL Methodologies: Building Plurilingual Pedagogies from the Ground Up

    Science.gov (United States)

    Lin, Angel

    2013-01-01

    Contemporary TESOL methodologies have been characterized by compartmentalization of languages in the classroom. However, recent years have seen the beginning signs of paradigmatic change in TESOL methodologies that indicate a move toward plurilingualism. In this article, the author draws on the case of Hong Kong to illustrate how, in the past four…

  11. Development of a PC-based ground support system for a small satellite instrument

    Science.gov (United States)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal computer based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.

  12. Is case-chaos methodology an appropriate alternative to conventional case-control studies for investigating outbreaks?

    Science.gov (United States)

    Edelstein, Michael; Wallensten, Anders; Kühlmann-Berenzon, Sharon

    2014-08-15

    Case-chaos methodology is a proposed alternative to case-control studies that simulates controls by randomly reshuffling the exposures of cases. We evaluated the method using data on outbreaks in Sweden. We identified 5 case-control studies from foodborne illness outbreaks that occurred between 2005 and 2012. Using case-chaos methodology, we calculated odds ratios 1,000 times for each exposure. We used the median as the point estimate and the 2.5th and 97.5th percentiles as the confidence interval. We compared case-chaos matched odds ratios with their respective case-control odds ratios in terms of statistical significance. Using Spearman's correlation, we estimated the correlation between matched odds ratios and the proportion of cases exposed to each exposure and quantified the relationship between the 2 using a normal linear mixed model. Each case-control study identified an outbreak vehicle (odds ratios = 4.9-45). Case-chaos methodology identified the outbreak vehicle 3 out of 5 times. It identified significant associations in 22 of 113 exposures that were not associated with outcome and 5 of 18 exposures that were significantly associated with outcome. Log matched odds ratios correlated with their respective proportion of cases exposed (Spearman ρ = 0.91) and increased significantly with the proportion of cases exposed (b = 0.054). Case-chaos methodology missed the outbreak source 2 of 5 times and identified spurious associations between a number of exposures and outcome. Measures of association correlated with the proportion of cases exposed. We recommended against using case-chaos analysis during outbreak investigations. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
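The resampling scheme evaluated in this abstract can be sketched in a few lines: each case's yes/no answers are reshuffled across the exposure variables to simulate a matched control, a matched odds ratio is computed from the discordant pairs, and the procedure is repeated 1,000 times with the median as point estimate and the 2.5th/97.5th percentiles as the confidence interval. This is a hedged reconstruction from the description above, with invented data, not the authors' code:

```python
import numpy as np

def case_chaos_ors(X, n_iter=1000, seed=0):
    """Case-chaos matched odds ratios for binary exposures.

    X : (n_cases, n_exposures) 0/1 matrix of the cases' exposures.
    Each iteration simulates one matched control per case by reshuffling
    that case's answers across the exposure variables, then computes a
    matched OR per exposure from the discordant pairs (b/c).
    Returns (median, 2.5th percentile, 97.5th percentile) per exposure.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(X)
    ors = np.full((n_iter, X.shape[1]), np.nan)
    for it in range(n_iter):
        ctrl = np.array([rng.permutation(row) for row in X])
        b = ((X == 1) & (ctrl == 0)).sum(axis=0)  # case exposed, control not
        c = ((X == 0) & (ctrl == 1)).sum(axis=0)  # control exposed, case not
        with np.errstate(divide="ignore", invalid="ignore"):
            ors[it] = np.where(c > 0, b / c, np.nan)  # undefined ratios -> NaN
    return (np.nanmedian(ors, axis=0),
            np.nanpercentile(ors, 2.5, axis=0),
            np.nanpercentile(ors, 97.5, axis=0))

# Toy outbreak data: exposure 0 is reported by most cases, exposure 1 by few.
X = np.zeros((60, 4), dtype=int)
X[:50, 0] = 1    # 50/60 cases exposed
X[:10, 1] = 1    # 10/60 cases exposed
X[20:50, 2] = 1
X[40:60, 3] = 1
med, lo, hi = case_chaos_ors(X)
```

The toy data also illustrate the artefact reported above: a simulated control's exposure probability equals the case's overall proportion of "yes" answers, so the matched odds ratios track the proportion of cases exposed rather than any true association.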

  13. GIS-based regionalized life cycle assessment: how big is small enough? Methodology and case study of electricity generation.

    Science.gov (United States)

    Mutel, Christopher L; Pfister, Stephan; Hellweg, Stefanie

    2012-01-17

    We describe a new methodology for performing regionalized life cycle assessment and systematically choosing the spatial scale of regionalized impact assessment methods. We extend standard matrix-based calculations to include matrices that describe the mapping from inventory to impact assessment spatial supports. Uncertainty in inventory spatial data is modeled using a discrete spatial distribution function, which in a case study is derived from empirical data. The minimization of global spatial autocorrelation is used to choose the optimal spatial scale of impact assessment methods. We demonstrate these techniques on electricity production in the United States, using regionalized impact assessment methods for air emissions and freshwater consumption. Case study results show important differences between site-generic and regionalized calculations, and provide specific guidance for future improvements of inventory data sets and impact assessment methods.
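The matrix extension described above — a mapping from inventory spatial supports to impact-assessment spatial supports on top of the standard matrix-based LCA system — can be sketched as follows. All matrices and numbers are invented toy values, and `loc` and `M` are hypothetical names for illustration, not notation from the paper:

```python
import numpy as np

# Toy regionalized LCA calculation.
A = np.array([[1.0, -0.1],        # technosphere matrix (2 processes)
              [0.0,  1.0]])
f = np.array([1.0, 0.0])          # functional-unit demand
B = np.array([[0.5, 2.0]])        # biosphere matrix: 1 emission, kg/unit

# Share of each process's emission falling in 3 inventory regions
# (rows: processes; each row sums to 1).
loc = np.array([[0.7, 0.3, 0.0],
                [0.0, 0.5, 0.5]])

# Mapping from 3 inventory regions onto 2 impact-assessment regions,
# e.g. counties aggregated to watersheds (each column sums to 1).
M = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])

cf = np.array([10.0, 2.0])        # regionalized characterization factors

s = np.linalg.solve(A, f)                  # process scaling vector
emis_by_proc = B.flatten() * s             # emission per process
emis_by_region = emis_by_proc @ loc        # distribute onto inventory regions
impact = cf @ (M @ emis_by_region)         # map to impact regions, characterize

# Site-generic comparison: same total emission, one average factor.
site_generic = emis_by_proc.sum() * cf.mean()
```

In this toy setup the regionalized impact (4.4) differs from the site-generic value (3.0) because the emissions concentrate in the more sensitive impact region — the same mechanism behind the site-generic vs regionalized differences the case study reports.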

  14. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. … Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  15. Simulation and case-based learning

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Guralnick, David

    2008-01-01

    This paper has its origin in the authors' reflection on years of practical experience combined with literature readings in preparation for a workshop on learn-by-doing simulation and case-based learning held at the ICELW 2008 conference (the International Conference on E-Learning in the Workplace). The purpose of this paper is to describe the two online learning methodologies and to raise questions for future discussion. In the workshop, the organizers and participants work with and discuss differences and similarities within the two pedagogical methodologies, focusing on how they are applied in workplace-related and e-learning contexts. In addition to the organizers, a small number of invited presenters will attend, giving demonstrations of their work within learn-by-doing simulation and case-based learning, but still leaving ample time for discussion among all participants.

  16. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. … Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  17. Methodology and application of combined watershed and ground-water models in Kansas

    Science.gov (United States)

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate-complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling systems of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling systems of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully distributed ground-water model MODFLOW. The advantages of this approach are the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage, demonstrated by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling …

  18. Building Grounded Theory in Entrepreneurship Research

    DEFF Research Database (Denmark)

    Mäkelä, Markus; Turcan, Romeo V.

    2007-01-01

    In this chapter we describe the process of building theory from data (Glaser and Strauss 1967; Strauss and Corbin 1998). We discuss current grounded theory in relation to research in entrepreneurship and point out directions and potential improvements for further research in this field. The chapter has two goals. First, we wish to provide an explicit paradigmatic positioning of the grounded theory methodology, discussing the most relevant views of ontology and epistemology that can be used as alternative starting points for conducting grounded theory research. While the chapter introduces our approach to grounded theory, we acknowledge the existence of other approaches and try to locate our approach in relation to them. As an important part of this discussion, we take a stand on how to usefully define ‘grounded theory’ and ‘case study research’. Second, we seek to firmly link our …

  19. Grounded understanding of abstract concepts: The case of STEM learning.

    Science.gov (United States)

    Hayes, Justin C; Kraemer, David J M

    2017-01-01

    Characterizing the neural implementation of abstract conceptual representations has long been a contentious topic in cognitive science. At the heart of the debate is whether the "sensorimotor" machinery of the brain plays a central role in representing concepts, or whether the involvement of these perceptual and motor regions is merely peripheral or epiphenomenal. The domain of science, technology, engineering, and mathematics (STEM) learning provides an important proving ground for sensorimotor (or grounded) theories of cognition, as concepts in science and engineering courses are often taught through laboratory-based and other hands-on methodologies. In this review of the literature, we examine evidence suggesting that sensorimotor processes strengthen learning associated with the abstract concepts central to STEM pedagogy. After considering how contemporary theories have defined abstraction in the context of semantic knowledge, we propose our own explanation for how body-centered information, as computed in sensorimotor brain regions and visuomotor association cortex, can form a useful foundation upon which to build an understanding of abstract scientific concepts, such as mechanical force. Drawing from theories in cognitive neuroscience, we then explore models elucidating the neural mechanisms involved in grounding intangible concepts, including Hebbian learning, predictive coding, and neuronal recycling. Empirical data on STEM learning through hands-on instruction are considered in light of these neural models. We conclude the review by proposing three distinct ways in which the field of cognitive neuroscience can contribute to STEM learning by bolstering our understanding of how the brain instantiates abstract concepts in an embodied fashion.

  20. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results for ground-based near-infrared secondary eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, are presented which have been obtained with ground-based telescopes as part of the GROUSE project.

  1. A Methodology for Retrieving Information from Malware Encrypted Output Files: Brazilian Case Studies

    Directory of Open Access Journals (Sweden)

    Nelson Uto

    2013-04-01

    This article presents and explains a methodology based on cryptanalytic and reverse engineering techniques that can be employed to quickly recover information from encrypted files generated by malware. The objective of the methodology is to minimize the effort of static and dynamic analysis by using cryptanalysis and related knowledge as much as possible. To illustrate how it works, we present three case studies from a large Brazilian company that was victimized by targeted attacks aimed at stealing information from special-purpose hardware used in its environment.

  2. Methodology for designing aircraft having optimal sound signatures

    NARCIS (Netherlands)

    Sahai, A.K.; Simons, D.G.

    2017-01-01

    This paper presents a methodology with which aircraft designs can be modified so that they produce optimal sound signatures on the ground. Optimal sound here means sounds that are perceived as less annoying by residents living near airports. A novel design and …

  3. Conceptual design of a thermo-electrical energy storage system based on heat integration of thermodynamic cycles – Part A: Methodology and base case

    International Nuclear Information System (INIS)

    Morandin, Matteo; Maréchal, François; Mercangöz, Mehmet; Buchter, Florian

    2012-01-01

    The interest in large-scale electricity storage (ES) with discharging time longer than 1 h and nominal power greater than 1 MW is increasing worldwide, as the increasing share of renewable energy, typically solar and wind energy, imposes severe load-management issues. Thermo-electrical energy storage (TEES) based on thermodynamic cycles is currently under investigation at ABB Corporate Research as an alternative to pumped hydro and compressed air energy storage. TEES is based on the conversion of electricity into thermal energy during charge by means of a heat pump, and on the conversion of thermal energy into electricity during discharge by means of a thermal engine. The synthesis and thermodynamic optimization of a TEES system based on hot water, ice storage and transcritical CO2 cycles is discussed in two papers. In this first paper, a methodology for the conceptual design of a TEES system based on the analysis of the thermal integration between charging and discharging cycles through Pinch Analysis tools is introduced. According to this methodology, the heat exchanger network and the temperatures and volumes of the storage tanks are not defined a priori but are determined after the cycle parameters are optimized. For this purpose, a heuristic procedure based on the interpretation of the composite curves obtained by optimizing the thermal integration between the cycles was developed. These heuristic rules were implemented in a code that automatically finds the complete system design for given values of the intensive parameters of the charging and discharging cycles only. A base-case system configuration is introduced and the results of its thermodynamic optimization are discussed here. A maximum roundtrip efficiency of 60% was obtained for the base-case configuration, assuming turbomachinery and heat exchanger performances in line with indications from manufacturers.

  4. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds a very large number of combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.
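
    The observation that very many low-probability cut sets jointly yield a near-unity failure likelihood can be illustrated with a simple union calculation (assuming independent cut sets, which is a simplification the actual analysis does not rely on):

    ```python
    import math

    def prob_any(cut_set_probs):
        """P(at least one minimal cut set occurs), assuming independence."""
        return 1.0 - math.prod(1.0 - p for p in cut_set_probs)

    # 500 relay-chatter combinations, each with only 1% probability,
    # still make at least one occurrence almost certain
    p = prob_any([0.01] * 500)
    ```

    Even though each individual combination is unlikely, the complement product (1 − 0.01)^500 is tiny, so the overall likelihood approaches unity, as the abstract describes.
    
    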

  5. Evaluation of the National Solar Radiation Database (NSRDB) Using Ground-Based Measurements

    Science.gov (United States)

    Xie, Y.; Sengupta, M.; Habte, A.; Lopez, A.

    2017-12-01

    Solar resource information is essential for a wide spectrum of applications including renewable energy, climate studies, and solar forecasting. Solar resource information can be obtained from ground-based measurement stations and/or from modeled data sets. While measurements provide data for the development and validation of solar resource models and other applications, modeled data expand the ability to address the needs for increased accuracy and spatial and temporal resolution. The National Renewable Energy Laboratory (NREL) has developed and regularly updates modeled solar resource data through the National Solar Radiation Database (NSRDB). The recent NSRDB dataset was developed using the physics-based Physical Solar Model (PSM) and provides gridded solar irradiance (global horizontal irradiance (GHI), direct normal irradiance (DNI), and diffuse horizontal irradiance) at a 4-km by 4-km spatial and half-hourly temporal resolution covering 18 years from 1998-2015. A comprehensive validation of the performance of the NSRDB (1998-2015) was conducted to quantify the accuracy of the spatial and temporal variability of the solar radiation data. Further, the study assessed the ability of the NSRDB (1998-2015) to accurately capture inter-annual variability, which is essential information for solar energy conversion projects and grid integration studies. Comparisons of the NSRDB (1998-2015) with measurements from nine selected ground stations were conducted under both clear- and cloudy-sky conditions. These locations provide high-quality data covering a variety of geographical locations and climates. The comparison of the NSRDB to the ground-based data demonstrated that biases were within +/-5% for GHI and +/-10% for DNI. A comprehensive uncertainty estimation methodology was established to analyze the performance of the gridded NSRDB, including all sources of uncertainty at various time-averaged periods, a method that is not often used in model evaluation. Further, the study analyzed the inter-annual variability.
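
    A bias comparison of the kind described can be sketched as follows, with hypothetical station values rather than NSRDB data:

    ```python
    import numpy as np

    def relative_bias(modeled, measured):
        """Mean bias of modeled irradiance relative to measured, in percent."""
        modeled = np.asarray(modeled, dtype=float)
        measured = np.asarray(measured, dtype=float)
        return 100.0 * (modeled - measured).mean() / measured.mean()

    # hypothetical co-located hourly GHI values (W/m^2) at one station
    ghi_modeled = [480, 510, 620, 700]
    ghi_station = [500, 500, 600, 700]
    bias = relative_bias(ghi_modeled, ghi_station)  # small positive bias, well within +/-5%
    ```
    
    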

  6. Ground-source heat pump case studies and utility programs

    Energy Technology Data Exchange (ETDEWEB)

    Lienau, P.J.; Boyd, T.L.; Rogers, R.L.

    1995-04-01

    Ground-source heat pump systems are one of the promising new energy technologies that have shown a rapid increase in usage over the past ten years in the United States. These systems offer substantial benefits to consumers and utilities in energy (kWh) and demand (kW) savings. The purpose of this study was to determine what existing monitored data were available, mainly from electric utilities, on heat pump performance, energy savings and demand reduction for residential, school and commercial building applications. In order to verify the performance, information was collected for 253 case studies, mainly from utilities throughout the United States. The case studies were compiled into a database organized into general information, system information, ground system information, system performance, and additional information. Information was developed on the status of demand-side management of ground-source heat pump programs for about 60 electric utilities and rural electric cooperatives, covering marketing, incentive programs, barriers to market penetration, number of units installed in the service area, and benefits.

  7. Methodology for applying monitored natural attenuation to petroleum hydrocarbon-contaminated ground-water systems with examples from South Carolina

    Science.gov (United States)

    Chapelle, Frank H.; Robertson, John F.; Landmeyer, James E.; Bradley, Paul M.

    2000-01-01

    Natural attenuation processes such as dispersion, advection, and biodegradation serve to decrease concentrations of dissolved contaminants as they are transported in all ground-water systems. However, the efficiency of these natural attenuation processes, and the degree to which they help attain remediation goals, varies considerably from site to site. This report provides a methodology for quantifying various natural attenuation mechanisms. This methodology incorporates information on (1) concentrations of contaminants in space and/or time; (2) ambient reduction/oxidation (redox) conditions; (3) rates and directions of ground-water flow; (4) rates of contaminant biodegradation; and (5) demographic considerations, such as the presence of nearby receptor exposure points or property boundaries. This document outlines the hydrologic, geochemical, and biologic data needed to assess the efficiency of natural attenuation, provides a screening tool for making preliminary assessments, and provides examples of how to determine when natural attenuation can be a useful component of site remediation at leaking underground storage tank sites.
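
    One common quantitative building block for such assessments is a steady-state, first-order biodegradation model along a flow path. This is a textbook simplification, not the report's full screening methodology, and the parameter values below are purely illustrative:

    ```python
    import math

    def downgradient_conc(c0, k, v, x):
        """Steady-state concentration x metres downgradient of a source.

        c0 : source concentration (mg/L)
        k  : first-order biodegradation rate (1/day)
        v  : ground-water seepage velocity (m/day)
        x  : travel distance (m)
        """
        return c0 * math.exp(-k * x / v)

    # hypothetical benzene plume: 10 mg/L source, k = 0.01/day, v = 0.1 m/day
    c_100m = downgradient_conc(10.0, 0.01, 0.1, 100.0)
    ```

    Comparing the predicted concentration at a receptor location against the remediation goal is one way such a screening calculation feeds into the site decision.
    
    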

  8. Cross-validation Methodology between Ground and GPM Satellite-based Radar Rainfall Product over Dallas-Fort Worth (DFW) Metroplex

    Science.gov (United States)

    Chen, H.; Chandrasekar, V.; Biswas, S.

    2015-12-01

    Over the past two decades, a large number of rainfall products have been developed based on satellite, radar, and/or rain gauge observations. However, producing optimal rainfall estimation for a given region is still challenging due to the space-time variability of rainfall at many scales and the different spatial and temporal sampling of different rainfall instruments. In order to produce high-resolution rainfall products for urban flash flood applications and improve the weather sensing capability in urban environments, the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA), in collaboration with the National Weather Service (NWS) and the North Central Texas Council of Governments (NCTCOG), has developed an urban radar remote sensing network in the DFW Metroplex. DFW is the largest inland metropolitan area in the U.S. and experiences a wide range of natural weather hazards such as flash floods and hailstorms. The DFW urban remote sensing network, centered on the deployment of eight dual-polarization X-band radars and a NWS WSR-88DP radar, is expected to provide impacts-based warnings and forecasts for the benefit of public safety and the economy. High-resolution quantitative precipitation estimation (QPE) is one of the major goals of the development of this urban test bed. In addition to ground radar-based rainfall estimation, satellite-based rainfall products for this area are also of interest for this study. A typical example is the rainfall rate product produced by the Dual-frequency Precipitation Radar (DPR) onboard the Global Precipitation Measurement (GPM) Core Observatory satellite. Therefore, cross-comparison between ground- and space-based rainfall estimation is critical to building an optimal regional rainfall system, which can take advantage of the sampling differences of the different sensors. This paper presents the real-time high-resolution QPE system developed for the DFW urban radar network, which is based upon the combination of S-band WSR-88DP and X…

  9. METHODOLOGY TO EVALUATE THE POTENTIAL FOR GROUND WATER CONTAMINATION FROM GEOTHERMAL FLUID RELEASES

    Science.gov (United States)

    This report provides analytical methods and graphical techniques to predict potential ground-water contamination from geothermal energy development. Overflows and leaks from ponds, pipe leaks, well blowouts, leaks from well casings, and migration from injection zones can be handled …

  10. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  11. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  12. Orton-Gillingham Methodology for Students with Reading Disabilities: 30 Years of Case Law

    Science.gov (United States)

    Rose, Tessie E.; Zirkel, Perry

    2007-01-01

    Although numerous studies have investigated autism methodology case law, few studies have investigated case law regarding reading methodology, particularly the Orton-Gillingham approach, for students with reading disabilities. We provide the results of a systematic case analysis of all published Orton-Gillingham decisions from the original passage…

  13. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    A systematic methodology for the screening of ionic liquids (ILs) as entrainers and for the design of IL-based separation processes for various homogeneous binary azeotropic mixtures has been developed. The methodology focuses on homogeneous binary aqueous azeotropic systems (for example, water…). … in size of the target solute was investigated using the same separation process and IL entrainer to obtain the same product purity. The proposed methodology has been evaluated through a case study of binary alcoholic aqueous azeotropic separation: water+ethanol and water+isopropanol.

  14. Fusion of Satellite Multispectral Images Based on Ground-Penetrating Radar (GPR Data for the Investigation of Buried Concealed Archaeological Remains

    Directory of Open Access Journals (Sweden)

    Athos Agapiou

    2017-06-01

    The paper investigates the superficial layers of an archaeological landscape based on the integration of various remote sensing techniques. It is well known in the literature that shallow depths may be rich in archaeological remains, which generate different signal responses depending on the applied technique. In this study three main technologies are examined, namely ground-penetrating radar (GPR), ground spectroscopy, and multispectral satellite imagery. The study aims to propose a methodology to enhance optical remote sensing satellite images, intended for archaeological research, based on the integration of ground-based and satellite datasets. For this task, a regression model between the ground spectroradiometer and GPR measurements is established and then projected onto a high-resolution sub-meter optical image. The overall methodology consists of nine steps. Beyond the acquisition of the in-situ measurements and their calibration (Steps 1–3), various regression models are examined for more than 70 different vegetation indices (Steps 4–5). The data analysis indicated that the red-edge position (REP) hyperspectral index was the most appropriate for developing a local fusion model between ground spectroscopy data and GPR datasets (Step 6), providing comparable results with the in situ GPR measurements (Step 7). Other vegetation indices, such as the normalized difference vegetation index (NDVI), have also been examined, providing significant correlation between the two datasets (R = 0.50). The model is then projected onto a high-resolution image over the area of interest (Step 8). The proposed methodology was evaluated with a series of field data collected from the Vésztő-Mágor Tell in the eastern part of Hungary. The results were compared with in situ magnetic gradiometry measurements, indicating common interpretation results. The results were also compatible with the preliminary archaeological investigations of the area (Step 9). The overall …
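
    The fusion step, fitting a regression between a ground-derived vegetation index and the GPR response and projecting it to an image, might be sketched as follows. The numbers are synthetic and the simple linear model is an illustrative assumption; the paper examines many regression models and indices:

    ```python
    import numpy as np

    # hypothetical co-located samples: red-edge position (REP, nm) from the
    # field spectroradiometer and GPR response at the same points
    rep_ground = np.array([718.0, 720.5, 722.0, 724.5, 726.0])
    gpr_ground = np.array([0.21, 0.34, 0.40, 0.55, 0.62])

    # least-squares line: gpr ≈ a * rep + b
    a, b = np.polyfit(rep_ground, gpr_ground, 1)

    # project the local model onto a per-pixel REP image (random stand-in
    # for a REP layer derived from the satellite image)
    rep_image = np.random.default_rng(0).uniform(718, 726, size=(4, 4))
    gpr_pred = a * rep_image + b   # satellite-scale pseudo-GPR layer
    ```

    The projected layer can then be compared against independent geophysical data, as the paper does with magnetic gradiometry.
    
    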

  15. CASE METHOD: AN ACTIVE LEARNING METHODOLOGY TO ACQUIRE SIGNIFICANT LEARNING IN CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Clotilde Pizarro

    2015-09-01

    In this paper the case methodology is applied with first-year students of the Risk Prevention and Environmental Engineering program. For this purpose, a real case of contamination that occurred at a school called "La Greda" in the region of Valparaíso is presented. The activity starts by delivering an extract of the information collected from the media, together with a brief induction on the methodology to be applied. A plenary session is then held, in which possible solutions to the problem are debated and a relationship is established between the case and the chemistry program. It is concluded that the application of the case method was a fruitful tool in terms of the results obtained by the students, since the approval rate was 75%, considerably higher than in previous years.

  16. A case-based assistant for clinical psychiatry expertise.

    Science.gov (United States)

    Bichindaritz, I

    1994-01-01

    Case-based reasoning is an artificial intelligence methodology for the processing of empirical knowledge. Recent case-based reasoning systems also use theoretical knowledge about the domain to constrain the case-based reasoning. The organization of the memory is the key issue in case-based reasoning. The case-based assistant presented here has two structures in memory: cases and concepts. These memory structures permit it to be as skilled in problem-solving tasks, such as diagnosis and treatment planning, as in interpretive tasks, such as clinical research. A prototype applied to clinical work on eating disorders in psychiatry, reasoning from the alimentary questionnaires of these patients, is presented as an example of the system's abilities.

  17. Damage detection methodology on beam-like structures based on combined modal Wavelet Transform strategy

    Science.gov (United States)

    Serra, Roger; Lopez, Lautaro

    2018-05-01

    Different approaches to damage detection based on dynamic measurement of structures have appeared in recent decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy or flexibility. Wavelet analysis has also been used to detect the abnormalities in mode shapes induced by damage. However, the majority of previous work was done with signals uncorrupted by noise. Moreover, the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy that copes with noisy signals while, at the same time, being able to extract the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, their capacity to detect and localize damage is analyzed in different cases. The comparison is done by simulating the oscillations of a cantilever steel beam with and without a defect as a numerical case. The proposed methodology proved to outperform classical methods when handling noisy signals.
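
    A minimal sketch of wavelet-based damage localization on a single mode shape is shown below, using synthetic data and a simple Ricker (Mexican-hat) wavelet; the paper's combined modal strategy is considerably more elaborate:

    ```python
    import numpy as np

    def ricker(points, a):
        """Zero-mean Mexican-hat (Ricker) wavelet, `points` samples, width a."""
        t = np.linspace(-(points // 2), points // 2, points)
        w = (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)
        return w - w.mean()  # remove truncation residual so smooth regions ~ 0

    # synthetic first mode shape of a cantilever-like beam, 500 points
    x = np.linspace(0.0, 1.0, 500)
    mode_shape = np.sin(0.5 * np.pi * x)
    mode_shape[300] -= 0.01  # small local irregularity caused by a defect

    # the wavelet responds weakly to the smooth mode shape but spikes
    # at the local irregularity, revealing the damage location
    coeffs = np.convolve(mode_shape, ricker(31, 4.0), mode="same")
    interior = np.abs(coeffs[25:-25])  # ignore boundary distortion
    loc = 25 + int(np.argmax(interior))
    ```

    On real, noisy measurements a single-scale, single-mode analysis like this degrades quickly, which is the motivation for combining information across modes as the paper proposes.
    
    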

  18. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    Science.gov (United States)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

    Successful applications of the Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus reveals more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension-reduction methods. In this work, the incorporation of the Laplacian Eigenmap and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To demonstrate the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.
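The deviation-based idea can be sketched generically: fit a healthy-baseline subspace and score new samples by their distance from it. The snippet below is a minimal PCA-deviation sketch on two invented health indicators, not the paper's PCA-Dev implementation; the 2x2 principal axis is computed in closed form.

```python
import math, random

random.seed(0)

# Hypothetical two-feature health indicators (e.g. RMS and kurtosis of a
# vibration signal); healthy samples lie along one dominant direction.
healthy = [(t + random.gauss(0, 0.05), 0.5 * t + random.gauss(0, 0.05))
           for t in [random.uniform(0, 1) for _ in range(200)]]

# First principal axis of the healthy baseline (closed-form 2x2 PCA).
mx = sum(x for x, _ in healthy) / len(healthy)
my = sum(y for _, y in healthy) / len(healthy)
sxx = sum((x - mx) ** 2 for x, _ in healthy) / len(healthy)
syy = sum((y - my) ** 2 for _, y in healthy) / len(healthy)
sxy = sum((x - mx) * (y - my) for x, y in healthy) / len(healthy)
theta = 0.5 * math.atan2(2 * sxy, sxx - syy)   # angle of principal axis
ux, uy = math.cos(theta), math.sin(theta)

def deviation(x, y):
    """Distance from the healthy principal axis (reconstruction error)."""
    dx, dy = x - mx, y - my
    proj = dx * ux + dy * uy
    return math.hypot(dx - proj * ux, dy - proj * uy)

baseline = max(deviation(x, y) for x, y in healthy)    # healthy envelope
faulty = (0.5, 0.9)                                    # off-axis sample
print("faulty deviation:", round(deviation(*faulty), 3),
      "healthy envelope:", round(baseline, 3))
```

A sample whose deviation exceeds the healthy envelope is flagged as abnormal; the same scheme carries over when DM or a Laplacian Eigenmap supplies the baseline subspace instead of PCA.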

  19. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    Science.gov (United States)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAVs) have seen widespread civil application, including survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS data) for improved ground-truth validation. For example, to obtain an accurate 3D model and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on Global Positioning Systems (GPS). However, GPS surveys are very time-consuming, and longer monitoring time series require repeated GPS surveys. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technology, built on single nodes interlinked in a Wireless Sensor Network (WSN) star topology, for ground-based information collection and later integration with surveying data obtained by UAV. Nodes are designed to be visible from the air and to resist extreme weather conditions with low power consumption. In addition, nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of environmental monitoring applications. For our purpose, the WSN transmits the environmental data over 3G/GPRS to a database on a regular time basis. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.

  20. The Monitoring Case of Ground-Based Synthetic Aperture Radar with Frequency Modulated Continuous Wave System

    Science.gov (United States)

    Zhang, H. Y.; Zhai, Q. P.; Chen, L.; Liu, Y. J.; Zhou, K. Q.; Wang, Y. S.; Dou, Y. D.

    2017-09-01

    Landslides are a geological disaster characterized by wide distribution, variety, high frequency, high intensity and destructiveness, making them a natural hazard with a harmful and wide range of influence. Ground-based synthetic aperture radar is a novel deformation monitoring technology developed in recent years, offering a large monitoring area, high accuracy and long-distance, non-contact measurement. In this paper, fast ground-based synthetic aperture radar (Fast-GBSAR) based on a frequency modulated continuous wave (FMCW) system is used to collect data on the Ma Liuzui landslide in Chongqing. The device can reduce the atmospheric errors caused by a rapidly changing environment. With an acquisition time as short as 5 seconds per scan, Fast-GBSAR can monitor landslide deformation even in severe weather conditions such as fog. The data from the Ma Liuzui landslide in Chongqing are analyzed in this paper. The result verifies that the device can monitor landslide deformation under severe weather conditions.

  1. 618-11 Burial Ground USRADS radiological surveys

    International Nuclear Information System (INIS)

    Wendling, M.A.

    1994-01-01

    This report summarizes and documents the results of the radiological surveys conducted from February 4 through February 10, 1993 over the 618-11 Burial Ground, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology using the Ultrasonic Ranging and Data System (USRADS). The 618-11 Burial Ground radiological survey field task consisted of two activities: characterization of the specific background conditions and the radiological survey of the area. The radiological survey of the 618-11 Burial Ground, along with the background study, were conducted by the Site Investigative Surveys Environmental Restoration Health Physics Organization of the Westinghouse Hanford Company. The survey methodology was based on utilization of the Ultrasonic Ranging and Data System (USRADS) for automated recording of the gross gamma radiation levels at or near six (6) inches and at three (3) feet from the surface soil.

  2. Making sense of grounded theory in medical education.

    Science.gov (United States)

    Kennedy, Tara J T; Lingard, Lorelei A

    2006-02-01

    Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.

  3. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Directory of Open Access Journals (Sweden)

    Gautam Biswas

    2012-12-01

    Full Text Available This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter approach in conjunction with an empirical state-based degradation model to predict the degradation of capacitor parameters through the life of the capacitor. Electrolytic capacitors are important components of systems that range from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their critical role in the system, are good candidates for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. This paper proposes an empirical degradation model and discusses experimental results for an accelerated aging test performed on a set of identical capacitors subjected to electrical stress. The data form the basis for developing the Kalman-filter-based remaining-life prediction algorithm.
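A minimal sketch of the Kalman-filter ingredient: a scalar filter tracking a degrading capacitance and extrapolating to an end-of-life threshold. The linear decay rate, noise levels and 80% threshold are invented stand-ins for the paper's empirical degradation model, not its actual parameters.

```python
import random

random.seed(1)

# Hypothetical degradation data: capacitance (% of nominal) decaying
# linearly with aging time, observed through noisy measurements.
true_decay = -0.05          # % per hour (assumed, for illustration)
measurements = [100.0 + true_decay * t + random.gauss(0, 0.3) for t in range(200)]

# Scalar Kalman filter on the state [capacitance]; the decay rate is part
# of the process model, its uncertainty goes into the process noise q.
x, p = 100.0, 1.0           # state estimate and its variance
q, r = 1e-4, 0.3 ** 2       # process and measurement noise variances
for z in measurements:
    x, p = x + true_decay, p + q          # predict one step ahead
    k = p / (p + r)                       # Kalman gain
    x, p = x + k * (z - x), (1 - k) * p   # update with the measurement

# Remaining useful life: steps until the estimate crosses an end-of-life
# threshold (80% of nominal is a common convention for capacitors).
threshold = 80.0
rul_hours = (x - threshold) / -true_decay
print(f"current estimate {x:.2f}%, predicted RUL of roughly {rul_hours:.0f} h")
```

In the paper's setting the state and degradation model come from accelerated-aging data rather than a known linear decay, but the predict/update/extrapolate loop is the same.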

  4. An In Depth Look at Lightning Trends in Hurricane Harvey using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Ringhausen, J.

    2017-12-01

    This research combines satellite measurements of lightning in Hurricane Harvey with ground-based lightning measurements to get a better sense of the total lightning occurring in the hurricane, both intra-cloud (IC) and cloud-to-ground (CG), and how it relates to the intensification and weakening of the tropical system. Past studies have looked at lightning trends in hurricanes using the space based Lightning Imaging Sensor (LIS) or ground-based lightning detection networks. However, both of these methods have drawbacks. For instance, LIS was in low earth orbit, which limited lightning observations to 90 seconds for a particular point on the ground; hence, continuous lightning coverage of a hurricane was not possible. Ground-based networks can have a decreased detection efficiency, particularly for ICs, over oceans where hurricanes generally intensify. With the launch of the Geostationary Lightning Mapper (GLM) on the GOES-16 satellite, researchers can study total lightning continuously over the lifetime of a tropical cyclone. This study utilizes GLM to investigate total lightning activity in Hurricane Harvey temporally; this is augmented with spatial analysis relative to hurricane structure, similar to previous studies. Further, GLM and ground-based network data are combined using Bayesian techniques in a new manner to leverage the strengths of each detection method. This methodology 1) provides a more complete estimate of lightning activity and 2) enables the derivation of the IC:CG ratio (Z-ratio) throughout the time period of the study. In particular, details of the evolution of the Z-ratio in time and space are presented. In addition, lightning stroke spatiotemporal trends are compared to lightning flash trends. This research represents a new application of lightning data that can be used in future study of tropical cyclone intensification and weakening.

  5. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  6. Comparison of in-flight and ground-based simulator derived flying qualities and pilot performance for approach and landing tasks

    Science.gov (United States)

    Grantham, William D.; Williams, Robert H.

    1987-01-01

    For the case of an approach-and-landing piloting task emphasizing response to the landing flare, pilot opinion and performance parameters derived from jet transport aircraft six-degree-of-freedom ground-based and in-flight simulators were compared in order to derive data for the flight-controls/flying-qualities engineers. The data thus obtained indicate that ground simulation results tend to be conservative, and that the effect of control sensitivity is more pronounced for ground simulation. The pilot also has a greater tendency to generate pilot-induced oscillation in ground-based simulation than in flight.

  7. Value and Vision-based Methodology in Integrated Design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    on empirical data from workshops where the Value and Vision-based methodology has been taught. The research approach chosen for this investigation is Action Research, where the researcher plays an active role in generating the data and gains a deeper understanding of the investigated phenomena. The result of this thesis is the value transformation from an explicit set of values to a product concept using a vision-based concept development methodology based on the Pyramid Model (Lerdahl, 2001) in a design team context. The aim of this thesis is to examine how the process of value transformation occurs within … is divided in three: the systemic unfolding of the Value and Vision-based methodology, the structured presentation of the practical implementation of the methodology and, finally, the analysis and conclusions regarding the value transformation, phenomena and learning aspects of the methodology.

  8. Improve Internal Audit Methodology in the Case Company

    OpenAIRE

    Hong Trang Nguyen, Thi

    2016-01-01

    The purpose of this study was to identify improvement areas in the internal audit methodology used by the Internal Audit team at the case company which is the local subsidiary of a global financial group. The Internal Audit activity of the case company has been recently evaluated by the Institute of Internal Auditors. The overall quality assessment concludes that the Internal Audit activity has a charter, policies and processes that are in conformance with the Mandatory Guidance of the Intern...

  9. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  10. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For creating register models, many commercial tools support generating a register model from a register specification described in IP-XACT, but describing the register specification in IP-XACT format is itself time-consuming. For easy creation of register models, we propose a spreadsheet-based register template that is translated into an IP-XACT description, from which register models can be generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
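The spreadsheet-to-IP-XACT translation step might look like the sketch below, which turns invented CSV rows (the register names, offsets and access modes are examples, not from the paper) into a bare-bones register list with Python's standard library. A real IP-XACT file additionally wraps registers in component/memoryMap/addressBlock elements under the `spirit:` or `ipxact:` namespace.

```python
import csv, io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet register template (name, offset, width, access),
# mirroring the kind of sheet the flow would translate to IP-XACT.
spec = """name,offset,width,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

# Emit a bare-bones register list; each row becomes one <register> element
# with the child elements an IP-XACT register description would carry.
root = ET.Element("registers")
for row in csv.DictReader(io.StringIO(spec)):
    reg = ET.SubElement(root, "register")
    ET.SubElement(reg, "name").text = row["name"]
    ET.SubElement(reg, "addressOffset").text = row["offset"]
    ET.SubElement(reg, "size").text = row["width"]
    ET.SubElement(reg, "access").text = row["access"]

print(ET.tostring(root, encoding="unicode"))
```

From such a description, the commercial tools mentioned in the abstract can then generate the UVM register model without the designer writing IP-XACT by hand.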

  11. GPM GROUND VALIDATION GCPEX SNOW MICROPHYSICS CASE STUDY V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation GCPEX Snow Microphysics Case Study characterizes the 3-D microphysical evolution and distribution of snow in context of the thermodynamic...

  12. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has been commonly achieved by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is applicable to low altitude unmanned aerial hyperspectral imagery also. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are

  13. Linking the Intercultural and Grounded Theory: Methodological Issues in Migration Research

    Directory of Open Access Journals (Sweden)

    Vera Sheridan

    2009-01-01

    Full Text Available Connecting intercultural research with Grounded Theory was advocated in the early history of intercultural theorising and includes the development of researchers' intercultural competencies. Such competency comes to the fore where intercultural theory places an equal emphasis on home and host cultures in migration research. In this context we have found a Grounded Theory approach particularly suitable for disentangling complex interlinkings within migration experiences and their individual outcomes. Grounded Theory allows for the exploration of various theories in different fields and the emergence of new or deeper interpretations of intercultural experiences, including where research has not engaged deeply with or avoided intercultural contexts. The use of software, based on Grounded Theory, provides the resource for systematically exploring the inter-related nature of data. In addition, engaging in intercultural research, in particular, raises questions around our practice as social science researchers: adherence to ethics guidelines, for instance, can be in some conflict with the relations we build with members of communities whose cultural values, for instance around friendship or trust, impact on the norms of both our own and institutional expectations. This leads to reflection on the relationship with research participants in terms of our own intercultural experiences and position. URN: urn:nbn:de:0114-fqs0901363

  14. How to bring absolute sustainability into decision-making: An industry case study using a Planetary Boundary-based methodology

    DEFF Research Database (Denmark)

    Ryberg, Morten W.; Owsianiak, Mikołaj; Clavreul, Julie

    2018-01-01

    The Planetary Boundaries concept has emerged as a framework for articulating environmental limits, gaining traction as a basis for considering sustainability in business settings, government policy and international guidelines. There is emerging interest in using the Planetary Boundaries concept as part of life cycle assessment (LCA) for gauging absolute environmental sustainability. We tested the applicability of a novel Planetary Boundaries-based life cycle impact assessment methodology on a hypothetical laundry washing case study at the EU level. We express the impacts corresponding to the control variables of the individual Planetary Boundaries together with a measure of their respective uncertainties. We tested four sharing principles for assigning a share of the safe operating space (SoSOS) to laundry washing and assessed whether the impacts were within the assigned SoSOS. The choice…

  15. Positive deviance control-case life history: a method to develop grounded hypotheses about successful long-term avoidance of infection

    Directory of Open Access Journals (Sweden)

    Sandoval Milagros

    2008-03-01

    developed to stay safe. Staying Safe methodology develops grounded hypotheses. These can be tested through cohort studies of incidence and prevention trials of hypothesis-based programs to help drug injectors make their injection and sexual careers safer for themselves and others. This positive deviance control-case life history method might be used to study avoiding other infections like genital herpes among sex workers.

  16. Ground cross-modal impedance as a tool for analyzing ground/plate interaction and ground wave propagation.

    Science.gov (United States)

    Grau, L; Laulagnet, B

    2015-05-01

    An analytical approach is investigated to model ground-plate interaction based on modal decomposition and the two-dimensional Fourier transform. A finite rectangular plate subjected to flexural vibration is coupled with the ground and modeled with the Kirchhoff hypothesis. A Navier equation represents the stratified ground, assumed infinite in the x- and y-directions and free at the top surface. To obtain an analytical solution, modal decomposition is applied to the structure and a Fourier Transform is applied to the ground. The result is a new tool for analyzing ground-plate interaction to resolve this problem: ground cross-modal impedance. It allows quantifying the added-stiffness, added-mass, and added-damping from the ground to the structure. Similarity with the parallel acoustic problem is highlighted. A comparison between the theory and the experiment shows good matching. Finally, specific cases are investigated, notably the influence of layer depth on plate vibration.

  17. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluating the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed

  18. Methodology for estimating human perception to tremors in high-rise buildings

    Science.gov (United States)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
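A schematic version of such a predictive model: ordinary least squares relating a ground motion intensity value to the percentage of occupants perceiving the tremor. The data pairs and resulting coefficients below are invented for illustration, not the paper's regression results.

```python
# Illustrative (intensity, % of occupants perceiving) pairs standing in for
# the analytical results of the generic 15- and 30-story models; intensity
# is the average response spectrum intensity over 0.1-2.0 s.
data = [(0.5, 2.0), (1.0, 8.0), (2.0, 22.0), (4.0, 55.0), (6.0, 83.0)]

# Ordinary least squares for: percentage = a + b * intensity.
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

def predict(intensity):
    """Predicted percentage of occupants perceiving, clipped to [0, 100]."""
    return max(0.0, min(100.0, a + b * intensity))

print(f"fitted slope {b:.1f} %/unit intensity;"
      f" at intensity 3.0 about {predict(3.0):.1f}% perceive the tremor")
```

In practice such a model would be validated against felt reports, as the paper does with the two tremor events felt in Singapore.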

  19. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    Science.gov (United States)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  20. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR.

    Science.gov (United States)

    Jimenez-Berni, Jose A; Deery, David M; Rozas-Larraondo, Pablo; Condon, Anthony Tony G; Rebetzke, Greg J; James, Richard A; Bovill, William D; Furbank, Robert T; Sirault, Xavier R R

    2018-01-01

    Crop improvement efforts are targeting increased above-ground biomass and radiation-use efficiency as drivers for greater yield. Early ground cover and canopy height contribute to biomass production, but manual measurements of these traits, and in particular above-ground biomass, are slow and labor-intensive, more so when made at multiple developmental stages. These constraints limit the ability to capture these data in a temporal fashion, hampering insights that could be gained from multi-dimensional data. Here we demonstrate the capacity of Light Detection and Ranging (LiDAR), mounted on a lightweight, mobile, ground-based platform, for rapid multi-temporal and non-destructive estimation of canopy height, ground cover and above-ground biomass. Field validation of LiDAR measurements is presented. For canopy height, strong relationships with LiDAR (r² of 0.99 and root mean square error of 0.017 m) were obtained. Ground cover was estimated from LiDAR using two methodologies: red reflectance image and canopy height. In contrast to NDVI, LiDAR was not affected by saturation at high ground cover, and the comparison of both LiDAR methodologies showed strong association (r² = 0.92 and slope = 1.02) at ground cover above 0.8. For above-ground biomass, a dedicated field experiment was performed with destructive biomass sampled eight times across different developmental stages. Two methodologies are presented for the estimation of biomass from LiDAR: the 3D voxel index (3DVI) and the 3D profile index (3DPI). The parameters involved in the calculation of 3DVI and 3DPI were optimized for each sample event from tillering to maturity, as well as generalized for any developmental stage. Individual sample point predictions were strong, while predictions across all eight sample events provided the strongest association with biomass (r² = 0.93 and r² = 0.92 for 3DPI and 3DVI, respectively). Given these results, we believe that application of this system will provide new
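The voxel-index idea behind 3DVI can be sketched generically: bin LiDAR returns into voxels and count the occupied ones, so denser and taller canopies score higher. The point clouds, plot size and voxel size below are invented, and this is the general occupancy idea rather than the paper's exact 3DVI formulation.

```python
import random

random.seed(2)

# Synthetic LiDAR returns (x, y, z in metres) over a 1 m x 1 m wheat plot;
# point density and canopy height are the two invented knobs.
def canopy(n_points, height):
    return [(random.random(), random.random(), random.random() * height)
            for _ in range(n_points)]

def voxel_index(points, voxel=0.05):
    """Occupancy score: number of distinct voxels containing a return."""
    occupied = {(int(x / voxel), int(y / voxel), int(z / voxel))
                for x, y, z in points}
    return len(occupied)

sparse = canopy(500, height=0.3)    # early-stage, short canopy
dense = canopy(5000, height=0.9)    # later-stage, taller canopy
print("occupied voxels:", voxel_index(sparse), "vs", voxel_index(dense))
```

Calibrating such an index against destructive biomass samples, as the paper does across eight sample events, is what turns the raw occupancy score into a biomass estimate.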

  1. Bow tie methodology: a tool to enhance the visibility and understanding of nuclear safety cases

    International Nuclear Information System (INIS)

    Vannerem, Marc

    2013-01-01

    There is much common ground between the nuclear industry and other major hazard industries such as those subject to the Seveso II regulations, e.g. oil, gas and chemicals. They are all subject to legal requirements to identify and control hazards, and to demonstrate that all necessary measures have been taken to minimise risks posed by the site with regard to people and the environment. This places a requirement on the Operators of major hazard installations, whether nuclear or conventional, to understand and identify the hazards of their operations, the initiating events, the consequences, the prevention and mitigation measures. However, in the UK, nuclear and 'Seveso' type facilities seem to adopt a different approach to the presentation of their safety cases. Given the magnitude of the hazards, safety cases developed for nuclear fuel cycle facilities are rigorous, detailed and complex, which can have the effect of reducing the visibility of the key hazards and corresponding protective measures. In contrast, on installations in the oil and gas and chemical industries, a real attempt has been made over recent years to improve the visibility and accessibility of the safety case to all operating personnel, through the use of visual aids / diagrams. In particular, many Operators are choosing to use 'bow tie methodology', in which very simple overview diagrams are produced to illustrate, in a form understandable by all: - what the key hazards are; - the initiating events; - the consequences of an incident; - the barriers or 'Layers of Protection' which prevent an initiating event from developing into an incident; - the barriers or 'Layers of Defence' which mitigate the consequences of an incident, i.e. which prevent the incident from escalating into major consequences. The bow tie method is one of a number of methodologies that can be used to make safety cases more accessible. It is used in this paper to illustrate ways to

  2. Quality Improvement of Ground Works Process with the Use of Chosen Lean Management Tools - Case Study

    Science.gov (United States)

    Nowotarski, Piotr; Paslawski, Jerzy; Wysocki, Bartosz

    2017-12-01

Ground works are among the first processes in erecting a structure. Depending on ground conditions, such as the type of soil or the level of groundwater, different types of foundations are designed. Foundations are the base of a building, and their proper design and execution are key to the long and faultless use of the whole structure; they can also influence the future cost of eventual repairs (especially when the groundwater level is high and no proper waterproofing is installed). The article presents the introduction of selected Lean Management tools for quality improvement of the ground works process, based on an analysis made on the construction site of a vehicle inspection station located in Poznan, Poland. The processes are assessed from several perspectives, since three main groups of workers were directly involved: blue-collar workers, the site manager and the site engineers. In addition, these three points of view are compared with respect to the problems that may occur during this type of works, with a detailed analysis of their causes. The authors also describe the change in attitude of the workers directly involved in these processes after the introduction of the Lean Management methodology, which illustrates the scepticism towards new ideas among people used to performing work in the traditional way. Using the Lean Management philosophy in construction is a good way to streamline processes in a company, eliminate constantly recurring problems, and in this way improve the productivity and quality of the executed activities. The analysis showed that different groups of people have very different opinions on the problems connected with executing the same process (ground works), and only with a full picture of the situation (especially in construction processes) can management take proper problem-preventing actions that, in consequence, can influence the amount of waste generated on

  3. Closed-form critical earthquake response of elastic-plastic structures on compliant ground under near-fault ground motions

    Directory of Open Access Journals (Sweden)

    Kotaro eKojima

    2016-01-01

The double impulse is introduced as a substitute for the fling-step near-fault ground motion. A closed-form solution of the elastic-plastic response of a structure on compliant (flexible) ground under the 'critical double impulse' is derived for the first time, based on the solution for the corresponding structure with a fixed base. As in the case of the fixed-base model, only free vibration appears under such a double impulse, and the energy approach plays an important role in deriving the closed-form solution of the complicated elastic-plastic response on compliant ground. Remarkably, no iteration is needed in the derivation of the critical elastic-plastic response. The closed-form expression shows that, for a smaller input level of the double impulse relative to the structural strength, the maximum plastic deformation becomes larger as the ground stiffness increases. On the other hand, for a larger input level relative to the structural strength, the maximum plastic deformation becomes larger as the ground stiffness decreases. The criticality and validity of the proposed theory are investigated through comparison with the response analysis for the corresponding one-cycle sinusoidal input, taken as representative of the fling-step near-fault ground motion. The applicability of the proposed theory to actual recorded pulse-type ground motions is also discussed.

  4. Navigating the grounded theory terrain. Part 2.

    Science.gov (United States)

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

In this paper, the choice of classic grounded theory is discussed and justified in the context of the first author's PhD research, entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author considered classic grounded theory a suitable research methodology because it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper argues that researchers new to grounded theory must visit, and be familiar with, the various grounded theory options. They will then be able to apply the methodology they choose consistently and critically, develop theory rigorously, and ultimately better defend their final methodological destinations.

  5. The effect of cassava-based bioethanol production on above-ground carbon stocks: A case study from Southern Mali

    International Nuclear Information System (INIS)

    Vang Rasmussen, Laura; Rasmussen, Kjeld; Birch-Thomsen, Torben; Kristensen, Søren B.P.; Traoré, Oumar

    2012-01-01

Increasing energy use and the need to mitigate climate change make production of liquid biofuels a high priority. Farmers respond worldwide to this increasing demand by converting forests and grassland into biofuel crops, but whether biofuels offer carbon savings depends on the carbon emissions that occur when land use is changed to biofuel crops. This paper reports the results of a study on cassava-based bioethanol production undertaken in the Sikasso region in Southern Mali. The paper outlines the estimated impacts on above-ground carbon stocks when land use is changed to increase cassava production. The results show that expansion of cassava production for bioethanol will most likely lead to the conversion of fallow areas to cassava. A land use change from fallow to cassava creates a reduction in the above-ground carbon stocks in the order of 4–13 Mg C ha⁻¹, depending on (a) the age of the fallow, (b) the allometric equation used and (c) whether all trees are removed or the larger, useful trees are preserved. This ‘carbon debt’ associated with the above-ground biomass loss would take 8–25 years to repay if fossil fuels are replaced with cassava-based bioethanol. - Highlights: ► Demands for biofuels make production of cassava-based bioethanol a priority. ► Farmers in Southern Mali are likely to convert fallow areas to cassava production. ► Converting fallow to cassava creates reductions in above-ground carbon stocks. ► Estimates of carbon stock reductions include that farmers preserve useful trees. ► The carbon debt associated with above-ground biomass loss takes 8–25 years to repay.
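The payback arithmetic behind the abstract's 'carbon debt' figure (a one-off above-ground carbon loss repaid by annual savings from substituting fossil fuel) can be sketched as follows; the annual saving figure is an assumption for illustration, not a value from the study:

```python
# Hypothetical illustration of the carbon-debt payback arithmetic.
# The annual saving of 0.5 Mg C/ha/yr is an assumed figure, not from the study.

def payback_years(carbon_debt_mg_c_per_ha, annual_saving_mg_c_per_ha):
    """Years needed for annual fossil-fuel substitution savings to repay
    the one-off above-ground carbon loss from land-use change."""
    return carbon_debt_mg_c_per_ha / annual_saving_mg_c_per_ha

# Debt of 4-13 Mg C/ha (range quoted in the abstract), assumed saving 0.5:
print(payback_years(4, 0.5))   # 8.0 years at the low end of the debt range
print(payback_years(13, 0.5))  # 26.0 years at the high end
```

With these assumed numbers the payback spans roughly 8 to 26 years, the same order as the 8–25 years reported in the abstract.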

  6. Methodology for heritage conservation in Belgium based on multi-temporal interferometry

    Science.gov (United States)

    Bejarano-Urrego, L.; Verstrynge, E.; Shimoni, M.; Lopez, J.; Walstra, J.; Declercq, P.-Y.; Derauw, D.; Hayen, R.; Van Balen, K.

    2017-09-01

Soil differential settlements that cause structural damage to heritage buildings lead to losses of cultural and economic value. Adequate damage assessment, as well as protection and preservation of the built patrimony, are priorities at national and local levels, and they require advanced integration and analysis of environmental, architectural and historical parameters. The GEPATAR project (GEotechnical and Patrimonial Archives Toolbox for ARchitectural conservation in Belgium) aims to create an online interactive geo-information tool that allows the user to view, and be informed about, the Belgian heritage buildings at risk due to differential soil settlements. Multi-temporal interferometry (MTI) techniques have proven to be a powerful means of analyzing earth-surface deformation patterns through time series of Synthetic Aperture Radar (SAR) images. These techniques allow ground movements to be measured over wide areas at high precision and relatively low cost. In this project, Persistent Scatterer Synthetic Aperture Radar Interferometry (PS-InSAR) and Multidimensional Small Baseline Subsets (MSBAS) are used to measure and monitor the temporal evolution of surface deformations across Belgium. This information is integrated with the Belgian heritage data by means of an interactive toolbox in a GIS environment in order to identify the level of risk. At country scale, the toolbox includes ground deformation hazard maps, geological information, the location of patrimony buildings and land use; at local scale, it includes settlement rates, photographic and historical surveys as well as architectural and geotechnical information. Some case studies are investigated by means of on-site monitoring techniques and stability analysis to evaluate the applied approaches. This paper presents a description of the methodology being implemented in the project, together with the case study of Saint Vincent's church, which is located on a former colliery zone. For

  7. Knowledge Management Audit - a methodology and case study

    Directory of Open Access Journals (Sweden)

    Thomas Lauer

    2001-11-01

The strategic importance of knowledge in today’s organisation has been discussed extensively, and research has looked at various issues in developing knowledge management systems. Both the characterisation of knowledge and alternative models for understanding the acquisition and use of such knowledge have taken on significant prominence. This is due to the complexities associated with acquiring and representing knowledge, and the varied nature of its use in knowledge work. However, the role of knowledge workers and the processes that guide their knowledge work as they meet the knowledge goals of an organisation have received little attention. This paper proposes a knowledge audit (an assessment of the way knowledge processes meet an organisation’s knowledge goals) methodology to understand the “gaps” in the needs of a knowledge worker before one develops KM systems. The methodology also uses “process change” research to help build a socio-technical environment critical for knowledge work. The audit methodology is applied to a particular case and the implementation of the audit recommendations is discussed. Future implications of such an audit are also discussed.

  8. A methodology for sunlight urban planning: a computer-based solar and sky vault obstruction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Fernando Oscar Ruttkay; Silva, Carlos Alejandro Nome [Federal Univ. of Santa Catarina (UFSC), Dept. of Architecture and Urbanism, Florianopolis, SC (Brazil); Turkienikz, Benamy [Federal Univ. of Rio Grande do Sul (UFRGS), Faculty of Architecture, Porto Alegre, RS (Brazil)

    2001-07-01

The main purpose of the present study is to describe a planning methodology to improve the quality of the built environment based on the rational control of solar radiation and the view of the sky vault. The main criterion used to control the access and obstruction of solar radiation was the concept of desirability and undesirability of solar radiation. A case study implementing the proposed methodology is developed. Although it needs further development to find its way into regulations and practical applications, the methodology has shown strong potential to deal with an aspect that would otherwise be almost impossible to address. (Author)

  9. CASE STUDY: Kampala, Uganda — From the ground up: Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2006-04-21

Apr 21, 2006 ... CASE STUDY: Kampala, Uganda — From the ground up: Urban ... Azuba has used this kind of evidence to convert more than one ... a bottom-up approach was needed to draft ordinances that would work.

  10. Study on seismic reliability for foundation grounds and surrounding slopes of nuclear power plants. Proposal of evaluation methodology and integration of seismic reliability evaluation system

    International Nuclear Information System (INIS)

    Ohtori, Yasuki; Kanatani, Mamoru

    2006-01-01

This paper proposes a methodology for evaluating the annual probability of failure of soil structures subjected to earthquakes, and integrates an analysis system for the seismic reliability of soil structures. The method is based on margin analysis, which evaluates the ground motion level at which a structure is damaged. First, a ground motion index that is strongly correlated with the damage or response of the specific structure is selected. The ultimate strength in terms of the selected ground motion index is then evaluated. Next, the variation of soil properties is taken into account in the evaluation of the seismic stability of structures. The variation of the safety factor (SF) is evaluated and then converted into a variation of the specific ground motion index. Finally, the fragility curve is developed, and the annual probability of failure is evaluated in combination with the seismic hazard curve. The system facilitates the assessment of seismic reliability: a random number generator, a dynamic analysis program and a stability analysis program are incorporated into one package. Once the structural model, the distribution of soil properties, the input ground motions and so forth are defined, a list of safety factors for each sliding line is obtained. Monte Carlo Simulation (MCS), Latin Hypercube Sampling (LHS), the point estimation method (PEM) and the first-order second-moment method (FOSM), all implemented in this system, are also introduced. As numerical examples, a ground foundation and a surrounding slope are assessed using the proposed method and the integrated system. (author)
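The final step described in this abstract, combining a fragility curve with a seismic hazard curve to obtain an annual probability of failure, can be sketched numerically. The lognormal fragility and power-law hazard parameters below are illustrative assumptions, not values from the study:

```python
import math

# Sketch: annual probability of failure as the convolution of a fragility
# curve with a seismic hazard curve. All parameters are assumed for
# illustration (lognormal fragility, power-law hazard, PGA in gal).

def fragility(a, median=600.0, beta=0.4):
    """P(failure | ground motion level a), lognormal fragility curve."""
    z = (math.log(a) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def hazard(a, k0=1e-2, k=2.5, a0=100.0):
    """Annual exceedance frequency of ground motion level a."""
    return k0 * (a / a0) ** (-k)

# Numerically integrate P_f = -integral of fragility(a) dH(a) (trapezoids)
levels = [100.0 * 1.05 ** i for i in range(100)]
pf = 0.0
for a1, a2 in zip(levels, levels[1:]):
    d_h = hazard(a1) - hazard(a2)   # annual frequency of shaking in [a1, a2]
    pf += 0.5 * (fragility(a1) + fragility(a2)) * d_h
print(f"annual probability of failure ~ {pf:.2e}")
```

The same structure holds whichever sampling scheme (MCS, LHS, PEM or FOSM) is used to build the fragility curve itself.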

  11. Comparison of GOME tropospheric NO2 columns with NO2 profiles deduced from ground-based in situ measurements

    Science.gov (United States)

    Schaub, D.; Boersma, K. F.; Kaiser, J. W.; Weiss, A. K.; Folini, D.; Eskes, H. J.; Buchmann, B.

    2006-08-01

    the ground-based NO2 profile is analysed by considering AK information. It is moderate and indicates similar shapes of the profiles for clear sky conditions. Only for large GOME columns, differences between the profile shapes explain the larger part of the relative difference. In contrast, the other error sources give rise to the larger relative differences found towards smaller columns. Further, for the clear sky cases, errors from different sources are found to compensate each other partially. The comparison for cloudy cases indicates a poorer agreement between the columns (n=60, R=0.61). The mean relative difference between the columns is 60% with a standard deviation of 118% and GOME on average overestimating the ground-based columns. The clear improvement after inclusion of AK information (n=60, R=0.87) suggests larger errors in the a priori NO2 profiles under cloudy conditions and demonstrates the importance of using accurate profile information for (partially) clouded scenes.

  12. Methodology of safety evaluation about land disposal of low level radioactive wastes

    International Nuclear Information System (INIS)

    Suzuki, Atsuyuki

    1986-01-01

With the progress of the project to construct low-level radioactive waste storage facilities in Aomori Prefecture, full-scale land disposal of low-level radioactive wastes is also approaching in Japan. This report discusses the scientific methodology for explaining the safety of the land disposal of low-level radioactive wastes. Land disposal of general wastes by shallow burial already has a substantial track record, and shallow land burial is likewise being considered for low-level radioactive wastes. Low-level radioactive wastes can be regarded as one form of industrial waste, since the scientific and theoretical bases of their safety have much in common. The greatest attention is paid to the contamination of ground water. Low-level radioactive wastes are solid wastes, so the degree of contamination should be lower. The space in which ground water exists, the phenomena of ground water movement, ground water dispersion and Fick's law, the adsorption effect of strata, and the evaluation of the source term are explained. These constitute the method for analyzing the degree of contamination from a safety evaluation viewpoint. (Kako, I.)

  13. Economic impact of the energy transition at the local level. Methodologies and case studies

    International Nuclear Information System (INIS)

    Maurer, Christiane; Ustinskaya, Elina

    2014-12-01

An understanding of the mechanisms that link the energy transition to the stimulation of the economy on the ground is indispensable for more efficient targeting of local energy transition policies. Objective evidence of economic results is required, as is a better understanding of economic analysis methods, in order to equip local and regional authorities with the tools required to demonstrate the economic benefits for all the key players. With the 'Economic impact of the energy transition at a local level - Methodologies and case studies' study, Energy Cities illustrates numerous socio-economic aspects of the energy transition and confirms the local economic benefits, through the use of data. As with any complex, new and diffuse process, this field of research is still not fully understood. Appropriate, high-quality analysis at the local level has rarely been carried out to date, and a critical assessment of the methods used is necessary. This study fills a gap and provides suggestions on potential areas for further research in a range of fields which should be further explored and examined in collaboration with local authorities and voluntary partners. Methodology and results: the first part of the study sets out its scope and contains a general description of the impact observed and the potential outcome of transition policies in the main green growth sectors: eco-innovation, the energy performance of buildings, renewable energy, sustainable mobility, recycling and industrial ecology. The analysis then focuses on feedback from six towns and regions with a policy of active sustainable development (building renovation, support for eco-industries, promotion of soft mobility etc.), presented from the point of view of the economic results observed and the evaluation methods used.
The study covers five European authorities (Brussels, Copenhagen, Hannover, Kirklees and the Greater Paris Region) and one North American authority (Nolan County)

  14. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  15. CASE-BASED PRODUCT CONFIGURATION AND REUSE IN MASS CUSTOMIZATION

    Institute of Scientific and Technical Information of China (English)

    Wang Shiwei; Tan Jianrong; Zhang Shuyou; Wang Xin; He Chenqi

    2004-01-01

The increasing complexity and size of configuration knowledge bases require advanced methods supporting the development of the actual configuration process and design reuse. A new framework for a feasible and practical product configuration method in mass customization is presented. The basic idea of the approach is to integrate case-based reasoning (CBR) with a constraint satisfaction problem (CSP). A similarity measure between a crisp value and a range, a situation common in case retrieval, is also given. Based on the configuration model, a product platform and customer needs, case adaptation is carried out with a repair-based algorithm. Finally, the methodology is tested in the elevator configuration design domain.
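The abstract mentions a similarity measure between a crisp query value and a stored range but does not spell it out; one plausible formulation for case retrieval might look like this (both the formula and the elevator case base below are hypothetical, not taken from the paper):

```python
def crisp_range_similarity(value, low, high):
    """Similarity between a crisp query value and a stored range [low, high].

    Returns 1.0 inside the range and decays linearly with the distance to
    the nearest bound, normalised by the range width. This is one plausible
    formulation, not necessarily the measure used in the paper.
    """
    if low <= value <= high:
        return 1.0
    width = max(high - low, 1e-9)           # guard against degenerate ranges
    dist = (low - value) if value < low else (value - high)
    return max(0.0, 1.0 - dist / width)

# Retrieving the elevator configuration case whose rated-load range best
# matches a customer requirement of 1100 kg (hypothetical case base):
cases = {"caseA": (800, 1000), "caseB": (1000, 1250), "caseC": (1250, 1600)}
best = max(cases, key=lambda c: crisp_range_similarity(1100, *cases[c]))
print(best)  # caseB
```

The retrieved case would then be adapted (here, by the repair-based algorithm the abstract names) until it satisfies the CSP constraints.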

  16. Hydrogeology, simulated ground-water flow, and ground-water quality, Wright-Patterson Air Force Base, Ohio

    Science.gov (United States)

    Dumouchelle, D.H.; Schalk, C.W.; Rowe, G.L.; De Roche, J.T.

    1993-01-01

Ground water is the primary source of water in the Wright-Patterson Air Force Base area. The aquifer consists of glacial sands and gravels that fill a buried bedrock-valley system. Consolidated rocks in the area consist of poorly permeable Ordovician shale of the Richmondian stage and, in the upland areas, the Brassfield Limestone of Silurian age. The valleys are filled with glacial sediments of Wisconsinan age consisting of clay-rich tills and coarse-grained outwash deposits. Estimates of the hydraulic conductivity of the shales, based on results of displacement/recovery tests, range from 0.0016 to 12 feet per day; estimates for the glacial sediments range from less than 1 foot per day to more than 1,000 feet per day. Ground water flows from the uplands towards the valleys and the major rivers in the region, the Great Miami and the Mad Rivers. Hydraulic-head data indicate that ground water flows between the bedrock and unconsolidated deposits. Data from a gain/loss study of the Mad River system and hydrographs from nearby wells reveal that the reach of the river next to Wright-Patterson Air Force Base is a ground-water discharge area. A steady-state, three-dimensional ground-water-flow model was developed to simulate ground-water flow in the region. The model contains three layers and encompasses about 100 square miles centered on Wright-Patterson Air Force Base. Ground water enters the modeled area primarily by river leakage and underflow at the model boundary. Ground water exits the modeled area primarily by flow through the valleys at the model boundaries and through production wells. A model sensitivity analysis involving systematic changes in values of hydrologic parameters indicates that the model is most sensitive to decreases in riverbed conductance and in the vertical conductance between the upper two layers. The analysis also indicates that the contribution of water to the buried-valley aquifer from the bedrock that forms the valley walls is about 2 to 4

  17. Correlation between the Ship Grounding Accident and the Ship Traffic – A Case Study Based on the Statistics of the Gulf of Finland

    Directory of Open Access Journals (Sweden)

    Arsham Mazaheri

    2013-03-01

Ship traffic is one of the factors present in almost all existing grounding models, and it is considered one of the factors affecting the likelihood of a grounding accident. This effect is mostly accepted by experts as common sense, or simply by generalizing ship-ship collision cases to grounding accidents; there is no available research in the literature on an actual causal link between ship traffic and grounding accidents. In this paper, the authors apply statistical analysis to historical grounding accident data from the Gulf of Finland between 1989 and 2010, and to AIS data for the same area from 2010 as the source of ship traffic data, to investigate the possible existence of a correlation between ship traffic and grounding accidents. The results show that for the studied area (the Gulf of Finland) there is no correlation between traffic density and grounding accidents. However, the results suggest the possibility of a minor relation between traffic distribution and grounding accidents. This finding needs further investigation for clarification.
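The kind of statistical check described in this abstract, correlating grounding counts against traffic density across sub-areas, can be sketched with a plain Pearson correlation; the data below are fabricated placeholders, not the Gulf of Finland statistics:

```python
import math

# Minimal sketch of a traffic-vs-grounding correlation test.
# The per-sub-area counts are fabricated placeholders.

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

traffic = [120, 340, 560, 210, 430]   # vessels/day per sub-area (assumed)
groundings = [3, 2, 4, 3, 2]          # grounding count per sub-area (assumed)
r = pearson_r(traffic, groundings)
print(f"r = {r:.2f}")
```

An |r| close to zero on real data would mirror the paper's finding of no correlation between traffic density and groundings.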

  18. Impacts of Jatropha-based biodiesel production on above and below-ground carbon stocks: A case study from Mozambique

    International Nuclear Information System (INIS)

    Vang Rasmussen, Laura; Rasmussen, Kjeld; Bech Bruun, Thilde

    2012-01-01

The need to mitigate climate change makes production of liquid biofuels a high priority. Substituting fossil fuels by biodiesel produced from Jatropha curcas has gained widespread attention as Jatropha cultivation is claimed to offer greenhouse gas emission reductions. Farmers respond worldwide to this increasing demand by converting forests into Jatropha, but whether Jatropha-based biodiesel offers carbon savings depends on the carbon emissions that occur when land use is changed to Jatropha. This paper provides an impact assessment of a small-scale Jatropha project in Cabo Delgado, Mozambique. The paper outlines the estimated impacts on above and below-ground carbon stocks when land use is changed to increase Jatropha production. The results show that expansion of Jatropha production will most likely lead to the conversion of miombo forest areas to Jatropha, which implies a reduction in above and below-ground carbon stocks. The carbon debts created by the land use change can be repaid by replacing fossil fuels with Jatropha-based biodiesel. A repayment time of almost two centuries is found with optimistic estimates of the carbon debt, while the use of pessimistic values results in a repayment time that approaches the millennium. - Highlights: ► Demands for biofuels make production of Jatropha-based biodiesel a priority. ► Farmers in Northern Mozambique are likely to convert un-logged miombo to Jatropha. ► Converting miombo to Jatropha creates reductions in above and below-ground carbon. ► It takes 187–966 years to repay emissions from above and below-ground carbon stocks.

  19. Modal-pushover-based ground-motion scaling procedure

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2011-01-01

Earthquake engineering is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system, whose properties are determined by first-mode pushover analysis. Appropriate for first-mode-dominated structures, this approach is extended to structures with significant higher-mode contributions by considering the elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.

  20. OCL-BASED TEST CASE GENERATION USING CATEGORY PARTITIONING METHOD

    Directory of Open Access Journals (Sweden)

    A. Jalila

    2015-10-01

The adoption of fault detection techniques during the initial stages of the software development life cycle helps improve the reliability of a software product. Specification-based testing is one of the major approaches to detecting faults in the requirement specification or design of a software system. However, due to the non-availability of implementation details, test case generation from formal specifications becomes a challenging task. As a novel approach, the proposed work presents a methodology to generate test cases from OCL (Object Constraint Language) formal specifications using the Category Partitioning Method (CPM). The experimental results indicate that the proposed methodology is more effective in revealing specification-based faults. Furthermore, it has been observed that OCL and CPM form an excellent combination for performing functional testing at the earliest stage to improve software quality at reduced cost.
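The Category Partitioning Method step at the core of this approach, enumerating category/choice combinations and filtering infeasible frames, can be sketched as follows; the categories and the constraint below are hypothetical (a withdraw-style operation), not taken from the paper:

```python
import itertools

# Sketch of the Category Partitioning Method: categories and choices for a
# hypothetical withdraw(amount) operation constrained by an OCL-style
# precondition, with infeasible combinations filtered by a simple constraint.

categories = {
    "amount":  ["negative", "zero", "within_balance", "exceeds_balance"],
    "account": ["open", "frozen"],
}

def feasible(frame):
    # Constraint: a frozen account rejects every request, so only one
    # representative amount choice is kept for it.
    if frame["account"] == "frozen" and frame["amount"] != "within_balance":
        return False
    return True

frames = [
    dict(zip(categories, combo))
    for combo in itertools.product(*categories.values())
]
test_frames = [f for f in frames if feasible(f)]
print(len(frames), "raw frames ->", len(test_frames), "test frames")
```

Each surviving frame then becomes one concrete test case aimed at the corresponding clause of the specification.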

  1. Climate Change Risks – Methodological Framework and Case Study of Damages from Extreme Events in Cambodia

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Trærup, Sara Lærke Meltofte

    2016-01-01

Climate change imposes special risks on Least Developed Countries, and the chapter presents a methodological framework which can be used to assess the impacts of key assumptions related to damage costs, risks and equity implications on current and future generations. The methodological framework is applied to a case study of severe storms in Cambodia, based on statistical information on past storm events, including information about buildings damaged and victims. Although limited data are available on the probability of severe storm events under climate change, as well as on the actual damage costs associated with such events in the case of Cambodia, we use past storm events as proxy data in a sensitivity analysis. It is demonstrated how key assumptions on future climate change, income levels of victims, and income distribution over time, reflected in discount rates...

  2. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.

  3. Grounded Theory as a Qualitative Research Methodology in Nursing

    Directory of Open Access Journals (Sweden)

    Cristina G. Vivar

    2010-12-01

    Full Text Available Grounded theory is a qualitative research design, recognized at the international level, that has been used to develop theories about relevant health phenomena. In the Spanish nursing context, however, grounded theory has received very little attention. This article therefore focuses on this qualitative methodology, illustrates its contribution to nursing research in Spain and its usefulness for nursing, and briefly presents the distinctive methodological characteristics of grounded theory.

  4. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
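The first of the four methods, geometric-mean scaling of a pair of ground motions, amounts to applying one factor to both components so that the geometric mean of their spectral accelerations matches a target value. A minimal sketch, with invented spectral values:

```python
import math

def geomean_scale_factor(sa1, sa2, sa_target):
    """Scale factor applied to BOTH records of a pair so that the geometric
    mean of their spectral accelerations matches the target value."""
    gm = math.sqrt(sa1 * sa2)
    return sa_target / gm

# Invented example: two horizontal components with Sa(T1) of 0.2 g and 0.45 g,
# and a target spectral acceleration of 0.3 g at the first-mode period T1.
f = geomean_scale_factor(0.2, 0.45, 0.3)
print(round(f, 3))  # 1.0  (geometric mean of 0.2 and 0.45 is exactly 0.3)
```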

  5. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    Science.gov (United States)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile methodology for software development. As one of the first few projects to implement the Agile methodology at NASA GSFC, we present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.

  6. Case Study Methodology: Flexibility, Rigour, and Ethical Considerations for the Scholarship of Teaching and Learning

    Directory of Open Access Journals (Sweden)

    Marion L. Pearson

    2015-12-01

    Full Text Available Individuals and teams engaging in the scholarship of teaching and learning (SoTL in multidisciplinary higher education settings must make decisions regarding choice of research methodology and methods. These decisions are guided by the research context and the goals of the inquiry. With reference to our own recent experiences investigating pedagogical and curricular practices in a pharmacy program, we outline case study methodology as one of the many options available for SoTL inquiry. Case study methodology has the benefits of flexibility in terms of the types of research questions that can be addressed and the data collection methods that can be employed. Conducted with proper attention to the context of the case(s) selected, ethical treatment of participants, and data management, case studies also have the necessary rigour to be credible and generalizable. In the matter of generalization, however, we recommend that the readers of a case study draw their own conclusions about the applicability of the findings to other settings.

  7. Hanford Ground-Water Data Base management guide

    International Nuclear Information System (INIS)

    Rieger, J.T.; Mitchell, P.J.; Muffett, D.M.; Fruland, R.M.; Moore, S.B.; Marshall, S.M.

    1990-02-01

    This guide describes the Hanford Ground-Water Data Base (HGWDB), a computerized data base used to store hydraulic head, sample analytical, temperature, geologic, and well-structure information for ground-water monitoring wells on the Hanford Site. These data are stored for the purpose of data retrieval for report generation and also for historical purposes. This guide is intended as an aid to the data base manager and the various staff authorized to enter and verify data, maintain the data base, and maintain the supporting software. This guide focuses on the structure of the HGWDB, providing a fairly detailed description of the programs, files, and parameters. Data-retrieval instructions for the general user of the HGWDB will be found in the HGWDB User's Manual. 6 figs
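The kinds of storage and retrieval the guide describes can be illustrated with a toy relational sketch. The real HGWDB's software and schema are not described in this abstract; the table, columns and well IDs below are invented, and SQLite merely stands in for whatever database engine was actually used.

```python
import sqlite3

# Toy schema loosely modeled on the record types the guide lists
# (hydraulic head, analytical samples, temperature, well structure).
# Table, column and well names are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE head (
    well_id TEXT, measured_on TEXT, head_m REAL)""")
con.executemany("INSERT INTO head VALUES (?, ?, ?)", [
    ("299-W10-1", "1989-06-01", 121.3),
    ("299-W10-1", "1989-12-01", 120.9),
    ("699-S3-25", "1989-06-01", 118.4),
])

# A typical retrieval for report generation: the latest head per well.
# (SQLite returns the bare column from the row that holds the MAX.)
rows = con.execute("""SELECT well_id, MAX(measured_on), head_m
                      FROM head GROUP BY well_id""").fetchall()
print(rows)
```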

  8. Safety assessment of a vault-based disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    Kelly, E.; Kim, C.-L.; Lietava, P.; Little, R.; Simon, I.

    2002-01-01

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-term Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to test the ISAM safety assessment methodology using data that were as realistic as possible. One of the Test Cases, the Vault Test Case (VTC), related to the disposal of low level radioactive waste (LLW) in a hypothetical facility comprising a set of above-surface vaults. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the VTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  9. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  10. Use of Comparative Case Study Methodology for US Public Health Policy Analysis: A Review.

    Science.gov (United States)

    Dinour, Lauren M; Kwan, Amy; Freudenberg, Nicholas

    There is growing recognition that policies influence population health, highlighting the need for evidence to inform future policy development and reform. This review describes how comparative case study methodology has been applied to public health policy research and discusses the methodology's potential to contribute to this evidence. English-language, peer-reviewed articles published between 1995 and 2012 were sought from 4 databases. Articles were included if they described comparative case studies addressing US public health policy. Two researchers independently assessed the 20 articles meeting review criteria. Case-related characteristics and research design tactics utilized to minimize threats to reliability and validity, such as the use of multiple sources of evidence and a case study protocol, were extracted from each article. Although comparative case study methodology has been used to analyze a range of public health policies at all stages and levels, articles reported an average use of only 3.65 (out of 10) research design tactics. By expanding the use of accepted research design tactics, public health policy researchers can contribute to expanding the evidence needed to advance health-promoting policies.

  11. Spent coffee grounds-based activated carbon preparation for sequestering of malachite green

    Science.gov (United States)

    Lim, Jun-Wei; Lam, Keat-Ying; Bashir, Mohammed J. K.; Yeong, Yin-Fong; Lam, Man-Kee; Ho, Yeek-Chia

    2016-11-01

    The key aim of the reported work was to optimize the fabrication factors of spent coffee grounds-based activated carbon (SCG-bAC) used to sequester Malachite Green (MG) from aqueous solution via an adsorption process. The fabrication factors of impregnation ratio with ortho-phosphoric acid, activation temperature and activation time were simultaneously optimized by central composite design (CCD) of response surface methodology (RSM), targeting maximum removal of MG. At the optimum condition, 96.3% of MG was successfully removed by SCG-bAC at an impregnation ratio with ortho-phosphoric acid of 0.50, an activation temperature of 554°C and an activation time of 31.4 min. A statistical model that could predict the MG removal percentage was also derived and statistically confirmed to be significant. Subsequently, the MG adsorption equilibrium data were found to be well fitted by the Langmuir isotherm model, indicating the predominance of monolayer adsorption of MG on the SCG-bAC surface. To conclude, the findings from this study unveil the potential of spent coffee grounds as an alternative precursor for fabricating low-cost activated carbon for the treatment of wastewater loaded with MG pollutant.
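The Langmuir fit mentioned above can be sketched with the common linearized form Ce/qe = Ce/qm + 1/(qm·KL): plotting Ce/qe against Ce gives slope 1/qm and intercept 1/(qm·KL). The data points below are synthetic, not the paper's measurements:

```python
# Linearized Langmuir fit: Ce/qe = Ce/qm + 1/(qm*KL).
# Regressing Ce/qe on Ce gives slope 1/qm and intercept 1/(qm*KL).
# The equilibrium data are synthetic; only the method mirrors the paper.

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Synthetic equilibrium data generated from qm = 50 mg/g, KL = 0.2 L/mg.
qm_true, kl_true = 50.0, 0.2
ce = [1.0, 2.0, 5.0, 10.0, 20.0]                              # mg/L
qe = [qm_true * kl_true * c / (1 + kl_true * c) for c in ce]  # mg/g

slope, intercept = linfit(ce, [c / q for c, q in zip(ce, qe)])
qm = 1 / slope
kl = slope / intercept
print(round(qm, 1), round(kl, 2))  # 50.0 0.2 (recovers the generating values)
```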

  12. Comparison of GOME tropospheric NO2 columns with NO2 profiles deduced from ground-based in situ measurements

    Directory of Open Access Journals (Sweden)

    D. Schaub

    2006-01-01

    priori and the ground-based NO2 profile is analysed by considering AK information. It is moderate and indicates similar shapes of the profiles for clear sky conditions. Only for large GOME columns do differences between the profile shapes explain the larger part of the relative difference. In contrast, the other error sources give rise to the larger relative differences found towards smaller columns. Further, for the clear sky cases, errors from different sources are found to partially compensate each other. The comparison for cloudy cases indicates a poorer agreement between the columns (n=60, R=0.61). The mean relative difference between the columns is 60% with a standard deviation of 118%, with GOME on average overestimating the ground-based columns. The clear improvement after inclusion of AK information (n=60, R=0.87) suggests larger errors in the a priori NO2 profiles under cloudy conditions and demonstrates the importance of using accurate profile information for (partially) clouded scenes.
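The averaging-kernel comparison described above typically weights each layer of the ground-based profile by the satellite retrieval's column averaging kernel before comparing columns. A minimal sketch with invented layer amounts (the actual GOME AKs and profiles are not reproduced here):

```python
# Applying averaging-kernel (AK) information before comparing a ground-based
# NO2 profile with a satellite column retrieval:
#   column_smoothed = sum_i ak_i * x_i   (column AK formulation)
# Layer amounts and AK values are invented; units are arbitrary partial columns.

def smoothed_column(partial_columns, averaging_kernel):
    """Weight each layer's partial column by the retrieval's column AK."""
    return sum(a * x for a, x in zip(averaging_kernel, partial_columns))

profile = [4.0, 2.5, 1.0, 0.5]   # partial columns, surface to top
ak = [0.6, 0.9, 1.1, 1.0]        # column averaging kernel per layer

direct = sum(profile)            # simple (unweighted) column
smooth = smoothed_column(profile, ak)
print(direct, smooth)  # 8.0 6.25 (reduced surface sensitivity lowers the column)
```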

  13. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    Science.gov (United States)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value for constructing an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires significant effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework, to assist and facilitate its implementation. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  14. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  15. Ground-based telescope pointing and tracking optimization using a neural controller.

    Science.gov (United States)

    Mancini, D; Brescia, M; Schipani, P

    2003-01-01

    Neural network models (NNs) have emerged as important components for applications of adaptive control theories. Their basic generalization capability, based on acquired knowledge, together with execution rapidity and the ability to correlate input stimuli, are basic attributes that make NNs an extremely powerful tool for on-line control of complex systems. From a control system point of view, not only accuracy and speed but also, in some cases, a high level of adaptation capability is required in order to match all working phases of the whole system during its lifetime. This is particularly true for a new generation ground-based telescope control system. In fact, strong changes in terms of system speed and instantaneous position error tolerance are necessary, especially in the case of trajectory disturbance induced by wind shake. The classical control scheme adopted in such a system is based on the proportional integral (PI) filter, already applied and implemented on a large number of new generation telescopes and considered a standard in this technological environment. In this paper we introduce the concept of a new approach, the neural variable structure proportional integral (NVSPI), related to the implementation of a standard multilayer perceptron network in new generation ground-based Alt-Az telescope control systems. Its main purpose is to improve the adaptive capability of the variable structure proportional integral model, an innovative control scheme recently introduced by the authors [Proc SPIE (1997)], based on a modified version of the classical PI control model, in terms of flexibility and accuracy of the dynamic response range, also in the presence of wind noise effects.
The realization of a powerful, well tested and validated telescope model simulation system made it possible to directly compare the performance of the two control schemes on simulated tracking trajectories, revealing extremely encouraging results in terms of NVSPI control robustness and
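The classical PI filter that both control schemes build on can be sketched as a discrete-time loop; the gains, timestep and first-order plant below are invented for illustration and are not the telescope's actual dynamics:

```python
# Discrete-time PI position loop of the kind the NVSPI scheme builds on.
# Gains, timestep and the toy first-order plant are invented for illustration.

def pi_step(error, integral, kp, ki, dt):
    """One PI update: returns (command, updated integral of the error)."""
    integral += error * dt
    return kp * error + ki * integral, integral

# Track a fixed pointing setpoint with a trivial first-order plant.
kp, ki, dt = 2.0, 1.0, 0.01
setpoint, position, integral = 1.0, 0.0, 0.0
for _ in range(2000):  # 20 s of simulated time
    u, integral = pi_step(setpoint - position, integral, kp, ki, dt)
    position += u * dt  # toy plant: velocity proportional to command

print(abs(setpoint - position) < 1e-3)  # True: integral action removes the error
```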

  16. A multi-sensor study of the impact of ground-based glaciogenic seeding on orographic clouds and precipitation

    Science.gov (United States)

    Pokharel, Binod

    This dissertation examines reflectivity data from three different radar systems, as well as airborne and ground-based in situ particle imaging data, to study the impact of ground-based glaciogenic seeding on orographic clouds and precipitation formed over the mountains in southern Wyoming. The data for this study come from the AgI Seeding Cloud Impact Investigation (ASCII) field campaign conducted over the Sierra Madre mountains in 2012 (ASCII-12) and over the Medicine Bow mountains in 2013 (ASCII-13) in the context of the Wyoming Weather Modification Pilot Project (WWMPP). The campaigns were supported by a network of ground-based instruments, including a microwave radiometer, two profiling Ka-band Micro Rain Radars (MRRs), a Doppler on Wheels (DOW), rawinsondes, a Cloud Particle Imager, and a Parsivel disdrometer. The University of Wyoming King Air with profiling Wyoming Cloud Radar (WCR) conducted nine successful flights in ASCII-12, and eight flights in ASCII-13. WCR profiles from these flights are combined with those from seven other flights, which followed the same geographically-fixed pattern in 2008-09 (pre-ASCII) over the Medicine Bow range. All sampled storms were relatively shallow, with low-level air forced over the target mountain, and cold enough to support ice initiation by silver iodide (AgI) nuclei in cloud. Three detailed case studies are conducted, each with different atmospheric conditions and different cloud and snow growth properties: one case (21 Feb 2012) is stratiform, with strong winds and cloud droplets too small to enable snow growth by accretion (riming). A second case (13 Feb 2012) contains shallow convective cells. Clouds in the third case study (22 Feb 2012) are stratiform but contain numerous large droplets (mode ~35 μm in diameter), large enough for ice particle growth by riming.
These cases and all others, each with a treated period following an untreated period, show that a clear seeding signature is not immediately apparent

  17. Hybrid Risk Management Methodology: A Case Study

    Directory of Open Access Journals (Sweden)

    Jacky Siu-Lun Ting

    2009-10-01

    Full Text Available Risk management is a decision-making process involving considerations of political, social, economic and engineering factors, with relevant risk assessments relating to a potential hazard. In the last decade, a number of risk management tools have been introduced and employed to manage and minimize the uncertainty and threats to organizations. However, the focuses of these methodologies differ, so companies need to adopt various risk management principles to visualize a full picture of the organizational risk level. In this regard, this paper presents a new approach to risk management that integrates Hierarchical Holographic Modeling (HHM), Enterprise Risk Management (ERM) and Business Recovery Planning (BCP) for identifying and assessing risks as well as managing the consequences of realized residual risks. To illustrate the procedures of the proposed methodology, a logistics company, ABC Limited, is chosen to serve as a case study. Through applying HHM and ERM to investigate and assess the risk, ABC Limited can better evaluate the potential risks and then take responsive actions (e.g. BCP) to handle risks and crises in the near future.
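A toy example of the kind of likelihood-impact ranking an ERM assessment produces (risk names and scores are invented; the paper's actual HHM/ERM procedures are far richer than this):

```python
# Toy ERM-style risk ranking: score = likelihood x impact, highest first.
# Risk names and scores are invented; only the ranking idea is illustrated.
risks = [
    ("warehouse fire",   0.1, 9),  # (name, likelihood 0-1, impact 1-10)
    ("shipment delay",   0.6, 4),
    ("IT system outage", 0.3, 7),
]
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, p, i in ranked:
    print(f"{name}: {p * i:.2f}")  # highest expected-impact risk first
```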

  18. Subtropical and Polar Cirrus Clouds Characterized by Ground-Based Lidars and CALIPSO/CALIOP Observations

    Directory of Open Access Journals (Sweden)

    Córdoba-Jabonero Carmen

    2016-01-01

    Full Text Available Cirrus clouds are a product of weather processes, and their occurrence and macrophysical/optical properties can therefore vary significantly over different regions of the world. Lidars can provide height-resolved measurements with relatively good vertical and temporal resolution, making them the most suitable instrumentation for high-cloud observations. The aim of this work is to show the potential of lidar observations for Cirrus cloud detection, in combination with a recently proposed methodology to retrieve Cirrus cloud macrophysical and optical features. In this sense, a few case studies of Cirrus clouds observed at both subtropical and polar latitudes are examined and compared to CALIPSO/CALIOP observations. Lidar measurements are carried out at two stations: the metropolitan city of Sao Paulo (MSP, Brazil, 23.3°S 46.4°W), located at subtropical latitudes, and the Belgrano II base (BEL, Argentina, 78°S 35°W) on the Antarctic continent. Optical (COD, cloud optical depth, and LR, lidar ratio) and macrophysical (top/base heights and thickness) properties of both the subtropical and polar Cirrus clouds are reported. In general, subtropical Cirrus clouds present lower LR values and are found at higher altitudes than those detected at polar latitudes, and Cirrus clouds are detected at similar altitudes by CALIOP. However, poor agreement is achieved between the LR retrieved from ground-based lidars and from space-borne CALIOP measurements, likely due to the use of a fixed (or low-variability) LR value in CALIOP inversion procedures.

  19. Introducing the VISAGE project - Visualization for Integrated Satellite, Airborne, and Ground-based data Exploration

    Science.gov (United States)

    Gatlin, P. N.; Conover, H.; Berendes, T.; Maskey, M.; Naeger, A. R.; Wingo, S. M.

    2017-12-01

    A key component of NASA's Earth observation system is its field experiments, for intensive observation of particular weather phenomena, or for ground validation of satellite observations. These experiments collect data from a wide variety of airborne and ground-based instruments, on different spatial and temporal scales, often in unique formats. The field data are often used with high volume satellite observations that have very different spatial and temporal coverage. The challenges inherent in working with such diverse datasets make it difficult for scientists to rapidly collect and analyze the data for physical process studies and validation of satellite algorithms. The newly-funded VISAGE project will address these issues by combining and extending nascent efforts to provide on-line data fusion, exploration, analysis and delivery capabilities. A key building block is the Field Campaign Explorer (FCX), which allows users to examine data collected during field campaigns and simplifies data acquisition for event-based research. VISAGE will extend FCX's capabilities beyond interactive visualization and exploration of coincident datasets, to provide interrogation of data values and basic analyses such as ratios and differences between data fields. The project will also incorporate new, higher level fused and aggregated analysis products from the System for Integrating Multi-platform data to Build the Atmospheric column (SIMBA), which combines satellite and ground-based observations into a common gridded atmospheric column data product; and the Validation Network (VN), which compiles a nationwide database of coincident ground- and satellite-based radar measurements of precipitation for larger scale scientific analysis. The VISAGE proof-of-concept will target "golden cases" from Global Precipitation Measurement Ground Validation campaigns. This presentation will introduce the VISAGE project, initial accomplishments and near term plans.

  20. Wavelet-based ground vehicle recognition using acoustic signals

    Science.gov (United States)

    Choe, Howard C.; Karlsen, Robert E.; Gerhart, Grant R.; Meitzler, Thomas J.

    1996-03-01

    We present, in this paper, a wavelet-based acoustic signal analysis to remotely recognize military vehicles using their sound intercepted by acoustic sensors. Since expedited signal recognition is imperative in many military and industrial situations, we developed an algorithm that provides an automated, fast signal recognition once implemented in a real-time hardware system. This algorithm consists of wavelet preprocessing, feature extraction and compact signal representation, and a simple but effective statistical pattern matching. The current status of the algorithm does not require any training. The training is replaced by human selection of reference signals (e.g., squeak or engine exhaust sound) distinctive to each individual vehicle based on human perception. This allows a fast archiving of any new vehicle type in the database once the signal is collected. The wavelet preprocessing provides time-frequency multiresolution analysis using discrete wavelet transform (DWT). Within each resolution level, feature vectors are generated from statistical parameters and energy content of the wavelet coefficients. After applying our algorithm on the intercepted acoustic signals, the resultant feature vectors are compared with the reference vehicle feature vectors in the database using statistical pattern matching to determine the type of vehicle from where the signal originated. Certainly, statistical pattern matching can be replaced by an artificial neural network (ANN); however, the ANN would require training data sets and time to train the net. Unfortunately, this is not always possible for many real world situations, especially collecting data sets from unfriendly ground vehicles to train the ANN. Our methodology using wavelet preprocessing and statistical pattern matching provides robust acoustic signal recognition. We also present an example of vehicle recognition using acoustic signals collected from two different military ground vehicles. 
In this paper, we will
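The pipeline described above (DWT, per-level energy features, statistical pattern matching) can be sketched with a hand-rolled Haar transform and nearest-reference matching; the Haar wavelet, the distance measure and the synthetic "signals" below are stand-ins for the authors' actual choices:

```python
# Minimal sketch of the described pipeline: Haar DWT, per-level energy
# features, then nearest-reference matching. All signals are synthetic.
import math

def haar_dwt(signal):
    """Return detail coefficients per level (Haar wavelet, length 2**n input)."""
    levels, approx = [], list(signal)
    while len(approx) > 1:
        s2 = math.sqrt(2)
        detail = [(approx[i] - approx[i + 1]) / s2 for i in range(0, len(approx), 2)]
        approx = [(approx[i] + approx[i + 1]) / s2 for i in range(0, len(approx), 2)]
        levels.append(detail)
    return levels

def energy_features(signal):
    """Feature vector: energy of the detail coefficients at each level."""
    return [sum(c * c for c in lvl) for lvl in haar_dwt(signal)]

def classify(signal, references):
    """Nearest reference by Euclidean distance between feature vectors."""
    feats = energy_features(signal)
    def dist(name):
        return sum((a - b) ** 2
                   for a, b in zip(feats, energy_features(references[name])))
    return min(references, key=dist)

# Synthetic "vehicle sounds": fast vs slow oscillation, 8 samples each.
refs = {"tracked": [0, 1, 0, -1, 0, 1, 0, -1],
        "wheeled": [0, 1, 1, 1, 0, -1, -1, -1]}
print(classify([0, 1, 0, -1, 0, 1, 0, -1], refs))  # tracked
```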

  1. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  2. Preparing for budget-based payment methodologies: global payment and episode-based payment.

    Science.gov (United States)

    Hudson, Mark E

    2015-10-01

    Use of budget-based payment methodologies (capitation and episode-based bundled payment) has been demonstrated to drive value in healthcare delivery. With a focus on high-volume, high-cost surgical procedures, inclusion of anaesthesiology services in these methodologies is likely. This review provides a summary of budget-based payment methodologies and practical information necessary for anaesthesiologists to prepare for participation in these programmes. Although few examples of anaesthesiologists' participation in these models exist, an understanding of the structure of these programmes and opportunities for participation are available. Prospective preparation in developing anaesthesiology-specific bundled payment profiles and early participation in pathway development associated with selected episodes of care are essential for successful participation as a gainsharing partner. With significant opportunity to contribute to care coordination and cost management, anaesthesiology can play an important role in budget-based payment programmes and should expect to participate as full gainsharing partners. Precise costing methodologies and accurate economic modelling, along with identification of quality management and cost control opportunities, will help identify participation opportunities and appropriate payment and gainsharing agreements. Anaesthesiology-specific examples with budget-based payment models are needed to help guide increased participation in these programmes.

  3. Kepler and Ground-Based Transits of the exo-Neptune HAT-P-11b

    Science.gov (United States)

    Deming, Drake; Sada, Pedro V.; Jackson, Brian; Peterson, Steven W.; Agol, Eric; Knutson, Heather A.; Jennings, Donald E.; Haase, Plynn; Bays, Kevin

    2011-01-01

    We analyze 26 archival Kepler transits of the exo-Neptune HAT-P-11b, supplemented by ground-based transits observed in the blue (B band) and near-IR (J band). Both the planet and host star are smaller than previously believed; our analysis yields Rp = 4.31 ± 0.06 R⊕ and Rs = 0.683 ± 0.009 R☉, both about 3 sigma smaller than the discovery values. Our ground-based transit data at wavelengths bracketing the Kepler bandpass serve to check the wavelength dependence of stellar limb darkening, and the J-band transit provides a precise and independent constraint on the transit duration. Both the limb darkening and transit duration from our ground-based data are consistent with the new Kepler values for the system parameters. Our smaller radius for the planet implies that its gaseous envelope can be less extensive than previously believed, being very similar to the H-He envelope of GJ 436b and Kepler-4b. HAT-P-11 is an active star, and signatures of star spot crossings are ubiquitous in the Kepler transit data. We develop and apply a methodology to correct the planetary radius for the presence of both crossed and uncrossed star spots. Star spot crossings are concentrated at phases −0.002 and +0.006. This is consistent with inferences from Rossiter-McLaughlin measurements that the planet transits nearly perpendicular to the stellar equator. We identify the dominant phases of star spot crossings with active latitudes on the star, and infer that the stellar rotational pole is inclined at about 12 ± 5 deg to the plane of the sky. We point out that precise transit measurements over long durations could in principle allow us to construct a stellar butterfly diagram to probe the cyclic evolution of magnetic activity on this active K-dwarf star.
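As a quick check on the revised radii quoted above, the implied transit depth follows from depth = (Rp/Rs)²; the radius conversions below assume R⊕ = 6371 km and R☉ = 695700 km:

```python
# Transit depth implied by the revised radii: depth = (Rp/Rs)**2.
R_EARTH_KM, R_SUN_KM = 6371.0, 695700.0
rp = 4.31 * R_EARTH_KM   # planet radius from the abstract
rs = 0.683 * R_SUN_KM    # stellar radius from the abstract
depth = (rp / rs) ** 2
print(f"{depth * 100:.2f}%")  # 0.33% flux drop during transit
```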

  4. Case study for ARRA-funded ground-source heat pump (GSHP) demonstration at Oakland University

    Energy Technology Data Exchange (ETDEWEB)

    Im, Piljae [ORNL; Liu, Xiaobing [ORNL

    2015-09-01

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a ground-source variable refrigerant flow (GS-VRF) system installed at the Human Health Building at Oakland University in Rochester, Michigan. This case study is based on the analysis of measured performance data, maintenance records, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning as the demonstrated GS-VRF system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GS-VRF system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GS-VRF system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation, improving the operational efficiency, and reducing the installed cost of similar GSHP systems in the future.

  5. Preliminary energy demand studies for Ireland: base case and high case for 1980, 1985 and 1990

    Energy Technology Data Exchange (ETDEWEB)

    Henry, E W

    1981-01-01

    The framework of the Base Case and the High Case for 1990 for Ireland, related to the demand modules of the medium-term European Communities (EC) Energy Model, is described. The modules are the Multi-national Macro-economic Module (EURECA), the National Input-Output Model (EXPLOR), and the National Energy Demand Model (EDM). The final results of EXPLOR and EDM are described, one set for the Base Case and the other for the High Case. The forecast or projection is termed the Base Case because oil prices are assumed to increase at the same rate as general price inflation. The other forecast is termed the High Case because oil prices are assumed to increase 5% per year more rapidly than general price inflation. The EXPLOR-EDM methodology is described. The lack of data on energy price elasticities for Ireland is noted. A comparison of the Base Case with the High Case is made. (MCW)
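As a minimal numerical illustration of the two price assumptions (the ten-year horizon and the index normalization are ours, not the model's), compound growth gives the real oil price index implied by each case:

```python
def real_oil_price_index(years: int, real_growth: float) -> float:
    """Real (inflation-adjusted) oil price index, base year = 1.0."""
    return (1.0 + real_growth) ** years

base_1990 = real_oil_price_index(10, 0.00)  # Base Case: oil tracks inflation
high_1990 = real_oil_price_index(10, 0.05)  # High Case: +5%/yr above inflation
print(f"Base Case 1990 index: {base_1990:.2f}")
print(f"High Case 1990 index: {high_1990:.2f}")
```

Over a decade the High Case implies a real oil price roughly 63% above the Base Case, which is what drives the divergent demand forecasts.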

  6. Shear wave velocity-based evaluation and design of stone column improved ground for liquefaction mitigation

    Science.gov (United States)

    Zhou, Yanguo; Sun, Zhengbo; Chen, Jie; Chen, Yunmin; Chen, Renpeng

    2017-04-01

    The evaluation and design of stone column improved ground for liquefaction mitigation is a challenging issue for the state of practice. In this paper, a shear wave velocity-based approach is proposed based on the well-defined correlations of liquefaction resistance (CRR), shear wave velocity (Vs), and void ratio (e) of sandy soils, and values of the parameters in this approach are recommended for preliminary design purposes when site-specific values are not available. The detailed procedures of pre- and post-improvement liquefaction evaluations and stone column design are given. According to this approach, the required level of ground improvement is met once the target Vs of the soil is raised high enough (i.e., no less than the critical velocity) to resist the given earthquake loading according to the CRR-Vs relationship; this requirement is then transferred to the control of a target void ratio (i.e., the critical e) according to the Vs-e relationship. As this approach relies on the densification of the surrounding soil rather than the whole improved ground and is conservative by nature, specific considerations of the densification mechanism and effect are given, and the effects of drainage and reinforcement of stone columns are also discussed. A case study of a thermal power plant in Indonesia is introduced, where the effectiveness of stone column improved ground was evaluated by the proposed Vs-based method and compared with the SPT-based evaluation. The improved ground performed well and experienced no liquefaction during subsequent strong earthquakes.
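The critical-velocity step can be sketched numerically. The example below uses the Andrus and Stokoe (2000) form of the CRR-Vs1 curve with clean-sand parameters as a stand-in; the paper's own correlations and recommended parameter values may differ:

```python
def crr_from_vs1(vs1: float, vs1_limit: float = 215.0,
                 a: float = 0.022, b: float = 2.8) -> float:
    """Cyclic resistance ratio from overburden-corrected shear wave velocity (m/s)."""
    if vs1 >= vs1_limit:
        return float("inf")  # above the limiting velocity: treated as non-liquefiable
    return a * (vs1 / 100.0) ** 2 + b * (1.0 / (vs1_limit - vs1) - 1.0 / vs1_limit)

def critical_vs1(csr: float) -> float:
    """Smallest Vs1 whose CRR meets the demand CSR - the 'critical velocity'."""
    vs1 = 100.0
    while crr_from_vs1(vs1) < csr:
        vs1 += 0.5
    return vs1

# Example: a design CSR of 0.20 requires raising Vs1 to roughly this value,
# after which the Vs-e correlation converts the target Vs into a target void ratio.
print(f"critical Vs1 ~ {critical_vs1(0.20):.0f} m/s")
```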

  7. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methods, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
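The shift from a single worst-case sum to a timeline distribution can be illustrated with a simple Monte Carlo sketch. This is an assumed stand-in, not the report's Bayesian algorithm, and the task medians and spreads are hypothetical:

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical path: (median_seconds, lognormal_sigma) for each sequential task.
tasks = [(60, 0.4), (120, 0.5), (45, 0.3), (90, 0.6)]

def sample_path_delay() -> float:
    """One Monte Carlo draw of the total delay along the adversary path."""
    return sum(random.lognormvariate(math.log(med), sig) for med, sig in tasks)

samples = sorted(sample_path_delay() for _ in range(20_000))
p05 = samples[int(0.05 * len(samples))]  # delay the system provides ~95% of the time
print(f"5th-percentile delay: {p05:.0f} s, median: {statistics.median(samples):.0f} s")
```

Reporting a low percentile of the simulated distribution, rather than one discounted worst-case time, is what ties the delay estimate to a stated confidence level.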

  8. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability distribution functions (PDF) of the annual exceedance have been investigated to analyze the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from different input parameter spaces.
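A minimal sketch of how an annual exceedance rate is assembled in a PSHA follows. The recurrence and ground-motion parameters are illustrative only, not the study's code or data:

```python
import math

def gmpe_ln_pga(m: float, r_km: float) -> float:
    """Hypothetical ground-motion model: mean ln(PGA in g) at magnitude m, distance r."""
    return -3.5 + 0.9 * m - 1.2 * math.log(r_km)

SIGMA = 0.6  # aleatory variability of ln(PGA)

def annual_exceedance_rate(pga_g: float, r_km: float = 20.0,
                           a: float = 3.0, b: float = 1.0,
                           m_min: float = 5.0, m_max: float = 7.5,
                           dm: float = 0.1) -> float:
    """Sum rate(m) * P(PGA > target | m, r) over magnitude bins."""
    rate = 0.0
    m = m_min
    while m < m_max:
        # Events per year in [m, m + dm) from Gutenberg-Richter: N(>=m) = 10**(a - b*m)
        n_m = 10 ** (a - b * m) - 10 ** (a - b * (m + dm))
        # Probability the lognormal ground motion exceeds the target level
        z = (math.log(pga_g) - gmpe_ln_pga(m + dm / 2.0, r_km)) / SIGMA
        rate += n_m * 0.5 * math.erfc(z / math.sqrt(2.0))
        m += dm
    return rate

lam = annual_exceedance_rate(0.1)
print(f"annual rate of PGA > 0.1 g: {lam:.4f}")
print(f"50-year exceedance probability: {1.0 - math.exp(-50.0 * lam):.3f}")
```

Evaluating this rate across a grid of PGA levels (e.g., the study's 0.1 g to 0.99 g) yields the hazard curve from which the annual-exceedance CPDF and PDF are derived.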

  9. Foundation Investigation for Ground Based Radar Project-Kwajalein Island, Marshall Islands

    Science.gov (United States)

    1990-04-01

    MISCELLANEOUS PAPER GL-90-5. Foundation Investigation for Ground Based Radar Project -- Kwajalein Island, Marshall Islands, by Donald E. Yule. Results of the foundation investigation for the Ground Based Radar Project, Kwajalein Island, Marshall Islands, are presented. Geophysical tests comprised surface refraction ...

  10. Energy retrofit of commercial buildings. Case study and applied methodology

    Energy Technology Data Exchange (ETDEWEB)

    Aste, N.; Del Pero, C. [Department of Building Environment Science and Technology (BEST), Politecnico di Milano, Via Bonardi 3, 20133 Milan (Italy)

    2013-05-15

    Commercial buildings are responsible for a significant share of the energy requirements of European Union countries. The related consumption for heating, cooling, and lighting is, in most cases, very high and expensive. Since the building stock is renewed by only a very small percentage each year and current trends favor reusing old structures, strategies for improving energy efficiency and sustainability should focus not only on new buildings but also, and especially, on existing ones. Architectural renovation of existing buildings provides an opportunity to enhance their energy efficiency by improving envelopes and energy supply systems. It should also be noted that measures aimed at improving the energy performance of buildings should pay particular attention to the cost-effectiveness of the interventions. In general, there is a lack of well-established methods for retrofitting, but if a case study achieves effective results, the adopted strategies and methodologies can be successfully replicated for similar kinds of buildings. In this paper, an iterative methodology for the energy retrofit of commercial buildings is presented, together with a specific application to an existing office building. The case study is particularly significant because it is placed in an urban climatic context characterized by cold winters and hot summers; consequently, HVAC energy consumption is considerable throughout the year. The analysis and simulations of energy performance before and after the intervention, along with measured data on real energy performance, demonstrate the validity of the applied approach. The specifically developed design and refurbishment methodology presented in this work could also serve as a reference for similar operations.

  11. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    PURPOSE: The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. METHODS: Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. RESULTS: A general pathway through the case studies...

  12. Management Research and Grounded Theory: A review of grounded theory-building approach in organisational and management research.

    Directory of Open Access Journals (Sweden)

    Graham J.J. Kenealy, Ph.D.

    2008-06-01

    Full Text Available Grounded theory is a systematic methodology for the collection and analysis of data which was discovered by Glaser and Strauss in the 1960s. The discovery of this method was first presented to the academic community in their book ‘The Discovery of Grounded Theory’ (1967), which still remains a primary point of reference for those undertaking qualitative research and grounded theory in particular. This powerful research method has become very popular in some research domains; whilst increasing in popularity, it is still less prevalent in the field of organisational and management research, particularly in its original form. This self-reflexive paper sets out to explore possible reasons for this imbalance, taking the discussion into the areas of methodological adaptation and training. It also enters the debate about access to research subjects and provides a succinct argument supporting the notion that grounded theory should simply be viewed as a method that develops empirically grounded conceptual theory.

  13. Description of common methodology used in all case studies, following the ILCD methodology guide and the ISO standards for LCA (ISO 14040 and 14044)

    DEFF Research Database (Denmark)

    Sonesson, Ulf; Anton, Assumpcio; Ohlau, Katrin

    2011-01-01

    This document describes common methodological issues for the case studies within LC-IMPACT. There will be five case studies in three areas performed within the project. The case studies are: Tomatoes, Margarine, Fish, Paper and printing and finally car manufacture and operation. In each case stud...

  14. The business case for condition-based maintenance: a hybrid (non-) financial approach

    NARCIS (Netherlands)

    Tiddens, W.W.; Tinga, T.; Braaksma, A.J.J.; Brouwer, O.; Cepin, Marko; Bris, Radim

    2017-01-01

    Although developing business cases is key for evaluating project success, the costs and benefits of condition-based maintenance (CBM) implementations are often not explicitly defined and evaluated. Using the design science methodology, we developed a hybrid business case approach to help managers

  15. How to do a grounded theory study: a worked example of a study of dental practices.

    Science.gov (United States)

    Sbaraini, Alexandra; Carter, Stacy M; Evans, R Wendell; Blinkhorn, Anthony

    2011-09-09

    Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. We documented a worked example of using grounded theory methodology in practice. We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  16. How to do a grounded theory study: a worked example of a study of dental practices

    Directory of Open Access Journals (Sweden)

    Evans R

    2011-09-01

    Full Text Available Abstract Background Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods We documented a worked example of using grounded theory methodology in practice. Results We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  17. Comparison of the characteristic energy of precipitating electrons derived from ground-based and DMSP satellite data

    Directory of Open Access Journals (Sweden)

    M. Ashrafi

    2005-01-01

    Full Text Available Energy maps are important for ionosphere-magnetosphere coupling studies, because quantitative determination of field-aligned currents requires knowledge of the conductances and their spatial gradients. By combining imaging riometer absorption and all-sky auroral optical data it is possible to produce high temporal and spatial resolution maps of the Maxwellian characteristic energy of precipitating electrons within a 240 km × 240 km common field of view. These data have been calibrated by inverting EISCAT electron density profiles into equivalent energy spectra. In this paper, energy maps produced by ground-based instruments (optical and riometer) are compared with DMSP satellite data during geomagnetic conjunctions. For the period 1995-2002, twelve satellite passes over the ground-based instruments' field of view under cloud-free conditions have been considered. Four of the satellite conjunctions occurred during moderate, steady-state geomagnetic conditions and without any ion precipitation. In these cases with Maxwellian satellite spectra, there is 71% agreement between the characteristic energies derived from the satellite and the ground-based energy map method.


  19. Evaporation from bare ground with different water-table depths based on an in-situ experiment in Ordos Plateau, China

    Science.gov (United States)

    Zhang, Zaiyong; Wang, Wenke; Wang, Zhoufeng; Chen, Li; Gong, Chengcheng

    2018-03-01

    The dynamic processes of ground evaporation are complex and are related to a multitude of factors such as meteorological influences, water-table depth, and materials in the unsaturated zone. To investigate ground evaporation from a homogeneous unsaturated zone, an in-situ experiment was conducted in the Ordos Plateau of China. Two water-table depths were chosen to explore water movement in the unsaturated zone and ground evaporation. Based on the experimental and calculated results, it was revealed that (1) bare ground evaporation is in an atmosphere-limited stage when the water-table depth is close to the capillary height; (2) bare ground evaporation is in a water-storage-limited stage when the water-table depth is beyond the capillary height; (3) groundwater has little effect on ground-surface evaporation when the water depth is greater than the capillary height; (4) ground evaporation is greater at nighttime than during the daytime; and (5) a liquid-vapor interaction zone at nearly 20 cm depth is found, in which there is a downward vapor flux on sunny days, leading to an increasing trend of soil moisture between 09:00 and 17:00, with the maximum value reached at midday. The results of this investigation are useful for further understanding the dynamic processes of ground evaporation in arid areas.

  20. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    This case study discusses qualitative fieldwork in Malaysia. The trends in higher education led to investigating how and why young Indians and Chinese in Malaysia are using the university to pursue a life strategy. Given the importance of field context in designing and analysing research based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do...

  1. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift space distortions in the distribution of galaxies and hydrogen gas spanning redshifts 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO, with the lower risk and cost typical of a ground-based experiment.

  2. Estimating cotton canopy ground cover from remotely sensed scene reflectance

    International Nuclear Information System (INIS)

    Maas, S.J.

    1998-01-01

    Many agricultural applications require spatially distributed information on growth-related crop characteristics that could be supplied through aircraft or satellite remote sensing. A study was conducted to develop and test a methodology for estimating plant canopy ground cover for cotton (Gossypium hirsutum L.) from scene reflectance. Previous studies indicated that a relatively simple relationship between ground cover and scene reflectance could be developed based on linear mixture modeling. Theoretical analysis indicated that the effects of shadows in the scene could be compensated for by averaging the results obtained using scene reflectance in the red and near-infrared wavelengths. The methodology was tested using field data collected over several years from cotton test plots in Texas and California. Results of the study appear to verify the utility of this approach. Since the methodology relies on information that can be obtained solely through remote sensing, it would be particularly useful in applications where other field information, such as plant size, row spacing, and row orientation, is unavailable.
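The linear-mixture inversion and the red/NIR averaging described above can be sketched as follows. The end-member reflectances are hypothetical, not the study's calibration values:

```python
def ground_cover(scene: float, soil: float, canopy: float) -> float:
    """Invert the linear mixture R_scene = gc * R_canopy + (1 - gc) * R_soil for gc."""
    return (scene - soil) / (canopy - soil)

# Hypothetical end-member reflectances (fractions):
gc_red = ground_cover(scene=0.12, soil=0.20, canopy=0.05)  # red: canopy darker than soil
gc_nir = ground_cover(scene=0.35, soil=0.25, canopy=0.45)  # NIR: canopy brighter than soil
gc = 0.5 * (gc_red + gc_nir)  # averaging red and NIR offsets the opposing shadow biases
print(f"estimated ground cover: {gc:.2f}")
```

Shadows depress scene reflectance, which biases the red-band estimate high and the NIR-band estimate low; the average largely cancels the two errors.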

  3. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    Reyes-Jimenez, J.; Tsoukalas, L.H.

    1991-01-01

    This paper presents a knowledge-based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection (ICRP) Publication 37 is employed within a knowledge-based framework for the purpose of planning maintenance activities while optimizing radiation protection [1, 2]. The methodology is demonstrated through an application to a heating, ventilation, and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher-than-normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs
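The ICRP cost-benefit principle underlying this methodology (choose the protection option that minimizes X + alpha*S, the protection cost plus the monetized collective dose) can be illustrated with hypothetical numbers; the option names, costs, doses, and alpha value below are invented for illustration:

```python
ALPHA = 100_000.0  # assumed monetary valuation, $ per person-sievert

# (option name, protection cost X in $, residual collective dose S in person-Sv)
options = [
    ("no added ventilation",      0.0, 0.50),
    ("single HVAC train",    20_000.0, 0.15),
    ("dual HVAC trains",     45_000.0, 0.10),
]

# Optimum protection level: minimize total detriment X + ALPHA * S
best = min(options, key=lambda o: o[1] + ALPHA * o[2])
for name, x, s in options:
    print(f"{name:22s} total detriment = ${x + ALPHA * s:,.0f}")
print(f"optimum: {best[0]}")
```

Note that the cheapest option is not optimal, nor is the most protective one; the optimization balances marginal protection cost against marginal dose reduction.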

  4. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    International Nuclear Information System (INIS)

    Serafimov, K.B.; Serafimova, M.K.

    1991-01-01

    A combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosonde, absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs

  5. Ground-water contamination at Wurtsmith Air Force Base, Michigan

    Science.gov (United States)

    Stark, J.R.; Cummings, T.R.; Twenter, F.R.

    1983-01-01

    A sand and gravel aquifer of glacial origin underlies Wurtsmith Air Force Base in northeastern lower Michigan. The aquifer overlies a thick clay layer at an average depth of 65 feet. The water table is about 10 feet below land surface in the western part of the Base and about 25 feet below land surface in the eastern part. A ground-water divide cuts diagonally across the Base from northwest to southeast. South of the divide, ground water flows to the Au Sable River; north of the divide, it flows to Van Etten Creek and Van Etten Lake. Mathematical models were used to aid in calculating rates of groundwater flow. Rates range from about 0.8 feet per day in the eastern part of the Base to about 0.3 feet per day in the western part. Models also were used as an aid in making decisions regarding purging of contaminated water from the aquifer. In 1977, trichloroethylene was detected in the Air Force Base water-supply system. It had leaked from a buried storage tank near Building 43 in the southeastern part of the Base and moved northeastward under the influence of the natural ground-water gradient and the pumping of Base water-supply wells. In the most highly contaminated part of the plume, concentrations are greater than 1,000 micrograms per liter. Current purge pumping is removing some of the trichloroethylene, and seems to have arrested its eastward movement. Pumping of additional purge wells could increase the rate of removal. Trichloroethylene has also been detected in ground water in the vicinity of the Base alert apron, where a plume from an unknown source extends northeastward off Base. A smaller, less well-defined area of contamination also occurs just north of the larger plume. Trichloroethylene, identified near the waste-treatment plant, seepage lagoons, and the northern landfill area, is related to activities and operations in these areas. Dichloroethylene and trichloroethylene occur in significant quantities westward of Building 43, upgradient from the major
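Flow rates of the magnitude quoted above (0.3 to 0.8 feet per day) are the kind of figures produced by a Darcy-type calculation. A minimal sketch with hypothetical aquifer parameters, not the report's values:

```python
def seepage_velocity(k_ft_per_day: float, gradient: float, porosity: float) -> float:
    """Average linear (seepage) velocity from Darcy's law: v = K * i / n."""
    return k_ft_per_day * gradient / porosity

# Plausible values for a glacial sand-and-gravel aquifer (illustrative only):
v = seepage_velocity(k_ft_per_day=100.0, gradient=0.002, porosity=0.30)
print(f"seepage velocity ~ {v:.2f} ft/day")
```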

  6. Earthquake strong ground motion studies at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, Ivan; Silva, W.; Darragh, R.; Stark, C.; Wright, D.; Jackson, S.; Carpenter, G.; Smith, R.; Anderson, D.; Gilbert, H.; Scott, D.

    1989-01-01

    Site-specific strong earthquake ground motions have been estimated for the Idaho National Engineering Laboratory assuming that an event similar to the 1983 Ms 7.3 Borah Peak earthquake occurs at epicentral distances of 10 to 28 km. The strong ground motion parameters have been estimated based on a methodology incorporating the Band-Limited White-Noise ground motion model coupled with Random Vibration Theory. A 16-station seismic attenuation and site response survey utilizing three-component portable digital seismographs was also performed over a five-month period in 1989. Based on the recordings of regional earthquakes, the effects of seismic attenuation in the shallow crust and along the propagation path, as well as local site response, were evaluated. These data, combined with a detailed geologic profile developed for each site based principally on borehole data, were used in the estimation of the strong ground motion parameters. The preliminary peak horizontal ground accelerations for individual sites range from approximately 0.15 to 0.35 g. Based on the authors' analysis, the thick sedimentary interbeds (greater than 20 m) in the basalt section attenuate ground motions, as speculated upon in a number of previous studies.

  7. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

    Full Text Available This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis, and 180 papers published from 2008 to 2012 in accounting journals rated as A2, B1, and B2 that were classified as case studies were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly) in the case studies to classify them as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects least aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between the case study as a research strategy and as a data collection method was detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved, because few studies showed rigorous application of the procedures that this strategy requires.

  8. Ground Pollution Science

    International Nuclear Information System (INIS)

    Oh, Jong Min; Bae, Jae Geun

    1997-08-01

    This book deals with ground pollution science and soil science: classification of soil and fundamentals; ground pollution and humans; ground pollution and organic matter; ground pollution and the city environment; environmental problems of the earth and ground pollution; soil pollution and the development of geological features of the ground; ground pollution and landfill of waste; and cases of ground pollution measurement.

  9. From Darwin to constructivism: the evolution of grounded theory.

    Science.gov (United States)

    Hall, Helen; Griffiths, Debra; McKenna, Lisa

    2013-01-01

    To explore the evolution of grounded theory and equip the reader with a greater understanding of the diverse conceptual positioning that is evident in the methodology. Grounded theory was developed during the modernist phase of research to develop theories that are derived from data and explain human interaction. Its philosophical foundations derive from symbolic interactionism and were influenced by a range of scholars including Charles Darwin and George Mead. Rather than a rigid set of rules and procedures, grounded theory is a way of conceptualising data. Researchers demonstrate a range of perspectives and there is significant variation in the way the methodology is interpreted and executed. Some grounded theorists continue to align closely with the original post-positivist view, while others take a more constructivist approach. Although the diverse interpretations accommodate flexibility, they may also result in confusion. The grounded theory approach enables researchers to align to their own particular world view and use methods that are flexible and practical. With an appreciation of the diverse philosophical approaches to grounded theory, researchers are enabled to use and appraise the methodology more effectively.

  10. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology adopts a Kalman filter approach in conjunction with an...

  11. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...
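The Kalman-filter prognostics idea mentioned in the two records above can be sketched generically. This is an illustrative toy, not NASA's model: the degradation rate, noise levels, and failure threshold below are all invented, and the state model (capacitance plus constant degradation rate) is a common textbook choice, not necessarily the one used in the papers.

```python
# Toy sketch: a 1-D Kalman filter tracks a linearly degrading capacitance
# (in % of nominal) from noisy measurements, then projects the state forward
# to an invented end-of-life threshold to estimate remaining useful life (RUL).
import random

random.seed(42)

true_rate = -0.05          # hypothetical degradation: -0.05 %/cycle
meas_noise_std = 0.3

# State: [capacitance, rate]; constant-rate process model F = [[1, 1], [0, 1]].
x = [100.0, -0.01]           # initial state estimate
P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
Q = 1e-5                      # process noise on the rate
R = meas_noise_std ** 2       # measurement noise variance

cap_true = 100.0
for k in range(200):
    cap_true += true_rate
    z = cap_true + random.gauss(0.0, meas_noise_std)

    # Predict: capacitance advances by the current rate estimate.
    x = [x[0] + x[1], x[1]]
    P = [[P[0][0] + 2 * P[0][1] + P[1][1], P[0][1] + P[1][1]],
         [P[1][0] + P[1][1], P[1][1] + Q]]

    # Update with a measurement of capacitance only (H = [1, 0]).
    S = P[0][0] + R
    K = [P[0][0] / S, P[1][0] / S]
    y = z - x[0]
    x = [x[0] + K[0] * y, x[1] + K[1] * y]
    P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
         [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]

# Project to an (invented) end-of-life threshold of 85 % nominal capacitance.
threshold = 85.0
rul_cycles = (threshold - x[0]) / x[1] if x[1] < 0 else float("inf")
print(f"estimated rate: {x[1]:.3f} %/cycle, RUL ~ {rul_cycles:.0f} cycles")
```

The actual papers couple the filter with an electrical-overstress degradation model; here the process model is deliberately the simplest one that shows the predict/update/extrapolate loop.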

  12. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For the Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  13. Monitoring Hydraulic Fracturing Using Ground-Based Controlled Source Electromagnetics

    Science.gov (United States)

    Hickey, M. S.; Trevino, S., III; Everett, M. E.

    2017-12-01

    Hydraulic fracturing allows hydrocarbon production in low-permeability formations. Imaging the distribution of fluid used to create a hydraulic fracture can aid in the characterization of fracture properties such as the extent of plume penetration as well as fracture azimuth and symmetry. This could contribute to improving the efficiency of an operation, for example, by helping to determine ideal well spacing or the need to refracture a zone. A ground-based controlled-source electromagnetics (CSEM) technique is ideal for imaging the fluid because of the change in field caused by the difference between the conductive properties of the fluid and those of the background. With advances in high signal-to-noise recording equipment, coupled with a high-power, broadband transmitter, we can show hydraulic fracture extent and azimuth with minimal processing. A 3D finite element code is used to model the complete well casing along with the layered subsurface. This forward model is used to optimize the survey design and isolate the band of frequencies with the best response. In the field, the results of the modeling are also used to create a custom pseudorandom numeric (PRN) code to control the frequencies transmitted through a grounded dipole source. The receivers record the surface voltage across two grounded dipoles, one parallel and one perpendicular to the transmitter. The data are presented as displays of amplitude ratios across several frequencies with the associated spatial information. In this presentation, we show field results from multiple basins in the United States along with the CSEM theory used to create the survey designs.

  14. Space- and Ground-based Coronal Spectro-Polarimetry

    Science.gov (United States)

    Fineschi, Silvano; Bemporad, Alessandro; Rybak, Jan; Capobianco, Gerardo

    This presentation gives an overview of the near-future perspectives of ultraviolet and visible-light spectro-polarimetric instrumentation for probing coronal magnetism from space-based and ground-based observatories. Spectro-polarimetric imaging of coronal emission lines in the visible-light wavelength band provides an important diagnostic tool for coronal magnetism. The interpretation, in terms of the Hanle and Zeeman effects, of the line polarization in forbidden emission lines yields information on the direction and strength of the coronal magnetic field. As a case study, this presentation will describe the Torino Coronal Magnetograph (CorMag) for the spectro-polarimetric observation of the FeXIV, 530.3 nm, forbidden emission line. CorMag, consisting of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter, has recently been installed on the Lomnicky Peak Observatory 20 cm Zeiss coronagraph. Preliminary results from CorMag will be presented. The linear polarization by resonance scattering of coronal permitted line emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include the capability of imaging polarimetry of HI Lyman-alpha, 121.6 nm. SCORE is a multi-wavelength imager for the emission lines HeII 30.4 nm and HI 121.6 nm, and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009. This presentation will describe how, in future re-flights, SCORE could observe the expected Hanle effect in the corona with an HI Lyman-alpha polarimeter.

  15. TH-CD-202-07: A Methodology for Generating Numerical Phantoms for Radiation Therapy Using Geometric Attribute Distribution Models

    Energy Technology Data Exchange (ETDEWEB)

    Dolly, S; Chen, H; Mutic, S; Anastasio, M; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: A persistent challenge for the quality assessment of radiation therapy treatments (e.g. contouring accuracy) is the absence of a known ground truth for patient data. Moreover, assessment results are often patient-dependent. Computer simulation studies utilizing numerical phantoms can be performed for quality assessment with a known ground truth. However, previously reported numerical phantoms do not include the statistical properties of inter-patient variations, as their models are based on only one patient. In addition, these models do not incorporate tumor data. In this study, a methodology was developed for generating numerical phantoms which encapsulate the statistical variations of patients within radiation therapy, including tumors. Methods: Based on previous work in contouring assessment, geometric attribute distribution (GAD) models were employed to model both the deterministic and stochastic properties of individual organs via principal component analysis. Using pre-existing radiation therapy contour data, the GAD models are trained to model the shape and centroid distributions of each organ. Then, organs with different shapes and positions can be generated by assigning statistically sound weights to the GAD model parameters. Organ contour data from 20 retrospective prostate patient cases were manually extracted and utilized to train the GAD models. As a demonstration, computer-simulated CT images of generated numerical phantoms were calculated and assessed subjectively and objectively for realism. Results: A cohort of numerical phantoms of the male human pelvis was generated. CT images were deemed realistic both subjectively and objectively in terms of the image noise power spectrum. Conclusion: A methodology has been developed to generate realistic numerical anthropomorphic phantoms using pre-existing radiation therapy data. The GAD models guarantee that generated organs span the statistical distribution of observed radiation therapy patients.

  16. TH-CD-202-07: A Methodology for Generating Numerical Phantoms for Radiation Therapy Using Geometric Attribute Distribution Models

    International Nuclear Information System (INIS)

    Dolly, S; Chen, H; Mutic, S; Anastasio, M; Li, H

    2016-01-01

    Purpose: A persistent challenge for the quality assessment of radiation therapy treatments (e.g. contouring accuracy) is the absence of a known ground truth for patient data. Moreover, assessment results are often patient-dependent. Computer simulation studies utilizing numerical phantoms can be performed for quality assessment with a known ground truth. However, previously reported numerical phantoms do not include the statistical properties of inter-patient variations, as their models are based on only one patient. In addition, these models do not incorporate tumor data. In this study, a methodology was developed for generating numerical phantoms which encapsulate the statistical variations of patients within radiation therapy, including tumors. Methods: Based on previous work in contouring assessment, geometric attribute distribution (GAD) models were employed to model both the deterministic and stochastic properties of individual organs via principal component analysis. Using pre-existing radiation therapy contour data, the GAD models are trained to model the shape and centroid distributions of each organ. Then, organs with different shapes and positions can be generated by assigning statistically sound weights to the GAD model parameters. Organ contour data from 20 retrospective prostate patient cases were manually extracted and utilized to train the GAD models. As a demonstration, computer-simulated CT images of generated numerical phantoms were calculated and assessed subjectively and objectively for realism. Results: A cohort of numerical phantoms of the male human pelvis was generated. CT images were deemed realistic both subjectively and objectively in terms of the image noise power spectrum. Conclusion: A methodology has been developed to generate realistic numerical anthropomorphic phantoms using pre-existing radiation therapy data. The GAD models guarantee that generated organs span the statistical distribution of observed radiation therapy patients.
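The "statistically sound weights on model parameters" idea in the records above can be illustrated with a toy point-distribution model. This is a minimal sketch, not the authors' code: the training contours are synthetic ellipses, and the number of modes and the ±2σ sampling range are illustrative conventions.

```python
# Toy sketch of a PCA-based shape model: learn mean shape and principal
# modes of variation from training contours, then generate a new, plausible
# contour by sampling weights on the leading modes.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training contours": 20 noisy ellipses, each 32 (x, y) points,
# flattened to one row per patient (a stand-in for real organ contour data).
t = np.linspace(0, 2 * np.pi, 32, endpoint=False)
shapes = np.array([
    np.column_stack(((3 + rng.normal(0, 0.2)) * np.cos(t),
                     (2 + rng.normal(0, 0.2)) * np.sin(t))).ravel()
    for _ in range(20)
])

# PCA on the shape matrix: mean shape plus principal modes of variation.
mean_shape = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
eigvals = s ** 2 / (len(shapes) - 1)   # variance explained by each mode

# Generate a new contour: mean + weights on the first few modes, with each
# weight drawn within +/-2 standard deviations of its mode.
n_modes = 3
b = rng.uniform(-2, 2, n_modes) * np.sqrt(eigvals[:n_modes])
new_shape = (mean_shape + b @ Vt[:n_modes]).reshape(-1, 2)
print("generated contour with", len(new_shape), "points")
```

Constraining the weights to a few standard deviations is what keeps generated organs inside the statistical distribution of the training population, which is the property the abstract emphasizes.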

  17. IMPROVED ALGORITHM FOR CALCULATING COMPLEX NON-EQUIPOTENTIAL GROUNDING DEVICES OF ELECTRICAL INSTALLATIONS TAKING INTO ACCOUNT CONDUCTIVITY OF NATURAL GROUNDINGS

    Directory of Open Access Journals (Sweden)

    K. A. Starkov

    2017-08-01

    Full Text Available Purpose. A method is proposed for substituting natural concentrated groundings with a set of electrodes so that they can be taken into account in the algorithm for calculating the electrical characteristics of complex grounding connections of electrical installations. An equivalent model consisting of a set of linear electrodes is chosen according to two criteria: leakage resistance and potentials on the ground surface. Methodology. We applied the induced potential method and methods for computing branched electrical circuits with distributed parameters. Results. We obtained an algorithm for calculating complex non-equipotential grounding connections, which makes it possible to obtain refined values of the potential distribution in electric stations and substations with outdoor switchgear. Originality. For the first time, the conductivity of natural concentrated groundings is taken into account by a set of vertical and horizontal electrodes based on equivalent electrical characteristics applied to a two-layer ground. Practical value. Use of the proposed calculation algorithm in the electric grids of JSC «Kharkivoblenergo» made it possible to determine the values of the potential distribution at short circuit in an electrical substation, taking into account the influence of the conductivity of natural concentrated groundings.

  18. Applying distance-to-target weighing methodology to evaluate the environmental performance of bio-based energy, fuels, and materials

    International Nuclear Information System (INIS)

    Weiss, Martin; Patel, Martin; Heilmeier, Hermann; Bringezu, Stefan

    2007-01-01

    The enhanced use of biomass for the production of energy, fuels, and materials is one of the key strategies towards sustainable production and consumption. Various life cycle assessment (LCA) studies demonstrate the great potential of bio-based products to reduce both the consumption of non-renewable energy resources and greenhouse gas emissions. However, the production of biomass requires agricultural land and is often associated with adverse environmental effects such as eutrophication of surface and ground water. Decision making in favor of or against bio-based and conventional fossil product alternatives therefore often requires weighing of environmental impacts. In this article, we apply distance-to-target weighing methodology to aggregate LCA results obtained in four different environmental impact categories (i.e., non-renewable energy consumption, global warming potential, eutrophication potential, and acidification potential) to one environmental index. We include 45 bio- and fossil-based product pairs in our analysis, which we conduct for Germany. The resulting environmental indices for all product pairs analyzed range from -19.7 to +0.2 with negative values indicating overall environmental benefits of bio-based products. Except for three options of packaging materials made from wheat and cornstarch, all bio-based products (including energy, fuels, and materials) score better than their fossil counterparts. Comparing the median values for the three options of biomass utilization reveals that bio-energy (-1.2) and bio-materials (-1.0) offer significantly higher environmental benefits than bio-fuels (-0.3). The results of this study reflect, however, subjective value judgments due to the weighing methodology applied. Given the uncertainties and controversies associated not only with distance-to-target methodologies in particular but also with weighing approaches in general, the authors strongly recommend using weighing for decision finding only as a
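The aggregation step described above can be sketched numerically. All numbers below are invented for illustration and are not the study's data; the weighting convention (weight = current level divided by target level, impacts normalized to the fossil reference) is one common form of distance-to-target weighing, not necessarily the exact variant the authors applied.

```python
# Toy distance-to-target weighing: each impact category gets a weight
# reflecting how far current societal levels are from a policy target, and
# the weighted, normalized impacts are summed into one environmental index.
# Negative (bio - fossil) differences indicate a net benefit of the
# bio-based product, matching the sign convention in the abstract.
targets = {  # category: (current level, target level), arbitrary units
    "energy_use":     (100.0, 70.0),   # non-renewable energy consumption
    "gwp":            (100.0, 60.0),   # global warming potential
    "eutrophication": (100.0, 85.0),
    "acidification":  (100.0, 80.0),
}
# Further from target => heavier weight.
weights = {k: cur / tgt for k, (cur, tgt) in targets.items()}

def index(impacts, reference):
    """Aggregate category impacts, normalized to a reference product."""
    return sum(weights[k] * impacts[k] / reference[k] for k in impacts)

# Invented impact profiles for one bio/fossil product pair.
fossil = {"energy_use": 50.0, "gwp": 40.0, "eutrophication": 5.0, "acidification": 8.0}
bio    = {"energy_use": 30.0, "gwp": 20.0, "eutrophication": 9.0, "acidification": 7.0}

delta = index(bio, fossil) - index(fossil, fossil)
print(f"environmental index difference (bio - fossil): {delta:+.2f}")
```

Note how the example reproduces the trade-off the abstract describes: the bio product is worse on eutrophication but still scores better overall, because the aggregate is dominated by the energy and climate categories.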

  19. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...

  20. Case study application of the IAEA safeguards assessment methodology to a mixed oxide fuel fabrication facility

    International Nuclear Information System (INIS)

    Swartz, J.; McDaniel, T.

    1981-01-01

    Science Applications, Inc. has prepared a case study illustrating the application of an assessment methodology to an international system for safeguarding mixed oxide (MOX) fuel fabrication facilities. This study is the second in a series of case studies which support an effort by the International Atomic Energy Agency (IAEA) and an international Consultant Group to develop a methodology for assessing the effectiveness of IAEA safeguards. 3 refs

  1. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately, it identifies trends reliably and does not misinterpret a steady-state signal as a transient one
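The trend-classification idea above can be illustrated with a minimal fuzzy sketch. This is not the PROTREN implementation: the choice of a least-squares slope as the signal feature and the triangular membership breakpoints are invented for the example.

```python
# Toy fuzzy trend identification: compute the least-squares slope of a
# window of noisy samples, assign fuzzy membership degrees to the
# decreasing / steady / increasing categories, and pick the strongest.
def slope(samples):
    """Least-squares slope of equally spaced samples."""
    n = len(samples)
    xbar = (n - 1) / 2
    ybar = sum(samples) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(samples))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

def memberships(m, eps=0.05):
    """Fuzzy degrees for each trend category (eps is an invented scale)."""
    return {
        "decreasing": min(1.0, max(0.0, -m / eps)),
        "steady": max(0.0, 1.0 - abs(m) / eps),
        "increasing": min(1.0, max(0.0, m / eps)),
    }

def classify(samples):
    mu = memberships(slope(samples))
    return max(mu, key=mu.get)

print(classify([1.0, 1.01, 0.99, 1.0, 1.02]))   # near-flat signal -> steady
print(classify([1.0, 1.2, 1.35, 1.6, 1.78]))    # rising transient -> increasing
```

In a plant setting the membership degrees themselves (rather than only the hard label) are what allows the integration of several signal features before declaring a transient.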

  2. Diverse Ways to Fore-Ground Methodological Insights about Qualitative Research

    Science.gov (United States)

    Koro-Ljungberg, Mirka; Mazzei, Lisa A.; Ceglowski, Deborah

    2013-01-01

    Texts and articles that put epistemological theories and methodologies to work in the context of qualitative research can stimulate scholarship in various ways such as through methodological innovations, transferability of theories and methods, interdisciplinarity, and transformative reflections across traditions and frameworks. Such…

  3. Ground Control for Emplacement Drifts for SR

    International Nuclear Information System (INIS)

    Y. Sun

    2000-01-01

    This analysis demonstrates that a satisfactory ground control system can be designed for the Yucca Mountain site, and provides the technical basis for the design of ground support systems to be used in repository emplacement and non-emplacement drifts. The repository ground support design was based on analytical methods using acquired computer codes, and focused on the final support systems. A literature review of case histories, including the lessons learned from the design and construction of the ESF, the studies on the seismic damage of underground openings, and the use of rock mass classification systems in ground support design, was conducted (Sections 6.3.4 and 6.4). This review provided some basis for determining the inputs and methodologies used in this analysis. Stability of the supported and unsupported emplacement and non-emplacement drifts was evaluated in this analysis. The excavation effects (i.e., the change in the state of stress due to excavation), thermal effects (i.e., due to heat output from waste packages), and seismic effects (i.e., from potential earthquake events) were evaluated, and stress-controlled modes of failure were examined for two in situ stress conditions (k0 = 0.3 and 1.0) using rock properties representing rock mass categories 1 and 5. Variation of rock mass units such as the non-lithophysal (Tptpmn) and lithophysal (Tptpll) units was considered in the analysis. The focus was on the non-lithophysal unit because this unit appears to be relatively weaker and has much smaller joint spacing. Therefore, the drift stability and ground support needs were considered to be controlled by the design for this rock unit. The ground support systems for both emplacement and non-emplacement drifts were incorporated into the models to assess their performance under in situ, thermal, and seismic loading conditions. Both continuum and discontinuum modeling approaches were employed in the analyses of the rock mass behavior and in the evaluation of the

  4. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    Science.gov (United States)

    Udod, Sonia A; Racine, Louise

    2017-12-01

    Drawing on the findings of a grounded theory study exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership research because of the dominance of quantitative approaches. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine-grained and detailed analysis of the processes underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory as developed by Corbin and Strauss, as a method for analysing and solving problems in nurses' practice, is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful for addressing real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or the lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that

  5. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  6. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  7. Validation of CALIPSO space-borne-derived attenuated backscatter coefficient profiles using a ground-based lidar in Athens, Greece

    Directory of Open Access Journals (Sweden)

    R. E. Mamouri

    2009-09-01

    Full Text Available We present initial aerosol validation results for the Level 1 attenuated backscatter coefficient profiles of the space-borne lidar CALIOP (onboard the CALIPSO satellite), using coincident observations performed with a ground-based lidar in Athens, Greece (37.9° N, 23.6° E). A multi-wavelength ground-based backscatter/Raman lidar system has been operating since 2000 at the National Technical University of Athens (NTUA) in the framework of the European Aerosol Research LIdar NETwork (EARLINET), the first lidar network for tropospheric aerosol studies on a continental scale. Since July 2006, a total of 40 coincident aerosol ground-based lidar measurements were performed over Athens during CALIPSO overpasses. The ground-based measurements were performed each time CALIPSO overpassed the station location within a maximum distance of 100 km. The duration of the ground-based lidar measurements was approximately two hours, centred on the satellite overpass time. From the analysis of the ground-based/satellite correlative lidar measurements, a mean bias of the order of 22% for daytime measurements and of 8% for nighttime measurements with respect to the CALIPSO profiles was found for altitudes between 3 and 10 km. The mean bias becomes much larger for altitudes lower than 3 km (of the order of 60%), which is attributed to the increase of aerosol horizontal inhomogeneity within the Planetary Boundary Layer, resulting in the observation of possibly different air masses by the two instruments. In cases of aerosol layers underlying cirrus clouds, comparison results for aerosol tropospheric profiles become worse. This is attributed to the significant multiple scattering effects in cirrus clouds experienced by CALIPSO, which result in an attenuation that is less than that measured by the ground-based lidar.

  8. The Grounded Theory Bookshelf

    Directory of Open Access Journals (Sweden)

    Vivian B. Martin, Ph.D.

    2005-03-01

    Full Text Available Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theory. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory. Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

  9. Probabilistic prediction of expected ground condition and construction time and costs in road tunnels

    Directory of Open Access Journals (Sweden)

    A. Mahmoodzadeh

    2016-10-01

    Full Text Available Ground condition and construction (excavation and support) time and costs are the key factors in decision-making during the planning and design phases of a tunnel project. An innovative methodology for the probabilistic estimation of ground condition and construction time and costs is proposed, which integrates a ground prediction approach based on a Markov process with time and cost variance analysis based on Monte-Carlo (MC) simulation. The former provides a probabilistic description of the ground classification along the tunnel alignment according to the geological information revealed by the geological profile and boreholes. The latter provides a probabilistic description of the expected construction time and costs for each operation according to survey feedback from experts. An engineering application to the Hamro tunnel is then presented to demonstrate how the ground condition and the construction time and costs are estimated in a probabilistic way. For most items, the data needed for this methodology are estimated by distributing questionnaires among tunneling experts and applying the mean values of the responses. These results help both owners and contractors to be aware of the risks they carry before construction, and are useful for both tendering and bidding.
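The two ingredients described in this record can be combined in a short simulation. This is a hedged sketch, not the paper's model: the ground classes, transition probabilities, and triangular time/cost distributions below are all invented for illustration (in the paper such numbers come from boreholes and expert questionnaires).

```python
# Toy Markov + Monte-Carlo estimate of tunnel construction time and cost:
# (1) a Markov chain over ground classes along the alignment, and
# (2) per-section time/cost sampled from class-conditional triangular
# distributions, yielding a probability distribution of the totals.
import random

random.seed(1)

states = ["good", "fair", "poor"]
# Transition probabilities between ground classes from one section to the
# next (rows sum to 1; purely illustrative).
P = {"good": [0.80, 0.15, 0.05],
     "fair": [0.20, 0.65, 0.15],
     "poor": [0.10, 0.30, 0.60]}
# Per-section (time in days, cost in k$) as triangular(low, mode, high).
time_cost = {"good": ((1, 1.5, 2), (20, 25, 35)),
             "fair": ((2, 3, 4),   (35, 45, 60)),
             "poor": ((4, 6, 9),   (60, 85, 120))}

def simulate(sections=100):
    state, total_t, total_c = "good", 0.0, 0.0
    for _ in range(sections):
        state = random.choices(states, weights=P[state])[0]
        (tl, tm, th), (cl, cm, ch) = time_cost[state]
        total_t += random.triangular(tl, th, tm)
        total_c += random.triangular(cl, ch, cm)
    return total_t, total_c

runs = [simulate() for _ in range(2000)]
times = sorted(t for t, _ in runs)
print(f"P50 time: {times[len(times)//2]:.0f} days, "
      f"P90 time: {times[int(0.9 * len(times))]:.0f} days")
```

Reporting percentiles of the simulated totals, rather than a single deterministic estimate, is exactly the risk-awareness the abstract argues owners and contractors need before tendering.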

  10. Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions

    Science.gov (United States)

    2016-09-01

    align to a disruption or an associated downtime impacting mission performance. Reliability metrics and models were also used throughout the study to... Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions. N. Judson, A.L. Pina, E.V. Dydek, S.B. Van Broekhoven (Group 73), A.S. Castillo.

  11. MetaSensing's FastGBSAR: ground based radar for deformation monitoring

    Science.gov (United States)

    Rödelsperger, Sabine; Meta, Adriano

    2014-10-01

    The continuous monitoring of ground deformation and structural movement has become an important task in engineering. MetaSensing introduces a novel sensor system, the Fast Ground Based Synthetic Aperture Radar (FastGBSAR), based on innovative technologies that have already been successfully applied to airborne SAR applications. The FastGBSAR allows the remote sensing of deformations of a slope or infrastructure from a distance of up to 4 km. The FastGBSAR can be set up in two different configurations: in Real Aperture Radar (RAR) mode it is capable of accurately measuring displacements along a linear range profile, ideal for monitoring vibrations of structures like bridges and towers (displacement accuracy up to 0.01 mm). Modal parameters can be determined within half an hour. Alternatively, in the Synthetic Aperture Radar (SAR) configuration it produces two-dimensional displacement images with an acquisition time of less than 5 seconds, ideal for monitoring areal structures like dams, landslides and open pit mines (displacement accuracy up to 0.1 mm). The MetaSensing FastGBSAR is the first ground-based SAR instrument on the market able to produce two-dimensional deformation maps at this high acquisition rate. In this way, deformation time series with a high temporal and spatial resolution can be generated, giving detailed information useful for determining the deformation mechanisms involved and eventually for predicting an incoming failure. The system is fully portable and can be quickly installed on bedrock or a basement. The data acquisition and processing can be fully automated, leading to low effort in instrument operation and maintenance. Due to the short acquisition time of the FastGBSAR, the coherence between two acquisitions is very high and the phase unwrapping is simplified enormously. This yields a high density of resolution cells with good quality and high reliability of the acquired deformations. The deformation maps can directly be used as input into an Early

  12. Multiple Methodologies: Using Community-Based Participatory Research and Decolonizing Methodologies in Kenya

    Science.gov (United States)

    Elder, Brent C.; Odoyo, Kenneth O.

    2018-01-01

    In this project, we examined the development of a sustainable inclusive education system in western Kenya by combining community-based participatory research (CBPR) and decolonizing methodologies. Through three cycles of qualitative interviews with stakeholders in inclusive education, participants explained what they saw as foundational components…

  13. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties, based on several of our recent studies, will be presented. The analyses include global cloud fractions and cloud macro-/micro-physical properties based on satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements at the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds at the Azores site; and the characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation/maintenance of mixed-phase clouds at the NSA site. Although the presentation spans a wide range of topics, we will focus on the representativeness of the ground-based measurements over different climate regions, evaluation of satellite-retrieved cloud properties using these ground-based measurements, and understanding the uncertainties of both satellite and ground-based retrievals and measurements.

  14. Sensor-based activity recognition using extended belief rule-based inference methodology.

    Science.gov (United States)

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model the different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency and applicability was found to be significantly favorable, especially in situations of input-data incompleteness, and it demonstrates the potential of this methodology and underpins the basis for further research on the topic.

  15. Materials selection as an interdisciplinary technical activity: basic methodology and case studies

    Directory of Open Access Journals (Sweden)

    M. Ferrante

    2000-04-01

    Full Text Available The technical activity known as Materials Selection is reviewed in its concepts and methodologies. Objectives and strategies are briefly presented and two important features are introduced and discussed: (i) Merit Indices, combinations of materials properties which maximise the objectives chosen by the designer, and (ii) Materials Properties Maps, a bi-dimensional space whose coordinates are pairs of properties, in which materials can be plotted and compared directly in terms of their merit indices. A general strategy for the deduction of these indices is explained, and a formal methodology to establish a ranking of candidate materials when multiple constraints intervene is presented. Finally, two case studies are discussed in depth: one related to materials substitution in the context of mechanical design, and a less conventional case linking material selection to physical comfort in the home furniture industry.

  16. Theoretical and methodological grounds of formation of the efficient system of higher education

    Directory of Open Access Journals (Sweden)

    Raevneva Elena V.

    2013-03-01

    Full Text Available The goal of the article is to generalise the modern theoretical, methodological, methodical and instrumental provision for building an efficient system of higher education. Analysis of the literature on building educational systems shows that the issue has been studied at both the theoretical-methodological and the instrumental level. The article considers the theoretical and methodological level of the study and specifies the theories, philosophical schools, concepts, educational paradigms and scientific approaches used in forming the educational paradigm. It considers models of education and models and technologies of learning as instrumental provision. As a result of the analysis, the article concludes that the reformation of the system of higher education should be founded on the humanistic paradigm, which is based on the competency-building approach and assumes the use of modern (innovative) technologies of learning. The prospect of further studies in this direction is the formation of competences of potential specialists (graduates of higher educational establishments) with consideration of the requirements of employers and the market in general.

  17. Enhancing our Understanding of Snowfall Modes with Ground-Based Observations

    Science.gov (United States)

    Pettersen, C.; Kulie, M.; Petersen, W. A.; Bliven, L. F.; Wood, N.

    2016-12-01

    Snowfall can be broadly categorized into deep and shallow events based on the vertical distribution of the precipitating ice. Remotely sensed data refine these precipitation categories and aid in discerning the underlying macro- and microphysical mechanisms. The unique patterns in these remotely sensed observations can potentially connect distinct modes of snowfall to specific processes. Though satellites can observe and recognize these patterns in snowfall, these measurements are limited - particularly in cases of shallow and light precipitation, as the snow may be too close to the surface or below the detection limits of the instrumentation. By enhancing satellite measurements with ground-based instrumentation, whether through limited-term field campaigns or long-term strategic sites, we can refine our understanding of, and assumptions about, different snowfall modes and how they are measured by spaceborne instruments. Presented are three years of data from a ground-based instrument suite consisting of a Micro Rain Radar (MRR; optimized for snow events) and a Precipitation Imaging Package (PIP). These instruments are located at the Marquette, Michigan National Weather Service Weather Forecast Office to: a) use coincident meteorological measurements and observations to enhance our understanding of the thermodynamic drivers and b) showcase these instruments in an operational setting to improve forecasts of shallow snow events. Three winters of MRR and PIP measurements are partitioned, based on meteorological surface observations, into two-dimensional histograms of reflectivity and particle size distribution data. These statistics improve our interpretation of deep versus shallow precipitation. Additionally, these statistical techniques are applied to similar datasets from Global Precipitation Measurement field campaigns for further insight into cloud and precipitation macro- and microphysical processes.

  18. Hanford ground-water data base management guide and user's manual

    International Nuclear Information System (INIS)

    Mitchell, P.J.; Argo, R.S.; Bradymire, S.L.; Newbill, C.A.

    1985-05-01

    This management guide and user's manual is a working document for the computerized Hanford Ground-water Data Base maintained by the Geosciences Research and Engineering Department at Pacific Northwest Laboratory for the Hanford Ground-Water Surveillance Program. The program is managed by the Occupational and Environmental Protection Department for the US Department of Energy. The data base is maintained to provide rapid access to data that are routinely collected from ground-water monitoring wells at the Hanford Site. The data include water levels, sample analyses, geologic descriptions and well construction information for over 3000 existing or destroyed wells. These data are used to monitor water quality and to evaluate ground-water flow and pollutant transport problems. The management guide gives instructions for maintenance of the data base on the Digital Equipment Corporation PDP 11/70 computer using the CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) data base management software developed at Pacific Northwest Laboratory. Maintenance activities include inserting, modifying and deleting data, making back-up copies of the data base, and generating tables for annual monitoring reports. The user's guide includes instructions for running programs to retrieve the data in the form of listings or graphical plots. 3 refs

  19. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...
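    The abstract above is truncated and names only the general approach (a Kalman filter over an empirical degradation model). As a hedged illustration of that idea, not the cited work's implementation, the sketch below runs a one-dimensional Kalman filter over noisy capacitance readings and projects remaining useful life (RUL) as the time to an assumed end-of-life threshold; all parameters (drift rate, noise levels, the 80 uF threshold) are invented for the example.

    ```python
    import random

    def kalman_rul(measurements, dt=1.0, drift=-0.5, q=0.01, r=0.25,
                   x0=100.0, p0=1.0, eol=80.0):
        """Scalar Kalman filter over a constant-drift degradation model.
        Returns (filtered capacitance estimates, RUL in time steps)."""
        x, p = x0, p0
        states = []
        for z in measurements:
            # Predict: capacitance drifts down by `drift` per step.
            x = x + drift * dt
            p = p + q
            # Update with the new measurement z.
            k = p / (p + r)              # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
            states.append(x)
        # RUL: steps until the filtered estimate crosses the end-of-life
        # threshold, assuming the drift rate stays constant.
        rul = max(0.0, (x - eol) / (-drift * dt))
        return states, rul

    # Synthetic demo: a capacitor losing ~0.5 uF per step, noisy readings.
    rng = random.Random(1)
    meas = [100.0 - 0.5 * t + rng.gauss(0.0, 0.5) for t in range(20)]
    states, rul = kalman_rul(meas)
    print(round(states[-1], 1), round(rul, 1))
    ```

    A real prognostics model would replace the constant-drift assumption with an empirical degradation curve fitted to accelerated-aging data, as the record describes.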

  20. Methodologies for rapid evaluation of seismic demand levels in nuclear power plant structures

    International Nuclear Information System (INIS)

    Manrique, M.; Asfura, A.; Mukhim, G.

    1990-01-01

    A methodology for rapid assessment of both spectral peak acceleration and 'zero period acceleration' (ZPA) values for virtually any major structure in a nuclear power plant is presented. The methodology is based on spectral peak and ZPA amplification factors developed from regression analyses of an analytical database. The developed amplification factors are applied to the plant's design ground spectrum to obtain amplified response parameters. A practical application of the methodology is presented. This paper also presents a methodology, based on principles of random vibration theory, for calculating acceleration response spectrum curves at any number of desired damping ratios directly from a spectrum of a single known damping ratio. This methodology is particularly useful and directly applicable to older vintage nuclear power plant facilities (e.g., those affected by USI A-46). It has been implemented in a computer program (SPECGEN), and SPECGEN results are compared with results obtained from time-history analyses. (orig.)

  1. Principle and Design of a Single-phase Inverter-Based Grounding System for Neutral-to-ground Voltage Compensation in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, Lingjie; Zeng, Xiangjun

    2017-01-01

    Neutral-to-ground overvoltage may occur in non-effectively grounded power systems because of the distributed parameters asymmetry and resonance between Petersen coil and distributed capacitances. Thus, the constraint of neutral-to-ground voltage is critical for the safety of distribution networks....... In this paper, an active grounding system based on single-phase inverter and its control parameter design method is proposed to achieve this objective. Relationship between its output current and neutral-to-ground voltage is derived to explain the principle of neutral-to-ground voltage compensation. Then...

  2. A systematic review of grounded theory studies in physiotherapy.

    Science.gov (United States)

    Ali, Nancy; May, Stephen; Grafton, Kate

    2018-05-23

    This systematic review aimed to appraise the methodological rigor of grounded theory research published in the field of physiotherapy, to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINAHL, SPORTDiscus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or its methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. Several key tenets of GTM are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices, including sampling, data collection, and analysis.

  3. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  4. Methodology Series Module 2: Case-control Studies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Case-control study design is a type of observational study. In this design, participants are selected for the study based on their outcome status: some participants have the outcome of interest (referred to as cases), whereas others do not (referred to as controls). The investigator then assesses the exposure in both groups. The investigator should define the cases as specifically as possible; sometimes the definition of a disease is based on multiple criteria, and all these points should be explicitly stated in the case definition. An important aspect of selecting controls is that they should be from the same 'study base' as the cases. Controls can be selected from a variety of groups, such as the general population, relatives or friends, and hospital patients. Matching is often used in case-control studies to ensure that the cases and controls are similar in certain characteristics, and it is a useful technique to increase the efficiency of the study. Case-control studies can usually be conducted relatively quickly and are inexpensive, particularly compared with (prospective) cohort studies. The design is useful for studying rare outcomes and outcomes with long latent periods, but it is not very useful for studying rare exposures. Furthermore, case-control studies may be prone to certain biases, notably selection bias and recall bias.
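    To make the analysis concrete: the association measured in a case-control study is usually summarised by the odds ratio of the exposure-by-outcome 2×2 table. The sketch below, with made-up counts, computes the odds ratio and a 95% confidence interval via the standard Woolf (log) method; it illustrates the general design, and is not part of the cited module.

    ```python
    import math

    def odds_ratio(a, b, c, d):
        """Odds ratio and 95% CI for a 2x2 case-control table:
                     exposed  unexposed
            cases       a         b
            controls    c         d
        CI uses the Woolf (log) method."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - 1.96 * se)
        hi = math.exp(math.log(or_) + 1.96 * se)
        return or_, (lo, hi)

    # Hypothetical counts: 40/100 cases exposed, 20/100 controls exposed.
    or_, (lo, hi) = odds_ratio(40, 60, 20, 80)
    print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 2.67 1.42 5.02
    ```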

  5. Integrated layout based Monte-Carlo simulation for design arc optimization

    Science.gov (United States)

    Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James

    2016-03-01

    Design rules are created considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules the sum of which equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to the SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div., Semiconductor Research and Development Center, Hopewell Junction, NY 12533

  6. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  7. Case-Based Learning as Pedagogy for Teaching Information Ethics Based on the Dervin Sense-Making Methodology

    Science.gov (United States)

    Dow, Mirah J.; Boettcher, Carrie A.; Diego, Juana F.; Karch, Marziah E.; Todd-Diaz, Ashley; Woods, Kristine M.

    2015-01-01

    The purpose of this mixed methods study is to determine the effectiveness of case-based pedagogy in teaching basic principles of information ethics and ethical decision making. Study reports results of pre- and post-assessment completed by 49 library and information science (LIS) graduate students at a Midwestern university. Using Creswell's…

  8. Experimental and Numerical Investigation of the Tracer Gas Methodology in the Case of a Naturally Cross-Ventilated Building

    DEFF Research Database (Denmark)

    Nikolopoulos, Nikos; Nikolopoulos, Aristeidis; Larsen, Tine Steen

    2012-01-01

    The paper presents the investigation of a naturally cross-ventilated building using both experimental and numerical methods, with the parameters being the free-stream and the incidence angle of the wind to the openings of the building. The experimental methodology calculates the air change rate based either on measurements of the inlet velocity profile, the outlet velocity profile, or the descending rate of the tracer gas concentration using the decay method. The numerical investigation is based on the solution of the governing Navier-Stokes equations in their full three-dimensional expression, focusing on the time-dependent character of the induced flow field. The numerical results are compared with corresponding experimental data for the three aforementioned experimental methodologies in the case of a full-scale building inside a wind tunnel. The numerical investigation reveals that for large...
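    The decay method mentioned in the abstract rests on a standard assumption: with no tracer source present, a well-mixed concentration decays as C(t) = C0·exp(−n·t), so the air change rate n (per hour) is the negative slope of ln C versus time. A minimal sketch with synthetic data (not the paper's measurements):

    ```python
    import math

    def ach_from_decay(times_h, conc_ppm):
        """Air change rate from tracer decay: least-squares slope of
        ln(C) versus time; ACH = -slope (units: 1/h)."""
        n = len(times_h)
        ys = [math.log(c) for c in conc_ppm]
        mx = sum(times_h) / n
        my = sum(ys) / n
        slope = (sum((t - mx) * (y - my) for t, y in zip(times_h, ys))
                 / sum((t - mx) ** 2 for t in times_h))
        return -slope

    # Synthetic decay: C0 = 500 ppm, true ACH = 2.0 per hour.
    times = [0.0, 0.25, 0.5, 0.75, 1.0]
    conc = [500.0 * math.exp(-2.0 * t) for t in times]
    print(round(ach_from_decay(times, conc), 3))  # → 2.0
    ```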

  9. Top Level Space Cost Methodology (TLSCM)

    Science.gov (United States)

    1997-12-02

    Fragments of the report's table of contents and text survive: Ground Rules and Assumptions; Typical Life Cycle Cost Distribution; Methodologies (cost/budget threshold; analogy), based on real-time Air Force and space programs. ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc.; the ACEIT cost program can be used to obtain a print-out of an expanded WBS.

  10. Relevance of near-Earth magnetic field modeling in deriving SEP properties using ground-based data

    Science.gov (United States)

    Kanellakopoulos, Anastasios; Plainaki, Christina; Mavromichalaki, Helen; Laurenza, Monica; Gerontidou, Maria; Storini, Marisa; Andriopoulou, Maria

    2014-05-01

    Ground Level Enhancements (GLEs) are short-term increases observed in cosmic ray intensity records of ground-based particle detectors such as neutron monitors (NMs) or muon detectors; they are related to the arrival of solar relativistic particles in the terrestrial environment. Hence, GLE events are related to the most energetic class of solar energetic particle (SEP) events. In this work we investigate how the use of different magnetospheric field models can influence the derivation of the relativistic SEP properties when modeling GLE events. As a case study, we examine the event of 2012 May 17 (also known as GLE71), registered by ground-based NMs. We apply the Tsyganenko 89 and the Tsyganenko 96 models in order to calculate the trajectories of the arriving SEPs in the near-Earth environment. We show that the intersection of the SEP trajectories with the atmospheric layer at ~20 km from the Earth's surface (i.e., where the flux of the generated secondary particles is maximum), forms for each ground-based neutron monitor a specified viewing region that is dependent on the magnetospheric field configuration. Then, we apply the Neutron Monitor Based Anisotropic GLE Pure Power Law (NMBANGLE PPOLA) model (Plainaki et al. 2010, Solar Phys, 264, 239), in order to derive the spectral properties of the related SEP event and the spatial distributions of the SEP fluxes impacting the Earth's atmosphere. We examine the dependence of the results on the used magnetic field models and evaluate their range of validity. Finally we discuss information derived by modeling the SEP spectrum in the frame of particle acceleration scenarios.

  11. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

    Full Text Available Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basis of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on-campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  12. IoT-Based Information System for Healthcare Application: Design Methodology Approach

    Directory of Open Access Journals (Sweden)

    Damian Dziak

    2017-06-01

    Full Text Available Over the last few decades, life expectancy has increased significantly. However, elderly people who live on their own often need assistance due to mobility difficulties, symptoms of dementia or other health problems. In such cases, an autonomous supporting system may be helpful. This paper proposes an Internet of Things (IoT)-based information system for indoor and outdoor use. Since the conducted survey of related works indicated a lack of methodological approaches to the design process, a Design Methodology (DM), which approaches the design target from the perspective of the stakeholders, contracting authorities and potential users, is introduced. The implemented solution applies a three-axial accelerometer and magnetometer, Pedestrian Dead Reckoning (PDR), thresholding and the decision trees algorithm. Such an architecture enables accurate localization of a monitored person within four room-zones; furthermore, it identifies falls and the activities of lying, standing, sitting and walking. Based on the identified activities, the system classifies current activities as normal, suspicious or dangerous, which is used to notify the healthcare staff about possible problems. The real-life scenarios validated the high robustness of the proposed solution. Moreover, the test results satisfied both stakeholders and future users and ensured further cooperation with the project.

  13. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    Science.gov (United States)

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least squares approach is used to decouple the magnetic field signal and convert it to a force output, eliminating non-linearity and cross-talk effects. As a case study, a tactile sensor prototype, MagOne, was developed. It achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability, and a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
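    The moving least squares step described above can be illustrated in one dimension: a locally weighted linear fit maps a calibration set of (field reading, force) pairs to a force prediction at a query reading. The sketch below is a generic 1-D stand-in with invented calibration data, not the MagOne calibration itself.

    ```python
    import math

    def mls_predict(x_cal, y_cal, x_query, h=0.2):
        """1-D moving least squares: Gaussian-weighted linear fit centred
        on x_query, evaluated at x_query."""
        w = [math.exp(-((x - x_query) / h) ** 2) for x in x_cal]
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, x_cal))
        swy = sum(wi * y for wi, y in zip(w, y_cal))
        swxx = sum(wi * x * x for wi, x in zip(w, x_cal))
        swxy = sum(wi * x * y for wi, x, y in zip(w, x_cal, y_cal))
        denom = sw * swxx - swx * swx
        a = (swxx * swy - swx * swxy) / denom    # intercept
        b = (sw * swxy - swx * swy) / denom      # slope
        return a + b * x_query

    # Invented calibration: field reading (mT) -> mildly nonlinear force (mN).
    cal_x = [i * 0.1 for i in range(21)]                 # 0.0 .. 2.0 mT
    cal_y = [5.0 * x + 2.0 * x ** 2 for x in cal_x]
    print(round(mls_predict(cal_x, cal_y, 1.05), 2))     # near the true 7.455
    ```

    In the paper's three-axis setting the same idea extends to a vector-valued fit over three field components, which is what decouples normal and shear forces.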

  14. A Mixed Prediction Model of Ground Subsidence for Civil Infrastructures on Soft Ground

    Directory of Open Access Journals (Sweden)

    Kiyoshi Kobayashi

    2012-01-01

    Full Text Available The estimation of ground subsidence processes is an important subject for the asset management of civil infrastructure on soft ground, such as airport facilities. In the planning and design stage there are many uncertainties in geotechnical conditions, and it is impossible to estimate the ground subsidence process by deterministic methods. In this paper, sets of sample paths representing ground subsidence processes are generated using a one-dimensional consolidation model incorporating inhomogeneous ground subsidence. Given the sample paths, a mixed subsidence model is presented to describe the probabilistic structure behind them. The mixed model can be updated by Bayesian methods based upon newly obtained monitoring data. Concretely, the Markov Chain Monte Carlo (MCMC) method, a frontier technique in Bayesian statistics, is applied to estimate the updated models. Through a case study, the paper discusses the applicability of the proposed method and illustrates its possible applications and future work.
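    The Bayesian updating step can be sketched with a toy Metropolis sampler: given noisy settlement observations from a simplified consolidation curve s(t) = s∞·(1 − exp(−c·t)), the posterior of the rate parameter c is sampled as monitoring data arrive. The model, priors and numbers are illustrative only, not those of the cited paper.

    ```python
    import math, random

    def log_post(c, data, s_inf=100.0, sigma=2.0):
        """Log-posterior of rate c: flat prior on c > 0, Gaussian likelihood
        around the consolidation curve s(t) = s_inf * (1 - exp(-c * t))."""
        if c <= 0.0:
            return float("-inf")
        ll = 0.0
        for t, s_obs in data:
            s_pred = s_inf * (1.0 - math.exp(-c * t))
            ll -= 0.5 * ((s_obs - s_pred) / sigma) ** 2
        return ll

    def metropolis(data, n_iter=5000, step=0.02, c0=0.5, seed=0):
        """Random-walk Metropolis sampler for c; returns post-burn-in draws."""
        rng = random.Random(seed)
        c, lp = c0, log_post(c0, data)
        samples = []
        for _ in range(n_iter):
            cand = c + rng.gauss(0.0, step)
            lp_cand = log_post(cand, data)
            if math.log(rng.random()) < lp_cand - lp:   # accept/reject
                c, lp = cand, lp_cand
            samples.append(c)
        return samples[n_iter // 2:]                    # discard burn-in

    # Synthetic monitoring data generated with true rate c = 0.3 per year.
    data_rng = random.Random(42)
    data = [(t, 100.0 * (1.0 - math.exp(-0.3 * t)) + data_rng.gauss(0.0, 2.0))
            for t in range(1, 11)]
    post = metropolis(data)
    print(round(sum(post) / len(post), 2))  # posterior mean near 0.3
    ```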

  15. Detection of Changes in Ground-Level Ozone Concentrations via Entropy

    Directory of Open Access Journals (Sweden)

    Yuehua Wu

    2015-04-01

    Full Text Available Ground-level ozone concentration is a key indicator of air quality. There may exist sudden changes in ozone concentration data over a long time horizon, which may be caused by the implementation of government regulations and policies, such as establishing exhaust emission limits for on-road vehicles. To monitor and assess the efficacy of these policies, we propose a methodology for detecting changes in ground-level ozone concentrations, which consists of three major steps: data transformation, simultaneous autoregressive modelling, and change-point detection on the estimated entropy. To show its effectiveness, the methodology is applied to ground-level ozone concentration data collected in the Toronto region of Canada between June and September of each year from 1988 to 2009. The proposed methodology is also applicable to other climate data.
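    The final step of the methodology, change-point detection on the estimated entropy series, can be illustrated with a minimal least-squares split search over candidate change points (a simplified stand-in for the detector used in the paper; the series is synthetic):

    ```python
    def change_point(series):
        """Single change-point estimate: the split index maximizing the
        size-weighted squared shift in segment means (least-squares criterion)."""
        n = len(series)
        best_k, best_score = None, -1.0
        for k in range(2, n - 1):             # keep both segments non-trivial
            left, right = series[:k], series[k:]
            ml = sum(left) / len(left)
            mr = sum(right) / len(right)
            score = len(left) * len(right) / n * (ml - mr) ** 2
            if score > best_score:
                best_k, best_score = k, score
        return best_k

    # Synthetic entropy-like series with a level drop after index 30.
    series = [2.0] * 30 + [1.2] * 30
    print(change_point(series))  # → 30
    ```

    In practice the drop in estimated entropy would mark the point where a policy change altered the ozone process; multiple change points require a recursive or penalized extension of this search.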

  16. Simulation of Ground-Water Flow and Effects of Ground-Water Irrigation on Base Flow in the Elkhorn and Loup River Basins, Nebraska

    Science.gov (United States)

    Peterson, Steven M.; Stanton, Jennifer S.; Saunders, Amanda T.; Bradley, Jesse R.

    2008-01-01

    Irrigated agriculture is vital to the livelihood of communities in the Elkhorn and Loup River Basins in Nebraska, and ground water is used to irrigate most of the cropland. Concerns about the sustainability of ground-water and surface-water resources have prompted State and regional agencies to evaluate the cumulative effects of ground-water irrigation in this area. To facilitate understanding of the effects of ground-water irrigation, a numerical computer model was developed to simulate ground-water flow and assess the effects of ground-water irrigation (including ground-water withdrawals, hereinafter referred to as pumpage, and enhanced recharge) on stream base flow. The study area covers approximately 30,800 square miles, and includes the Elkhorn River Basin upstream from Norfolk, Nebraska, and the Loup River Basin upstream from Columbus, Nebraska. The water-table aquifer consists of Quaternary-age sands and gravels and Tertiary-age silts, sands, and gravels. The simulation was constructed using one layer with 2-mile by 2-mile cell size. Simulations were constructed to represent the ground-water system before 1940 and from 1940 through 2005, and to simulate hypothetical conditions from 2006 through 2045 or 2055. The first simulation represents steady-state conditions of the system before anthropogenic effects, and then simulates the effects of early surface-water development activities and recharge of water leaking from canals during 1895 to 1940. The first simulation ends at 1940 because before that time, very little pumpage for irrigation occurred, but after that time it became increasingly commonplace. The pre-1940 simulation was calibrated against measured water levels and estimated long-term base flow, and the 1940 through 2005 simulation was calibrated against measured water-level changes and estimated long-term base flow. The calibrated 1940 through 2005 simulation was used as the basis for analyzing hypothetical scenarios to evaluate the effects of

  17. Ground-glass opacity in lung metastasis from adenocarcinoma of the stomach: a case report

    International Nuclear Information System (INIS)

    Jung, Mi Ran; Kim, Jeong Kon; Lee, Jin Seong; Song, Koun Sik; Lim, Tae Hwan

    2000-01-01

    Ground-glass opacity is a frequent but nonspecific finding seen on high-resolution CT scans of lung parenchyma. Histologically, this appearance is observed when thickening of the alveolar wall and septal interstitium is minimal or the alveolar lumen is partially filled with fluid, macrophages, neutrophils, or amorphous material. It has been shown that ground-glass opacity may be caused not only by an active inflammatory process but also by fibrotic processes. When a focal area of ground-glass opacity persists or increases in size, the possibility of a neoplasm (bronchioloalveolar carcinoma, adenoma, or lymphoma, for example) should be considered. Diffuse nonsegmental ground-glass opacity in both lung fields was incidentally found on follow-up abdominal CT in a stomach cancer patient, and signet-ring cell-type metastatic lung cancer was confirmed by transbronchial lung biopsy. We report a case of diffuse ground-glass opacity seen in metastatic lung cancer from adenocarcinoma of the stomach. (author)

  18. Assessment of potential strong ground motions in the city of Rome

    Directory of Open Access Journals (Sweden)

    L. Malagnini

    1994-06-01

    Full Text Available A methodology is used which combines stochastic generation of random series with a finite-difference technique to estimate the expected horizontal ground motion for the city of Rome as induced by a large earthquake in the Central Apennines. In this approach, source properties and long-path propagation are modelled through observed spectra of ground motion in the region, while the effects of the near-surface geology in the city are simulated by means of a finite-difference technique applied to 2-D models including elastic and anelastic properties of geologic materials and topographic variations. The parameters commonly used for earthquake engineering purposes are estimated from the simulated time histories of horizontal ground motion. We focus our attention on peak ground acceleration and velocity, and on the integrals of the squared acceleration and velocity (which are proportional to the Arias intensity and seismic energy flux, respectively). Response spectra are analyzed as well. Parameter variations along 2-D profiles visualize the effects of the small-scale geological heterogeneities and topographic irregularities on ground motion in the case of a strong earthquake. Interestingly, the largest amplification of peak ground acceleration and Arias intensity does not necessarily occur at the same sites where peak ground velocity and flux of seismic energy reach their highest values, depending on the frequency band of amplification. A magnitude 7 earthquake at a distance of 100 km results in peak ground accelerations ranging from 30 to 70 gals, while peak ground velocities are estimated to vary from 5 to 7 cm/s; moreover, simulated time histories of horizontal ground motion yield amplitudes of 5%-damped pseudovelocity response spectra as large as 15-20 cm/s for frequencies from 1 to 3 Hz. In this frequency band, the mean value is 7 cm/s for firm sites and ranges from 10 to 13 cm/s for soil sites. All these results are in good agreement with predictions.

  19. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    International Nuclear Information System (INIS)

    Casey, Leslie A.

    2014-01-01

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  20. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  1. Estimating the potential impacts of a nuclear reactor accident: methodology and case studies

    International Nuclear Information System (INIS)

    Cartwright, J.V.; Beemiller, R.M.; Trott, E.A. Jr.; Younger, J.M.

    1982-04-01

    This monograph describes an industrial impact model that can be used to estimate the regional industry-specific impacts of disasters. Special attention is given to the impacts of possible nuclear reactor accidents. The monograph also presents three applications of the model. The impacts estimated in the case studies are based on (1) general information and reactor-specific data, supplied by the US Nuclear Regulatory Commission (NRC), (2) regional economic models derived from the Regional Input-Output Modeling System (RIMS II) developed at the Bureau of Economic Analysis (BEA), and (3) additional methodology developed especially for taking into account the unique characteristics of a nuclear reactor accident with respect to regional industrial activity

  2. The COROT ground-based archive and access system

    Science.gov (United States)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted Spectra) Archive.

  3. A detrimental soil disturbance prediction model for ground-based timber harvesting

    Science.gov (United States)

    Derrick A. Reeves; Matthew C. Reeves; Ann M. Abbott; Deborah S. Page-Dumroese; Mark D. Coleman

    2012-01-01

    Soil properties and forest productivity can be affected during ground-based harvest operations and site preparation. The degree of impact varies widely depending on topographic features and soil properties. Forest managers who understand site-specific limits to ground-based harvesting can alter harvest method or season to limit soil disturbance. To determine the...

  4. TFTR grounding scheme and ground-monitor system

    International Nuclear Information System (INIS)

    Viola, M.

    1983-01-01

    The Tokamak Fusion Test Reactor (TFTR) grounding system utilizes a single-point ground. It is located directly under the machine, at the basement floor level, and is tied to the building perimeter ground. Wired to this single-point ground, via individual 500 MCM insulated cables, are: the vacuum vessel; four toroidal field coil cases/inner support structure quadrants; umbrella structure halves; the substructure ring girder; radial beams and columns; and the diagnostic systems. Prior to the first machine operation, a ground-loop removal program was initiated. It required insulation of all hangers and supports (within a 35-foot radius of the center of the machine) of the various piping, conduits, cable trays, and ventilation systems. A special ground-monitor system was designed and installed. It actively monitors each of the individual machine grounds to ensure that there are no inadvertent ground loops within the machine structure or its ground and that the machine grounds are intact prior to each pulse. The TFTR grounding system has proven to be a very manageable system and one that is easy to maintain.

  5. Comparing distinct ground-based lightning location networks covering the Netherlands

    Science.gov (United States)

    de Vos, Lotte; Leijnse, Hidde; Schmeits, Maurice; Beekhuis, Hans; Poelman, Dieter; Evers, Läslo; Smets, Pieter

    2015-04-01

    Lightning can be detected using a ground-based sensor network. The Royal Netherlands Meteorological Institute (KNMI) monitors lightning activity in the Netherlands with the so-called FLITS system, a network of SAFIR-type sensors that makes use of Very High Frequency (VHF) as well as Low Frequency (LF) sensors. KNMI has recently decided to replace FLITS by data from a sub-continental network operated by Météorage which makes use of LF sensors only (KNMI Lightning Detection Network, or KLDN). KLDN is compared to the FLITS system, as well as to the Met Office's long-range Arrival Time Difference network (ATDnet), which measures in the Very Low Frequency (VLF) range. Special focus is on the ability to detect Cloud-to-Ground (CG) and Cloud-to-Cloud (CC) lightning in the Netherlands. The relative detection efficiency of individual flashes, and of lightning activity in a more general sense, is calculated over a period of almost 5 years. Additionally, the detection efficiency of each system is compared to a ground truth constructed from flashes detected by both of the other datasets. Finally, infrasound data is used as a fourth lightning data source for several case studies. Relative performance is found to vary strongly with location and time. As expected, FLITS detects significantly more CC lightning (because of the strong aptitude of VHF antennas to detect CC), whereas KLDN and ATDnet detect more CG lightning. We analyze statistics computed over the entire 5-year period, looking at CG as well as total lightning (CC and CG combined). The statistics considered are the Probability of Detection (POD) and the so-called Lightning Activity Detection (LAD). POD is defined as the fraction of flashes in the reference that the system also detects. LAD is defined as the fraction of system recordings of one or more flashes in predefined area boxes over a certain time period given that the reference detects at least one
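The two verification scores defined in the abstract reduce to simple ratios once flashes have been matched; a small sketch (function names and the toy counts are hypothetical):

```python
def probability_of_detection(n_matched, n_reference):
    """POD: fraction of reference flashes that the tested system also
    detected (matched within a chosen space-time window)."""
    if n_reference == 0:
        raise ValueError("empty reference set")
    return n_matched / n_reference

def lightning_activity_detection(system_boxes, reference_boxes):
    """LAD: fraction of predefined area/time boxes with reference activity
    in which the system also records at least one flash.
    Inputs are sets of box identifiers."""
    if not reference_boxes:
        raise ValueError("reference reports no active boxes")
    return len(reference_boxes & system_boxes) / len(reference_boxes)

# 80 of 100 reference flashes matched -> POD = 0.8
pod = probability_of_detection(80, 100)
# Reference active in boxes {1,2,3,4}; system fires in {2,3,4,9} -> LAD = 0.75
lad = lightning_activity_detection({2, 3, 4, 9}, {1, 2, 3, 4})
```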

  6. 49 CFR 1109.4 - Mandatory mediation in rate cases to be considered under the stand-alone cost methodology.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Mandatory mediation in rate cases to be considered... § 1109.4 Mandatory mediation in rate cases to be considered under the stand-alone cost methodology. (a) A... methodology must engage in non-binding mediation of its dispute with the railroad upon filing a formal...

  7. Using Open Access Satellite Data Alongside Ground Based Remote Sensing: An Assessment, with Case Studies from Egypt’s Delta

    Directory of Open Access Journals (Sweden)

    Sarah Parcak

    2017-09-01

    Full Text Available This paper will assess the most recently available open access high-resolution optical satellite data (0.3 m–0.6 m) and its detection of buried ancient features versus ground-based remote sensing tools. It also discusses the importance of CORONA satellite data to evaluate landscape changes over the past 50 years surrounding sites. The study concentrates on Egypt's Nile Delta, which is threatened by rising sea levels, rising water tables, and urbanization. Many ancient coastal sites will be lost in the next few decades, thus this paper emphasizes the need to map them before they disappear. It shows that high-resolution satellites can sometimes provide the same general picture of ancient sites in the Egyptian Nile Delta as ground-based remote sensing, in relatively sandy sedimentary and degrading tell environments, during periods of rainfall, and in higher groundwater conditions. Research results also suggest potential solutions for rapid mapping of threatened Delta sites, and urge a collaborative global effort to map them before they disappear.

  8. Designing food supply chains- a structured methodology: a case on novel protein foods

    OpenAIRE

    Apaiah, R.K.

    2006-01-01

    This thesis proposes and implements a structured methodology to aid in chain design and the evaluation and decision making processes that accompany it. It focuses on how to design the entire chain from start to finish, so that the consumer gets a product that he/she wants, i.e. concentrating on product attributes rather than on the delivery of the product. The novel protein food (NPF) case from the PROFETAS program was used to develop the methodology. Two attributes of quality were investigated w...

  9. Culturally Responsive Teaching in the Context of Mathematics: A Grounded Theory Case Study

    Science.gov (United States)

    Bonner, Emily P.; Adams, Thomasenia L.

    2012-01-01

    In this grounded theory case study, four interconnected, foundational cornerstones of culturally responsive mathematics teaching (CRMT), communication, knowledge, trust/relationships, and constant reflection/revision, were systematically unearthed to develop an initial working theory of CRMT that directly informs classroom practice. These…

  10. A Duration Prediction Using a Material-Based Progress Management Methodology for Construction Operation Plans

    Directory of Open Access Journals (Sweden)

    Yongho Ko

    2017-04-01

    Full Text Available Precise and accurate prediction models for duration and cost enable contractors to improve their decision making for effective resource management in terms of sustainability in construction. Previous studies have been limited to cost-based estimations, but this study focuses on a material-based progress management method. Cost-based estimations typically used in construction, such as the earned value method, rely on comparing the planned budget with the actual cost. However, accurately planning budgets requires analysis of many factors, such as the financial status of the sectors involved. Furthermore, there is a higher possibility of changes in the budget than in the total amount of material used during construction, which is deduced from the quantity take-off from drawings and specifications. Accordingly, this study proposes a material-based progress management methodology, which was developed using different predictive analysis models (regression, neural network, and auto-regressive moving average as well as datasets on material and labor, which can be extracted from daily work reports from contractors. A case study on actual datasets was conducted, and the results show that the proposed methodology can be efficiently used for progress management in construction.
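As a toy illustration of the material-based idea, one of the predictive models the abstract names (regression) can be fit to cumulative material quantities taken from daily work reports; every number below is invented for illustration, not taken from the case study:

```python
import numpy as np

# Hypothetical daily-work-report data: cumulative material placed (tons)
# against elapsed working days for one activity.
material = np.array([0.0, 120.0, 260.0, 410.0, 520.0, 640.0])
days = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])

# Ordinary least squares for days = a * material + b.
A = np.vstack([material, np.ones_like(material)]).T
(a, b), *_ = np.linalg.lstsq(A, days, rcond=None)

# Duration forecast for the full quantity take-off (say 900 tons),
# i.e. progress is tracked in material units rather than budget units.
predicted_duration = a * 900.0 + b
```

The neural-network and auto-regressive moving-average variants mentioned in the abstract would replace the least-squares fit while keeping the same material-based inputs.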

  11. Validation of GOME (ERS-2) NO2 vertical column data with ground-based measurements at Issyk-Kul (Kyrgyzstan)

    Science.gov (United States)

    Ionov, D.; Sinyakov, V.; Semenov, V.

    Starting from 1995 the global monitoring of atmospheric nitrogen dioxide is carried out by the measurements of the nadir-viewing GOME spectrometer aboard the ERS-2 satellite. Continuous validation of these data by means of comparisons with well-controlled ground-based measurements is important to ensure the quality of GOME data products and improve related retrieval algorithms. At the station of Issyk-Kul (Kyrgyzstan) ground-based spectroscopic observations of the NO2 vertical column have been carried out since 1983. The station is located on the northern shore of Issyk-Kul lake, 1650 meters above sea level (42.6 N, 77.0 E). The site is equipped with a grating spectrometer for twilight measurements of zenith-scattered solar radiation in the visible range, and applies the DOAS technique to retrieve the NO2 vertical column. It is included in the list of NDSC stations as a complementary one. The present study is focused on validation of GOME NO2 vertical column data, based on an 8-year comparison with correlative ground-based measurements at the Issyk-Kul station in 1996-2003. Within the investigation, the agreement of both individual and monthly averaged GOME measurements with corresponding twilight ground-based observations is examined. Such agreement is analyzed with respect to different conditions (season, sun elevation), temporal/spatial criteria choice (actual overpass location, correction for diurnal variation) and data processing (GDP version 2.7, 3.0). In addition, NO2 vertical columns were integrated from simultaneous stratospheric profile measurements by the NASA HALOE and SAGE-II/III satellite instruments and introduced to explain the differences with ground-based observations. In particular cases, NO2 vertical profiles retrieved from the twilight ground-based measurements at Issyk-Kul were also included in the comparison. Overall, summertime GOME NO2 vertical columns were found to be systematically lower than ground-based data.
This work was supported by International Association

  12. Improving patient care in cardiac surgery using Toyota production system based methodology.

    Science.gov (United States)

    Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J

    2011-02-01

    A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team. Coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily solution to determine cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 as compared with the regional average. By the systematic use of a real time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per each case of isolated coronary artery bypass graft was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Managing a big ground-based astronomy project: the Thirty Meter Telescope (TMT) project

    Science.gov (United States)

    Sanders, Gary H.

    2008-07-01

    TMT is a big science project and its scale is greater than previous ground-based optical/infrared telescope projects. This paper will describe the ideal "linear" project and how the TMT project departs from that ideal. The paper will describe the needed adaptations to successfully manage real world complexities. The progression from science requirements to a reference design, the development of a product-oriented Work Breakdown Structure (WBS) and an organization that parallels the WBS, the implementation of system engineering, requirements definition and the progression through Conceptual Design to Preliminary Design will be summarized. The development of a detailed cost estimate structured by the WBS, and the methodology of risk analysis to estimate contingency fund requirements will be summarized. Designing the project schedule defines the construction plan and, together with the cost model, provides the basis for executing the project guided by an earned value performance measurement system.

  14. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
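A single-variable model of the kind described is typically a power law in aperture diameter, fit in log-log space; a sketch with invented (diameter, cost) pairs, not the paper's data:

```python
import numpy as np

# Hypothetical (aperture diameter in m, cost in M$) pairs, illustration only.
diameter = np.array([2.4, 4.0, 8.0, 10.0])
cost = np.array([20.0, 55.0, 180.0, 260.0])

# Fit log(cost) = log(k) + x * log(D), i.e. the power law cost = k * D**x.
x, log_k = np.polyfit(np.log(diameter), np.log(cost), 1)
k = np.exp(log_k)

# Extrapolate the fitted curve to a larger aperture.
predicted_cost_30m = k * 30.0**x
```

The exponent `x` recovered from such a fit is the "cost curve" slope the abstract refers to; multi-variable versions add terms such as radius of curvature and diffraction-limited wavelength to the regression.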

  15. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. KSC ADVANCED GROUND BASED FIELD MILL V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Ground Based Field Mill (AGBFM) network consists of 34 (31 operational) field mills located at Kennedy Space Center (KSC), Florida. The field mills...

  17. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  18. An open repository of earthquake-triggered ground-failure inventories

    Science.gov (United States)

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

    Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  19. State-of-the-art report on the current status of methodologies for seismic PSA

    International Nuclear Information System (INIS)

    1998-01-01

    comprise the overall methodology. The review concentrates on evaluating the extent to which today's seismic-PSA methodology produces reliable and useful results and insights. The evaluation covers six sub-methodologies that must be combined to produce an overall full-scope seismic PSA: the seismic-hazard methodology; the local-ground-motion and building-motion methodology; the walk-down methodology; the failure-mode and fragility methodology; the systems-analysis methodology; and the consequence/release methodology. The report finds that all of these sub-methodologies are both reliable and useful, and that when combined together the overall seismic-PSA methodology can provide important engineering insights about how nuclear power plants respond to earthquakes. This is true even though the numerical uncertainties in the bottom-line results can be large (plus or minus one order of magnitude or more is common). However, a number of areas within the various sub-methodologies can be applied properly only if special expertise is available. The report describes the technical issues in detail, and outlines the approaches that have proven to be most successful, based on world-wide experience with about a hundred seismic PSA studies

  20. Advanced Methodologies for NASA Science Missions

    Science.gov (United States)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  1. Application of an environmental remediation methodology: theory vs. practice reflections and two Belgian case studies - 59184

    International Nuclear Information System (INIS)

    Blommaert, W.; Mannaerts, K.; Pepin, S.; Dehandschutter, B.

    2012-01-01

    Like in many countries, polluted industrial sites also exist in Belgium. Although the contamination is purely chemical in most cases, they may also contain a radioactive component. For chemically contaminated sites, extensive regulations and methodologies were already developed and applied by the different regional authorities. However, essentially because radioactivity is a federal competence, there was also a need to develop a legal federal framework (including an ER-methodology [1]) for remediation of radioactively contaminated sites. Most of the so-called radioactive contaminated sites exhibit a mixed contamination (chemical and radiological), and hence the development of such a methodology had to be in line with the existing (regional) ones concerning chemical contamination. Because each authority has its own responsibilities with regard to the type of contamination, finding the best solution satisfying all involved parties is more complicated and time-consuming. To overcome these difficulties the legal framework and methodology - including the necessary involvement of the stakeholders and the delineation of each party's responsibilities - has to be transparent, clear and unambiguous. Once such a methodology is developed and approved, its application is expected to be more or less easy, logical and straightforward. But is this really true? The aim of this document is also to investigate the impact of factors such as the type of radioactive contamination - levels of contamination, related to NORM activity or not, homogeneous or heterogeneous, the differences in licensing procedures,.. - on the application of the developed methodology and what the consequences could be in the long run for the remediation process. Two existing case studies in Belgium will be presented ([2]). The first case deals with a historical radium contaminated site, the second one with a phosphate processing facility still in operation, both with (very) low

  2. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled "Finite Volume Based Computer Program for Ground Source Heat Pump Systems." The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump
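The core of such a tool is a finite-volume discretization of the transient heat-conduction equation in the ground. A deliberately minimal 1-D explicit sketch (uniform grid, constant soil properties, fixed-temperature boundaries; GEO2D/GEO3D solve a far richer coupled problem, and all values here are illustrative):

```python
import numpy as np

alpha = 1.0e-6          # soil thermal diffusivity, m^2/s (typical order)
dx, dt, n = 0.1, 1000.0, 50
r = alpha * dt / dx**2  # this explicit scheme is stable only for r <= 0.5

T = np.full(n, 283.0)   # undisturbed ground at 10 C, in kelvin
T[0] = 278.0            # borehole wall chilled by the working fluid

for _ in range(1000):
    # Finite-volume update: each interior cell gains the net conductive
    # flux from its two neighbours over one time step.
    T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = 278.0, 283.0   # re-impose fixed-temperature boundaries
```

After many steps the profile relaxes toward a straight line between the two boundary temperatures, the steady-state solution of 1-D conduction.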

  3. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Science.gov (United States)

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis with only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
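The fault-feature layer described above takes the energy of each intrinsic mode function as its input; computing those normalized energies is straightforward once a decomposition is available (the EEMD step itself would come from a library such as PyEMD, an assumption not exercised here):

```python
import numpy as np

def imf_energy_features(imfs):
    """Normalized energy of each intrinsic mode function: the fault-feature
    vector fed to the feature layer of the diagnostic Bayesian network.
    `imfs` is an (n_imfs, n_samples) array produced by an EEMD routine."""
    energies = np.sum(np.asarray(imfs, dtype=float)**2, axis=1)
    return energies / energies.sum()

# Two toy "IMFs": the second carries four times the energy of the first,
# so the feature vector should be [0.2, 0.8].
features = imf_energy_features([[1.0, 1.0, 0.0, 0.0],
                                [2.0, 0.0, 2.0, 0.0]])
```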

  4. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

    The study goal is to assess the chemical and optical properties of aerosols present in the lofted layers and at the ground. The biomass burning aerosols were evaluated in low-level layers from multi-wavelength lidar measurements, while the chemical composition at ground level was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Classification of aerosol type and specific organic markers were used to explore the potential to sense particles of the same origin both at ground level and along the profiles.

  5. An assessment of the performance of global rainfall estimates without ground-based observations

    Directory of Open Access Journals (Sweden)

    C. Massari

    2017-09-01

    Full Text Available Satellite-based rainfall estimates over land have great potential for a wide range of applications, but their validation is challenging due to the scarcity of ground-based observations of rainfall in many areas of the planet. Recent studies have suggested the use of triple collocation (TC) to characterize uncertainties associated with rainfall estimates by using three collocated rainfall products. However, TC requires the simultaneous availability of three products with mutually uncorrelated errors, a requirement which is difficult to satisfy with current global precipitation data sets. In this study, a recently developed method for rainfall estimation from soil moisture observations, SM2RAIN, is demonstrated to facilitate the accurate application of TC within triplets containing two state-of-the-art satellite rainfall estimates and a reanalysis product. The validity of different TC assumptions is indirectly tested via a high-quality ground rainfall product over the contiguous United States (CONUS), showing that SM2RAIN can provide a truly independent source of rainfall accumulation information which uniquely satisfies the assumptions underlying TC. On this basis, TC is applied with SM2RAIN on a global scale in an optimal configuration to calculate, for the first time, reliable global correlations (vs. an unknown truth) of the aforementioned products without using a ground benchmark data set. The analysis is carried out during the period 2007–2012 using daily rainfall accumulation products obtained at 1° × 1° spatial resolution. Results convey the relatively high performance of the satellite rainfall estimates in eastern North and South America, southern Africa, southern and eastern Asia, eastern Australia, and southern Europe, as well as complementary performances between the reanalysis product and SM2RAIN, with the first performing reasonably well in the Northern Hemisphere and the second providing very good performance in the Southern Hemisphere.
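    The core TC computation, estimating each product's correlation with the unknown truth from the pairwise covariances of three products with mutually independent errors, can be sketched on synthetic data. The products and error magnitudes below are invented for illustration, not the data sets used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
truth = rng.standard_normal(n)            # unknown "true" rainfall anomaly

# Three collocated products with mutually independent errors.
x = truth + 0.3 * rng.standard_normal(n)  # e.g. a satellite product
y = truth + 0.5 * rng.standard_normal(n)  # e.g. another satellite product
z = truth + 0.7 * rng.standard_normal(n)  # e.g. an SM2RAIN-like product

Q = np.cov(np.vstack([x, y, z]))          # 3x3 sample covariance matrix

# Extended-TC estimates of each product's correlation with the truth.
rho_x = np.sqrt(Q[0, 1] * Q[0, 2] / (Q[0, 0] * Q[1, 2]))
rho_y = np.sqrt(Q[0, 1] * Q[1, 2] / (Q[1, 1] * Q[0, 2]))
rho_z = np.sqrt(Q[0, 2] * Q[1, 2] / (Q[2, 2] * Q[0, 1]))
```

    With the error levels above, the true correlations are 1/√1.09, 1/√1.25 and 1/√1.49, so the recovered ordering rho_x > rho_y > rho_z reflects the products' actual accuracies even though the truth is never used in the estimator.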

  6. How to bring absolute sustainability into decision-making: An industry case study using a Planetary Boundary-based methodology.

    Science.gov (United States)

    Ryberg, Morten W; Owsianiak, Mikołaj; Clavreul, Julie; Mueller, Carina; Sim, Sarah; King, Henry; Hauschild, Michael Z

    2018-09-01

    The Planetary Boundaries concept has emerged as a framework for articulating environmental limits, gaining traction as a basis for considering sustainability in business settings, government policy and international guidelines. There is emerging interest in using the Planetary Boundaries concept as part of life cycle assessment (LCA) for gauging absolute environmental sustainability. We tested the applicability of a novel Planetary Boundaries-based life cycle impact assessment methodology on a hypothetical laundry washing case study at the EU level. We express the impacts corresponding to the control variables of the individual Planetary Boundaries together with a measure of their respective uncertainties. We tested four sharing principles for assigning a share of the safe operating space (SoSOS) to laundry washing and assessed if the impacts were within the assigned SoSOS. The choice of sharing principle had the greatest influence on the outcome. We therefore highlight the need for more research on the development and choice of sharing principles. Although further work is required to operationalize Planetary Boundaries in LCA, this study shows the potential to relate impacts of human activities to environmental boundaries using LCA, offering company and policy decision-makers information needed to promote environmental sustainability. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy: the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established that obtains the topological arrangement of the mechanical networks by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed that separates the identification of friction, stiffness and the various damping effects. Furthermore, an experimental set-up is introduced in which two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained by applying the proposed methodology to a helical-tube fluid inerter prototype, and the sources of the remaining discrepancies are analysed.

  8. Control Method of Single-phase Inverter Based Grounding System in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, L.; Zeng, X.

    2016-01-01

    The asymmetry of the inherent distributed capacitances causes the rise of the neutral-to-ground voltage in ungrounded systems and high-resistance grounded systems, and overvoltage may occur in resonant grounded systems if the Petersen coil resonates with the distributed capacitances. The restraint of the neutral-to-ground voltage is therefore critical for the safety of distribution networks. An active grounding system based on a single-phase inverter is proposed to achieve this objective, and the relationship between the output current of the system and the neutral-to-ground voltage is derived to explain the principle of neutral-to-ground voltage restraint.

  9. Reconstruction of Sky Illumination Domes from Ground-Based Panoramas

    Science.gov (United States)

    Coubard, F.; Lelégard, L.; Brédif, M.; Paparoditis, N.; Briottet, X.

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.
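    The Perez model estimated from the blue-sky pixels has the well-known all-weather form lv(ζ, γ) = (1 + a·exp(b/cos ζ))·(1 + c·exp(d·γ) + e·cos²γ), where ζ is the zenith angle of the sky element and γ its angular distance to the sun. A minimal sketch follows; the five coefficients below are illustrative clear-sky-like values, not the values fitted in the paper:

```python
import numpy as np

def perez_relative_luminance(zenith, gamma, a, b, c, d, e):
    """Perez all-weather sky model: relative luminance of a sky element.

    `zenith` is the element's zenith angle and `gamma` its angular distance
    to the sun (both in radians); a..e are the five Perez coefficients.
    """
    gradation = 1.0 + a * np.exp(b / np.cos(zenith))        # horizon-zenith term
    indicatrix = 1.0 + c * np.exp(d * gamma) + e * np.cos(gamma) ** 2
    return gradation * indicatrix

# Illustrative clear-sky-like coefficients (assumed for this demo).
a, b, c, d, e = -1.0, -0.32, 10.0, -3.0, 0.45
near_sun = perez_relative_luminance(0.2, 0.1, a, b, c, d, e)
far_from_sun = perez_relative_luminance(0.2, 2.0, a, b, c, d, e)
```

    For clear-sky coefficients the circumsolar term c·exp(d·γ) dominates near the sun, which is why sky elements close to the solar position come out much brighter than elements far from it.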

  10. Multi-Feature Based Multiple Landmine Detection Using Ground Penetration Radar

    Directory of Open Access Journals (Sweden)

    S. Park

    2014-06-01

    Full Text Available This paper presents a novel method for detecting multiple landmines using a ground penetrating radar (GPR). Conventional algorithms mainly focus on detection of a single landmine and do not extend linearly to the multiple-landmine case. The proposed algorithm is composed of four steps: estimation of the number of objects buried in the ground, isolation of each object, feature extraction, and detection of landmines. The number of objects in the GPR signal is estimated using the energy projection method, signals for the individual objects are then extracted using the symmetry filtering method, and each signal is processed for features, which are given as input to a support vector machine (SVM) for landmine detection. Tests with three landmines buried under various ground conditions demonstrate that the proposed method can successfully detect multiple landmines.
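    The first step, estimating the number of buried objects by projecting signal energy onto the scan axis, can be sketched as follows. The B-scan, object positions and threshold are synthetic assumptions for illustration, not the paper's data or exact algorithm:

```python
import numpy as np

def count_objects_by_energy_projection(bscan, threshold_ratio=0.5):
    """Estimate the number of buried objects in a GPR B-scan by projecting
    signal energy onto the scan (horizontal) axis and counting the runs of
    columns whose energy exceeds a fraction of the peak energy."""
    energy = np.sum(np.asarray(bscan) ** 2, axis=0)      # energy per A-scan
    above = energy > threshold_ratio * energy.max()
    # Count rising edges of the thresholded profile (start of each run).
    return int(np.sum(np.diff(above.astype(int)) == 1) + int(above[0]))

# Synthetic B-scan: background noise plus two localized reflections.
rng = np.random.default_rng(0)
bscan = 0.01 * rng.standard_normal((64, 100))
bscan[20:30, 15:25] += 1.0   # object 1
bscan[25:35, 60:70] += 1.0   # object 2
count = count_objects_by_energy_projection(bscan)
```

    Each detected run would then be isolated and passed through the symmetry filtering, feature extraction and SVM stages of the full pipeline.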

  11. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches.

    Science.gov (United States)

    Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten

    2017-06-01

    This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, participant observation as a data collection method can be carried out in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable, and that grounded theory researchers should adhere to the method descriptions for performing participant observations within their selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.

  12. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in improving the regulatory process for current-generation and future-design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of, or in combination with, deterministic methods in developing regulatory requirements and reaching regulatory decisions.

  13. Silicon carbide optics for space and ground based astronomical telescopes

    Science.gov (United States)

    Robichaud, Joseph; Sampath, Deepak; Wainer, Chris; Schwartz, Jay; Peton, Craig; Mix, Steve; Heller, Court

    2012-09-01

    Silicon Carbide (SiC) optical materials are being applied widely for both space based and ground based optical telescopes. The material provides a superior weight to stiffness ratio, which is an important metric for the design and fabrication of lightweight space telescopes. The material also has superior thermal properties with a low coefficient of thermal expansion, and a high thermal conductivity. The thermal properties advantages are important for both space based and ground based systems, which typically need to operate under stressing thermal conditions. The paper will review L-3 Integrated Optical Systems - SSG’s (L-3 SSG) work in developing SiC optics and SiC optical systems for astronomical observing systems. L-3 SSG has been fielding SiC optical components and systems for over 25 years. Space systems described will emphasize the recently launched Long Range Reconnaissance Imager (LORRI) developed for JHU-APL and NASA-GSFC. Review of ground based applications of SiC will include supporting L-3 IOS-Brashear’s current contract to provide the 0.65 meter diameter, aspheric SiC secondary mirror for the Advanced Technology Solar Telescope (ATST).

  14. Linking Symbolic Interactionism and Grounded Theory Methods in a Research Design

    Directory of Open Access Journals (Sweden)

    Jennifer Chamberlain-Salaun

    2013-09-01

    Full Text Available This article focuses on Corbin and Strauss’ evolved version of grounded theory. In the third edition of their seminal text, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, the authors present 16 assumptions that underpin their conception of grounded theory methodology. The assumptions stem from a symbolic interactionism perspective of social life, including the themes of meaning, action and interaction, self and perspectives. As research design incorporates both methodology and methods, the authors aim to expose the linkages between the 16 assumptions and essential grounded theory methods, highlighting the application of the latter in light of the former. Analyzing the links between symbolic interactionism and essential grounded theory methods provides novice researchers and researchers new to grounded theory with a foundation from which to design an evolved grounded theory research study.

  15. Methodology for Selecting Best Management Practices Integrating Multiple Stakeholders and Criteria. Part 2: Case Study

    Directory of Open Access Journals (Sweden)

    Mauricio Carvallo Aceves

    2016-02-01

    Full Text Available The selection of stormwater Best Management Practices (BMPs) for mitigating the effects of urbanization on the hydrological cycle can be a complex process due to conflicting stakeholder views and varying levels of BMP performance across a range of criteria (runoff reduction, erosion control, etc.). Part 1 of this article proposed a methodology based on the application of multi-criteria decision aid (MCDA) methods, which was tested here on a residential stormwater network in the Montreal area. The case study considered green roofs, rain gardens, rain barrels and pervious pavement over a range of economic, social, and water quality and quantity criteria by applying four MCDA methods under three different stakeholder views. The results indicated Elimination et Choix Traduisant la Réalité (ELECTRE III) to be the most appropriate method for the methodology, offering flexibility concerning threshold values and criteria weights, and showing shared top choices across stakeholders (rain gardens, and rain gardens in combination with pervious pavement). The methodology shows potential for more formal applications and research opportunities. Future work may lie in the inclusion of multiple-objective optimization, better stakeholder engagement, estimation of economic benefits, water quality modeling, long-term hydrological simulations, and estimating real BMP pollutant removal rates.
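    ELECTRE III itself builds concordance and discordance indices with per-criterion thresholds; as a deliberately simpler stand-in, the basic shape of scoring BMP alternatives under one stakeholder's weights can be sketched with a weighted sum. All scores and weights below are hypothetical, not the case study's values:

```python
import numpy as np

# Hypothetical BMP performance scores (rows) on three criteria (columns):
# runoff reduction, cost (inverted so higher = better), social acceptance.
bmps = ["green roof", "rain garden", "rain barrel", "pervious pavement"]
scores = np.array([[0.8, 0.3, 0.7],
                   [0.9, 0.7, 0.8],
                   [0.4, 0.9, 0.6],
                   [0.7, 0.5, 0.5]])

# One stakeholder view = one weight vector (sums to 1).
weights = np.array([0.5, 0.3, 0.2])

# Rank alternatives by weighted score, best first.
ranking = sorted(zip(bmps, scores @ weights), key=lambda p: -p[1])
```

    Repeating the ranking under several weight vectors mimics the multiple stakeholder views of the study, whereas a full ELECTRE III implementation would additionally apply indifference, preference and veto thresholds on each criterion.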

  16. Vision-based Ground Test for Active Debris Removal

    Directory of Open Access Journals (Sweden)

    Seong-Min Lim

    2013-12-01

    Full Text Available Due to continuous space development, the number of space objects, including space debris, in orbits around the Earth has increased, and difficulties for space development and activities are accordingly expected in the near future. This study describes the implementation of a vision-based technique for approaching space debris from a far-range rendezvous state to a proximity state, one of the stages of space debris removal, together with the ground test results. For vision-based object tracking, the fast and robust CAM-shift algorithm was combined with a Kalman filter, and a stereo camera was used to measure the distance to the tracked object. A sun simulator was used to construct a low-cost space-environment simulation test bed, and a two-dimensional mobile robot served as the approaching platform. The tracking status was examined while changing the position of the sun simulator; the results indicated that CAM-shift achieved a tracking rate of about 87% and that the relative distance could be measured down to 0.9 m. In addition, considerations for future space-environment simulation tests are proposed.
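    For rectified stereo images, the range measurement reduces to the standard pinhole relation Z = f·B/d, where f is the focal length in pixels, B the baseline and d the disparity. A minimal sketch with assumed calibration values (not the paper's hardware parameters):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Distance to a tracked object from a calibrated, rectified stereo pair:
    Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("object must have positive disparity")
    return focal_px * baseline_m / disparity_px

# Illustrative calibration: 700 px focal length, 12 cm baseline.
depth = stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=93.3)
```

    Because depth is inversely proportional to disparity, range resolution degrades quadratically with distance, which is consistent with a minimum reliably measurable distance on the order of a metre in a small test bed.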

  17. Methodology for the Study of the Envelope Airtightness of Residential Buildings in Spain: A Case Study

    Directory of Open Access Journals (Sweden)

    Feijó-Muñoz Jesús

    2018-03-01

    Full Text Available Air leakage and its impact on the energy performance of dwellings has been broadly studied in countries with cold climates in Europe, the US, and Canada, but there is a lack of knowledge in this field in Mediterranean countries. Current Spanish building regulations establish ventilation rates based on ideally airtight envelopes, causing problems of over-ventilation and substantial energy losses. The aim of this paper is to develop a methodology for characterizing the envelope of the housing stock in Spain so that ventilation rates can be adjusted to take air leakage into consideration. The methodology is easily applicable to other countries interested in studying envelope airtightness and improving its energy behaviour. A statistical sampling method was established to determine the dwellings to be tested, considering the relevant variables concerning airtightness: climate zone, year of construction, and typology. The air leakage rate is determined using a standardized building pressurization technique according to European Standard EN 13829. A representative case study is presented as an example of the implementation of the designed methodology, and results are compared to preliminary values obtained from the database.
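    The pressurization technique fits a power law Q = C·ΔP^n to the measured flow/pressure pairs and reports the air change rate at the 50 Pa reference pressure (n50 = Q50/V). A sketch with made-up readings and an assumed dwelling volume:

```python
import numpy as np

# Pressurization test readings (assumed example data): pressure difference
# across the envelope [Pa] and measured airflow [m^3/h].
dp = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
q = 120.0 * dp ** 0.65          # synthetic data following Q = C * dP^n

# Least-squares fit of the power law in log-log space.
n, log_c = np.polyfit(np.log(dp), np.log(q), 1)
c = np.exp(log_c)

volume = 250.0                  # internal volume of the dwelling [m^3]
q50 = c * 50.0 ** n             # airflow at the 50 Pa reference pressure
n50 = q50 / volume              # air change rate at 50 Pa [1/h]
```

    In practice the standard also prescribes corrections (air density, combined pressurization/depressurization runs) that this sketch omits; the log-log regression is the core of the envelope characterization.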

  18. How to Mutually Advance General Education and Major-Based Education: A Grounded Theory Study on the Course Level

    Science.gov (United States)

    Fang, Hualiang

    2018-01-01

    The author employs grounded theory to investigate the teaching process of an interdisciplinary general education course at A University as a case. The author finds that under the condition of rather concrete relations between the subject of a major-based course and that of an elected general education course, if the major course is taught with a…

  19. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimally locating sensors to protect a population against exposure to, and the effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing the population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor, i.e., the fraction of the total risk accounted for by placement of that sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors.
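    The iterative idea, place a sensor, remove the threats it detects, credit it with its marginal utility, can be sketched as a greedy loop. The threats, risk values and coverage sets below are invented for illustration:

```python
# Each candidate sensor detects a subset of threat scenarios, and each
# scenario carries a risk value (population at risk). At every step, place
# the sensor covering the most residual risk, remove the threats it detects,
# and record its marginal utility (fraction of total risk accounted for).
risk = {"t1": 5.0, "t2": 3.0, "t3": 2.0, "t4": 1.0}
covers = {"A": {"t1", "t2"}, "B": {"t2", "t3"}, "C": {"t3", "t4"}}

total = sum(risk.values())
remaining = set(risk)
placement = []                            # (sensor, marginal utility) pairs
while remaining and len(placement) < 2:   # halting criterion: 2 sensors
    best = max(covers, key=lambda s: sum(risk[t] for t in covers[s] & remaining))
    gain = sum(risk[t] for t in covers[best] & remaining)
    if gain == 0:
        break                             # no sensor adds utility
    placement.append((best, gain / total))
    remaining -= covers[best]
```

    The recorded marginal utilities also support the other halting criteria mentioned above: the loop can equally stop when `gain / total` drops below a threshold or when the cumulative utility reaches a target.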

  20. Review: Franz Breuer with Assistance of Barbara Dieris and Antje Lettau (2009). Reflexive Grounded Theory. Eine Einführung für die Forschungspraxis [Reflexive Grounded Theory: An Introduction to Research Praxis]

    Directory of Open Access Journals (Sweden)

    Sandra Da Rin

    2010-03-01

    Full Text Available This textbook by Franz BREUER, produced with the assistance of Barbara DIERIS and Antje LETTAU, is of interest more for the introduction it provides to reflexive research praxis than for grounded theory methodology. Reflexivity here means that the subjectivity of the researcher is included in the research process as a decisive source of cognition. Reflexive grounded theory methodology is characterized by three elements that also structure the textbook. In the present review, I focus on two of these in detail: the ethnography-based approach to the research field, in particular its epistemological prerequisites, and the inclusion of (self-)reflexivity. The latter points to questions that are addressed at the end of this review. URN: urn:nbn:de:0114-fqs1002140

  1. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks: expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihoods of faults and symptoms. The methodology produces the likelihood of component-failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants, where it has led to the diagnosis of various performance issues. Three case studies based on this plant's data and model are presented and illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network for estimating the nuclear reactor power by interpreting several plant indicators is then illustrated, and the use of the Bayesian belief networks is finally described. (author)
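    At its core, a fault-symptom belief network combines prior fault likelihoods with symptom conditionals; the single-edge case is just Bayes' rule. The numbers below are illustrative, not EnergiTools parameters:

```python
def fault_posterior(prior, p_symptom_given_fault, p_symptom_given_ok):
    """Posterior probability of a component fault given an observed symptom,
    via Bayes' rule -- the single-node core of a belief-network diagnosis."""
    evidence = (p_symptom_given_fault * prior
                + p_symptom_given_ok * (1.0 - prior))
    return p_symptom_given_fault * prior / evidence

# Illustrative numbers: rare fault, sensitive symptom, some false alarms.
posterior = fault_posterior(prior=0.01,
                            p_symptom_given_fault=0.9,
                            p_symptom_given_ok=0.05)
```

    Even with a highly sensitive symptom, a rare fault yields a modest posterior (here about 0.15), which is why such tools combine several symptoms and historical likelihoods before ranking root causes.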

  2. A usage-centered evaluation methodology for unmanned ground vehicles

    NARCIS (Netherlands)

    Diggelen, J. van; Looije, R.; Mioch, T.; Neerincx, M.A.; Smets, N.J.J.M.

    2012-01-01

    This paper presents a usage-centered evaluation method to assess the capabilities of a particular Unmanned Ground Vehicle (UGV) for establishing the operational goals. The method includes a test battery consisting of basic tasks (e.g., slalom, funnel driving, object detection). Tests can be of

  3. Resisting Coherence: Trans Men's Experiences and the Use of Grounded Theory Methods

    Science.gov (United States)

    Catalano, D. Chase J.

    2017-01-01

    In this methodological reflective manuscript, I explore my decision to use a grounded theoretical approach to my dissertation study on trans* men in higher education. Specifically, I question whether grounded theory as a methodology is capable of capturing the complexity and capaciousness of trans*-masculine experiences. Through the lenses of…

  4. Effects of Long-Duration Ground Motions on Liquefaction Hazards

    Science.gov (United States)

    Greenfield, Michael W.

    Soil liquefaction during past earthquakes has caused extensive damage to buildings, bridges, dams, pipelines, and other elements of infrastructure. Geotechnical engineers use empirical observations from earthquake case histories in conjunction with soil mechanics to predict the behavior of liquefiable soils. However, current empirical databases are insufficient to evaluate the behavior of soils subject to long-duration earthquakes, such as a possible Mw = 9.0 Cascadia Subduction Zone earthquake. The objective of this research is to develop insight into the triggering and effects of liquefaction due to long-duration ground motions and to provide recommendations for analysis and design. Recorded ground motions from 21 case histories with surficial evidence of liquefaction showed marked differences in soil behavior before and after liquefaction was triggered. In some cases, strong shaking continued for several minutes after the soil liquefied, and a variety of behaviors were observed, including dilation pulses, continued softening due to soil fabric degradation, and soil stiffening due to pore pressure dissipation and drainage. Supplemental field and laboratory investigations were performed at three sites that liquefied during the 2011 Mw = 9.0 Tohoku earthquake. The recorded ground motions and field investigation data were used in conjunction with laboratory observations, analytical models, and numerical models to evaluate the behavior of liquefiable soils subjected to long-duration ground motions. Observations from the case histories inspired a framework to predict ground deformations based on the differences in soil behavior before and after liquefaction has triggered. This framework decouples the intensity of shaking necessary to trigger liquefaction from the intensity of shaking that drives deformation by identifying the time when liquefaction triggers. The timing-based framework promises to dramatically reduce the uncertainty in deformation estimates compared to

  5. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  6. Effects on ground motion related to spatial variability

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.

    1987-01-01

    Models of the spectral content and the space-time correlation structure of strong earthquake ground motion are combined with transient random vibration analysis to yield site-specific response spectra that can account for the effect of local spatial averaging of the ground motion across a rigid foundation of prescribed size. The methodology is presented with reference to sites in eastern North America, although the basic approach is applicable to other seismic regions provided the source and attenuation parameters are regionally adjusted. Parameters in the spatial correlation model are based on data from the SMART-1 accelerograph array, and the sensitivity of response spectra reduction factors with respect to these parameters is examined. The starting point of the analysis is the Fourier amplitude spectrum of site displacement, expressed as a function of earthquake source parameters and source-to-site distance. The bedrock acceleration spectral density function at a point, derived from the displacement spectrum, is modified to account for anelastic attenuation and, where appropriate, for local soil effects and/or local spatial averaging across a foundation. Transient random vibration analysis yields approximate analytical expressions for median ground motion amplitudes and median response spectra of an earthquake defined in terms of its spectral density function and strong-motion duration. The methodology is illustrated for three events characterized by their m_b magnitude and epicentral distance. The focus in this paper is on the stochastic response prediction methodology, which enables explicit accounting for strong-motion duration and the effect of local spatial averaging on response spectra. The numerical examples enable a preliminary assessment of the reduction of response spectral amplitudes attributable to local spatial averaging across rigid foundations of different sizes. 36 refs

  7. Evaluating data worth for ground-water management under uncertainty

    Science.gov (United States)

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models, a chance-constrained ground-water management model and an integer-programming sampling network design model, to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) the optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information, i.e., the projected reduction in management costs, with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with the greatest net economic benefit for ground-water management.
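    Repeating steps 2-4 over a series of budgets amounts to maximizing net benefit: the value of sample information (projected reduction in management cost) minus the cost of data collection. A toy sketch with invented costs:

```python
# Toy version of the data-worth loop: for each sampling budget, compare the
# projected reduction in management cost against the cost of collecting the
# data, then pick the budget with the best net benefit. Numbers are invented.
base_management_cost = 100.0
# budget -> projected management cost after sampling at that budget
projected_cost = {0: 100.0, 10: 85.0, 20: 76.0, 30: 72.0, 40: 71.0}

def net_benefit(budget):
    value_of_information = base_management_cost - projected_cost[budget]
    return value_of_information - budget   # data worth = VOI minus sampling cost

best_budget = max(projected_cost, key=net_benefit)
```

    The diminishing returns built into `projected_cost` mirror the typical outcome: beyond some budget, additional sampling reduces management cost by less than the sampling itself costs.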

  8. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan

    2010-01-01

    The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  9. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular, we have studied the design of methods and supporting software for this process. Detailed and systematic descriptions of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also described and evaluated together with the supporting KNOWLEDGE-LINKER architecture. (au)

  10. Development of ground-based wind energy in DOM and Corsica - Joint CGEDD / CGEIET report

    International Nuclear Information System (INIS)

    Joannis de Verclos, Christian de; Albrecht, Patrick; Iselin, Philippe; Legait, Benoit; Vignolles, Denis

    2012-09-01

Addressing the particular cases of the French overseas departments (DOM: Guadeloupe, Martinique, Guyana, Mayotte, La Reunion) and Corsica, this report analyzes four main topics: the objectives and challenges of ground-based wind energy (sustainable development, non-interconnected areas, and the public service of electricity supply); the local situations and their cartography; the legal issues and the possible evolution options (energy law, environmental law, urban planning law, local community law); and the modalities of project devolution. The authors highlight the issues which require a new legal framework, notably governance and the devolution procedure.

  11. An Image Analysis-Based Methodology for Chromite Exploration through Opto-Geometric Parameters; a Case Study in Faryab Area, SE of Iran

    Directory of Open Access Journals (Sweden)

    Mansur Ziaii

    2017-06-01

Full Text Available Traditional methods of chromite exploration are mostly based on geophysical techniques and drilling operations. They are expensive and time-consuming, and they suffer from several shortcomings such as a lack of sufficient geophysical density contrast. To overcome these drawbacks, the current research work introduces a novel, automatic, opto-geometric image analysis (OGIA) technique for extracting the structural properties of chromite minerals using polished thin sections prepared from outcrops. Several images are taken from the polished thin sections through a reflected-light microscope equipped with a digital camera. The images are processed in filtering and segmentation steps to extract the worthwhile information on chromite minerals. The directional density of chromite minerals, as a textural property, is studied at different inclinations, and the main trend of chromite growth is identified. The microscopic inclination of chromite veins can be generalized for exploring macroscopic layers of chromite buried under either surface quaternary alluvium or overburden rocks. The performance of the OGIA methodology is tested in a real case study in which several exploratory boreholes were drilled. The results show that the microscopic trends identified through image analysis are in good agreement with the borehole interpretations. The OGIA method yields a reliable map of the presence or absence of chromite ore at different horizontal levels. Directing exploration toward more susceptible zones (potential targets) and avoiding wasted time and money are the major contributions of the OGIA methodology, supporting sound managerial and economic decisions.
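The directional-density idea in this abstract can be sketched numerically: rotate a binary mineral mask through candidate inclinations and score how strongly the mineral pixels concentrate into lines. This is a minimal illustration only, not the authors' OGIA pipeline; the synthetic mask, the angle grid and the variance-based score are all invented for the example.

```python
import numpy as np
from scipy import ndimage

def directional_density(mask, angles):
    """For each candidate inclination, rotate the binary mineral mask so the
    tested direction becomes horizontal, then score how strongly mineral
    pixels concentrate into rows (variance of per-row densities)."""
    scores = {}
    for angle in angles:
        rotated = ndimage.rotate(mask.astype(float), angle,
                                 reshape=False, order=0)
        row_density = rotated.mean(axis=1)   # density along each horizontal line
        scores[angle] = float(row_density.var())
    return scores

def main_trend(mask, angles=range(0, 180, 15)):
    """The angle whose rotation maximises row concentration is taken as the
    dominant growth direction of the veins."""
    scores = directional_density(mask, angles)
    return max(scores, key=scores.get)

# Synthetic polished-section mask: two horizontal chromite "veins".
section = np.zeros((60, 60))
section[20:23, :] = 1.0
section[40:42, :] = 1.0
print(main_trend(section))   # horizontal veins -> dominant trend at 0 degrees
```

The variance score is a simple stand-in for the texture measures the paper's filtering and segmentation steps would provide.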

  12. Integrated HPTLC-based Methodology for the Tracing of Bioactive Compounds in Herbal Extracts Employing Multivariate Chemometrics. A Case Study on Morus alba.

    Science.gov (United States)

    Chaita, Eliza; Gikas, Evagelos; Aligiannis, Nektarios

    2017-03-01

In drug discovery, bioassay-guided isolation is a well-established procedure and still the basic approach for the discovery of natural products with desired biological properties. However, the most laborious and time-consuming step in these procedures is the isolation of the bioactive constituents. Prior identification of the compounds that contribute to the demonstrated activity of the fractions would enable the selection of proper chromatographic techniques and lead to targeted isolation. Objective - To develop an integrated HPTLC-based methodology for the rapid tracing of bioactive compounds during bioassay-guided processes, using multivariate statistics. Materials and Methods - The methanol extract of Morus alba was fractionated employing CPC. Subsequently, fractions were assayed for tyrosinase inhibition and analyzed with HPTLC. The PLS-R algorithm was applied in order to correlate the analytical data with the biological response of the fractions and identify the compounds with the highest contribution. Two methodologies were developed for the generation of the dataset: one based on manual peak picking and the second based on chromatogram binning. Results and Discussion - Both methodologies afforded comparable results and were able to trace the bioactive constituents (e.g. oxyresveratrol, trans-dihydromorin, 2,4,3'-trihydroxydihydrostilbene). The suggested compounds were compared, in terms of Rf values and UV spectra, with compounds isolated from M. alba using a typical bioassay-guided process. Chemometric tools supported the development of a novel HPTLC-based methodology for the tracing of tyrosinase inhibitors in M. alba extract. All steps of the experimental procedure implemented techniques that afford essential key elements for application in high-throughput screening procedures for drug discovery purposes. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Ground based measurements of particulate emissions from supersonic transports. Concorde olympus engine

    Energy Technology Data Exchange (ETDEWEB)

    Whitefield, Ph D; Hagen, D E [Missouri Univ., Rolla, MO (United States). Cloud and Aerosol Sciences Lab.; Lilenfeld, H V [McDonnell Douglas Corp., St. Louis, MO (United States)

    1998-12-31

The application of a mobile aerosol monitoring facility, the Mobile Aerosol Sampling System (MASS), to characterize engine aerosol emissions from the Rolls Royce Olympus engine is described. The multi-configurational MASS has been employed in both ground and airborne field operations and has been successfully flown on research aircraft. On the ground, the MASS has participated in numerous jet-engine-related tests and has been deployed to resolve aerosol generation problems in a high-power chemical laser system. In all cases the measurements were made on samples taken from a harsh physical and chemical environment, with both high and low temperature and pressure, and in the presence of highly reactive gases. (R.P.) 9 refs.

  15. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety-critical functions in high-speed rail or magnetic levitation ...

  16. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)
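The Extreme Value Statistics ingredient mentioned above can be illustrated with a toy calculation: fit a Gumbel distribution to simulated block maxima and compare a high quantile against a limit. The numbers, the "overpower ratio" stand-in and the 0.95 quantile are all invented for illustration; this is not the NOP analysis itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical stand-in for a statistically based analysis: each trial yields
# the maximum channel overpower ratio over a simulated core state, and the
# compliance question is whether an extreme quantile of that maximum stays
# below an analysis limit.
trial_maxima = rng.normal(0.90, 0.02, size=(500, 20)).max(axis=1)

# Extreme Value Statistics: block maxima are modelled with a Gumbel law.
loc, scale = stats.gumbel_r.fit(trial_maxima)
q95 = stats.gumbel_r.ppf(0.95, loc, scale)

LIMIT = 1.0   # illustrative compliance limit on the overpower ratio
print(q95 < LIMIT)
```

The Gumbel family is the classical limit law for maxima, which is why block maxima (rather than the raw samples) are the quantity being fitted.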

  17. RECONSTRUCTION OF SKY ILLUMINATION DOMES FROM GROUND-BASED PANORAMAS

    Directory of Open Access Journals (Sweden)

    F. Coubard

    2012-07-01

    Full Text Available The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.
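The Perez all-weather model referenced in this abstract expresses the relative radiance of a sky element as a function of its zenith angle theta and its angular distance gamma to the sun. A minimal sketch follows; the five coefficient values are illustrative clear-sky-like numbers, whereas in the paper's workflow they would be fitted to the blue-sky pixels of the panoramas.

```python
import math

# Perez all-weather relative radiance: the product of a gradation term
# (depends on the zenith angle theta of the sky element) and an indicatrix
# term (depends on the angle gamma between the sky element and the sun).
def perez_relative_radiance(theta, gamma,
                            a=-1.0, b=-0.32, c=10.0, d=-3.0, e=0.45):
    gradation = 1.0 + a * math.exp(b / max(math.cos(theta), 1e-6))
    indicatrix = 1.0 + c * math.exp(d * gamma) + e * math.cos(gamma) ** 2
    return gradation * indicatrix

# With clear-sky-like coefficients, radiance peaks near the sun (small gamma).
near_sun = perez_relative_radiance(theta=0.5, gamma=0.05)
far_sky = perez_relative_radiance(theta=0.5, gamma=2.0)
print(near_sun > far_sky)
```

Once fitted, evaluating this function over the sky hemisphere gives the additive corrections and the environment-map fill-in described in the abstract.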

  18. Methodological challenges in assessing the environmental status of a marine ecosystem: case study of the Baltic Sea.

    Directory of Open Access Journals (Sweden)

    Henn Ojaveer

Full Text Available Assessments of the environmental status of marine ecosystems are increasingly needed to inform management decisions and regulate human pressures to meet the objectives of environmental policies. This paper addresses some generic methodological challenges, and the related uncertainties, involved in marine ecosystem assessment, using the central Baltic Sea as a case study. The objectives of good environmental status of the Baltic Sea largely focus on biodiversity, eutrophication and hazardous substances. In this paper, we conduct comparative evaluations of the status of these three segments by applying different methodological approaches. Our analyses indicate that the assessment results are sensitive to the selection of indicators for ecological quality objectives that are affected by a broad spectrum of human activities and natural processes (biodiversity), and less so for objectives that are influenced by a relatively narrow array of drivers (eutrophication, hazardous substances). The choice of indicator aggregation rule proved to be of essential importance for the assessment results in all three segments, whereas the hierarchical structure of indicators had only a minor influence. Trend-based assessment was shown to be a useful supplement to reference-based evaluation, being independent of the problems related to defining reference values and indicator aggregation methodologies. The results of this study will help in setting priorities for future efforts to improve environmental assessments in the Baltic Sea and elsewhere, and in ensuring the transparency of the assessment procedure.
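Why the aggregation rule matters can be shown with a two-line example: the same indicator values can pass a compensatory averaging rule yet fail a "one-out-all-out" rule, where the worst indicator decides. The indicator names, values and the 0.6 "good" threshold below are invented for illustration.

```python
# Two common indicator-aggregation rules in status assessment: averaging
# (compensatory) versus one-out-all-out (the worst indicator decides).
# Status values are normalised so that >= 0.6 counts as "good".
def average_rule(indicators):
    return sum(indicators.values()) / len(indicators)

def one_out_all_out(indicators):
    return min(indicators.values())

biodiversity = {"seabirds": 0.8, "fish": 0.7, "benthos": 0.4}

GOOD = 0.6
print(average_rule(biodiversity) >= GOOD)     # averaging -> "good"
print(one_out_all_out(biodiversity) >= GOOD)  # one-out-all-out -> "not good"
```

The divergence grows with the number of indicators, which is consistent with the abstract's finding that the aggregation choice mattered most for the indicator-rich biodiversity segment.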

  19. High energy astrophysics with ground-based gamma ray detectors

    International Nuclear Information System (INIS)

    Aharonian, F; Buckley, J; Kifune, T; Sinnis, G

    2008-01-01

Recent advances in ground-based gamma ray astronomy have led to the discovery of more than 70 sources of very high energy (Eγ ≥ 100 GeV) gamma rays, falling into a number of source populations including pulsar wind nebulae, shell-type supernova remnants, Wolf-Rayet stars, giant molecular clouds, binary systems, the Galactic Center, active galactic nuclei and 'dark' (as yet unidentified) galactic objects. We summarize the history of TeV gamma ray astronomy up to the current status of the field, including a description of experimental techniques, and highlight recent astrophysical results. We also discuss the potential of ground-based gamma ray astronomy for future discoveries and describe possible directions for future instrumental developments.

  20. Guidelines for reporting evaluations based on observational methodology.

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  1. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    Dori Barnett

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about da...

  2. Research on the evaluation indicators of skilled employees’ career success based on grounded theory

    Directory of Open Access Journals (Sweden)

    Fulei Chu

    2015-04-01

Full Text Available Purpose: To summarize and sort the career success evaluation indicators of skilled employees. Design/methodology/approach: Based on Grounded Theory, through interviews with and questionnaires administered to railway skilled employees. Findings: The study shows that three groups of indicators are important for evaluating the career success of skilled employees: "subjective career success", including work-family balance, life satisfaction, career satisfaction and perception of career success; "objective career success", including level of total revenue, growth rate of wage and number of promotions; and "knowledge and skills career success", including upgrading of knowledge and skills, classification of skills, external competitiveness and job autonomy. Originality/value: The results show that across different age groups, titles and positions of skilled employees there is a significant difference in the choice of career success evaluation indicators. This provides a useful reference for establishing a career development system for skilled employees.

  3. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
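The core OBEST idea, which is a recursive walk over probabilistic object branches that enumerates every scenario with its exact likelihood, can be sketched in a few lines. The two-object model below is hypothetical and bears no relation to the Sandia tool's actual object model; it only shows why exhaustive enumeration naturally surfaces the rare paths a Monte Carlo sampler would likely miss.

```python
# Hypothetical two-object model: each object exposes probabilistic branches.
MODEL = {
    "pump":   [("runs", 0.999), ("fails", 0.001)],
    "backup": [("starts", 0.98), ("fails", 0.02)],
}

def scenarios(objects, prefix=(), prob=1.0):
    """Recursively enumerate every event ordering ("scenario") together with
    its likelihood, computed analytically rather than by sampling."""
    if not objects:
        yield prefix, prob
        return
    name = objects[0]
    for outcome, p in MODEL[name]:
        yield from scenarios(objects[1:], prefix + ((name, outcome),), prob * p)

all_paths = list(scenarios(["pump", "backup"]))
assert abs(sum(p for _, p in all_paths) - 1.0) < 1e-12  # probabilities close

rare = min(all_paths, key=lambda sp: sp[1])
print(len(all_paths), rare)   # 4 scenarios; the rarest is the double failure
```

Because likelihoods fall directly out of the recursion, a 2e-5 double-failure scenario is reported exactly, with no statistical inference over simulation runs.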

  4. Developing purchasing strategy: a case study of a District Health Authority using soft systems methodology.

    Science.gov (United States)

    Brown, A D

    1997-02-01

    This paper examines the attempt by a District Health Authority (DHA) to create structures (called Purchasing Strategy Groups or PSGs) to facilitate the effective development of its purchasing strategy. The paper is based on a case study design conducted using Soft Systems Methodology (SSM). The research contribution the paper makes is twofold. First, it analyses some of the fundamental management-related difficulties that a DHA can experience when attempting to come to terms with its role and responsibilities in the 1990s. Second, it provides a discussion and evaluation of the utility of SSM for qualitative research in the National Health Service (NHS) in the UK.

  5. Review and evaluation of paleohydrologic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  7. SCIENTIFIC EFFICIENCY OF GROUND-BASED TELESCOPES

    International Nuclear Information System (INIS)

    Abt, Helmut A.

    2012-01-01

I scanned the six major astronomical journals of 2008 for all 1589 papers that are based on new data obtained from ground-based optical/IR telescopes worldwide. I then collected data on the numbers of papers, citations to them in 3+ years, the most-cited papers, and annual operating costs. These data are assigned to four groups by telescope aperture. For instance, papers from telescopes with apertures >7 m average 1.29 times as many citations as those from 2-7 m telescopes. I wonder why the large telescopes do so relatively poorly and suggest possible reasons. I also found that papers based on archival data, such as the Sloan Digital Sky Survey, produce 10.6% as many papers and 20.6% as many citations as those based on new data. Also, the 577.2 papers based on radio data produced 36.3% as many papers and 33.6% as many citations as the 1589 papers based on optical/IR telescopes.

  8. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the expected spectral information to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage of the asteroid we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use this spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and demonstrate them here. The NIRS will cover a wavelength range from 0.85 to 2.1 micrometers and have a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine-to-pyroxene abundance ratio is critically dependent on the attributes of the 1.0 and 2.0 micrometer features. With a cut-off near 2.1 micrometers, the longer edge of the 2.0 micrometer feature will not be obtained by the NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer wavelength spectral attributes. Similarly, the shorter wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should also be considered.

  9. Methodologies Developed for EcoCity Related Projects: New Borg El Arab, an Egyptian Case Study

    Directory of Open Access Journals (Sweden)

    Carmen Antuña-Rozado

    2016-08-01

Full Text Available The aim of the methodologies described here is to propose measures and procedures for developing concepts and technological solutions, adapted to local conditions, to build sustainable communities in developing countries and emerging economies. These methodologies are linked to the EcoCity framework outlined by VTT Technical Research Centre of Finland Ltd. for sustainable community and neighbourhood regeneration and development. The framework is the result of long experience in numerous EcoCity related projects, mainly Nordic and European in scope, which has been reformulated in recent years to respond to local needs in the previously mentioned countries. There is also a particular emphasis on close collaboration with local partners and major stakeholders. In order to illustrate how these methodologies can support EcoCity concept development and implementation, results from a case study in Egypt are discussed. The case study concerns the transformation of New Borg El Arab (NBC), near Alexandria, into an EcoCity. The viability of the idea was explored using different methodologies (Roadmap, Feasibility Study, Residents Energy Survey, and Building Consumption Assessment) and considering the Residential, Commercial/Public Facilities, Industrial, Services/Utilities, and Transport sectors.

  10. A systematic methodology for design of tailor-made blended products

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza Binti; Gernaey, Krist; Woodley, John

    2014-01-01

    A systematic methodology for design of tailor-made blended products has been developed. In tailor-made blended products, one identifies the product needs and matches them by blending different chemicals. The systematic methodology has four main tasks. First, the design problem is defined: the pro......, the methodology is highlighted through two case studies involving gasoline blends and lubricant base oils....

  11. Performance Based Criteria for Ship Collision and Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2009-01-01

    The paper outlines a probabilistic procedure whereby the maritime industry can develop performance based rules to reduce the risk associated with human, environmental and economic costs of collision and grounding events and identify the most economic risk control options associated with prevention...

  12. Using Modern Methodologies with Maintenance Software

    Science.gov (United States)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority; the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. The Scrum methodology defines a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: MPS has many software applications in maintenance, team members working on disparate applications, many users, and work that is interruptible based on mission needs, issues and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper describes the development of the process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.

  13. A comparison of ground-based hydroxyl airglow temperatures with SABER/TIMED measurements over 23° N, India

    Science.gov (United States)

    Parihar, Navin; Singh, Dupinder; Gurubaran, Subramanian

    2017-03-01

Ground-based observations of OH (6, 2) Meinel band nightglow were carried out at Ranchi (23.3° N, 85.3° E), India, during January-March 2011, December 2011-May 2012 and December 2012-March 2013 using an all-sky imaging system. Near the mesopause, OH temperatures were derived from the OH (6, 2) Meinel band intensity information. A limited comparison of OH temperatures (TOH) with SABER/TIMED measurements in 30 cases was performed by defining an almost-coincident criterion of ±1.5° latitude-longitude and ±3 min relative to the ground-based observations. Using the SABER OH 1.6 and 2.0 µm volume emission rate profiles as the weighting function, two sets of OH-equivalent temperatures (T1.6 and T2.0, respectively) were estimated from its kinetic temperature profile for comparison with the OH nightglow measurements. Overall, fair agreement existed between the ground-based and SABER measurements in the majority of events within the limits of experimental errors. The mean values of the OH-derived temperatures and the two SABER OH-equivalent temperatures were 197.3 ± 4.6, 192.0 ± 10.8 and 192.7 ± 10.3 K, respectively, and the ground-based temperatures were 4-5 K warmer than the SABER values. A difference of 8 K or more is noted between the two measurements when the peak of the OH emission layer lies in the vicinity of large temperature inversions. A comparison of OH temperatures derived using different sets of Einstein transition probabilities with the SABER measurements was also performed; OH temperatures derived using the Langhoff et al. (1986) transition probabilities were found to compare well.
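The OH-equivalent temperature described above is essentially a weighted mean: the SABER kinetic temperature profile is averaged with the OH volume emission rate (VER) profile as the weighting function. A minimal sketch follows; the altitude grid, the linear temperature lapse and the Gaussian OH layer are all made-up illustrative profiles, not SABER data.

```python
import numpy as np

# Illustrative profiles near the mesopause region.
altitude = np.arange(80.0, 100.0, 1.0)                 # km
t_kinetic = 200.0 - 0.5 * (altitude - 80.0)            # K, made-up lapse
ver = np.exp(-0.5 * ((altitude - 87.0) / 4.0) ** 2)    # Gaussian OH layer

# OH-equivalent temperature: VER-weighted mean of the kinetic temperature.
t_oh_equivalent = np.sum(t_kinetic * ver) / np.sum(ver)
print(round(t_oh_equivalent, 1))
```

Because the weighting collapses the whole profile onto the emission layer, a temperature inversion near the layer peak shifts this value noticeably, consistent with the 8 K discrepancies the abstract reports near large inversions.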

  14. El aprendizaje basado en problemas como estrategia para el cambio metodológico en los trabajos de laboratorio Problem based learning as estrategy for methodological change in laboratory work

    Directory of Open Access Journals (Sweden)

    Juan-Antonio Llorens-Molina

    2010-01-01

Full Text Available Problem Based Learning (PBL) can be used as a strategy for methodological change in conventional learning environments. In this paper, the integration of laboratory work into PBL-grounded activities during an introductory organic chemistry course is described, and the most decisive issues of its implementation are discussed. The results show how this methodology favours the contextualization of laboratory work within the subject matter and promotes Science-Technology-Society-Environment relationships. Besides, it contributes to the development of competences such as planning and organization skills, information search and selection, and cooperative work, as well as to the improvement of tutorial support.

  15. Validation and Comparison of One-Dimensional Ground Motion Methodologies

    International Nuclear Information System (INIS)

    B. Darragh; W. Silva; N. Gregor

    2006-01-01

    Both point- and finite-source stochastic one-dimensional ground motion models, coupled to vertically propagating equivalent-linear shear-wave site response models, are validated using an extensive set of strong motion data as part of the Yucca Mountain Project. The validation and comparison exercises are presented entirely in terms of 5% damped pseudo absolute response spectra. The study consists of a quantitative analysis involving the modeling of nineteen well-recorded earthquakes, M 5.6 to 7.4, at over 600 sites. The sites range in distance from about 1 to about 200 km in the western US (460 km for the central-eastern US). In general, this validation demonstrates that the stochastic point- and finite-source models produce accurate predictions of strong ground motions over the range of 0 to 100 km and for magnitudes M 5.0 to 7.4. The stochastic finite-source model appears to be broadband, producing near-zero bias from about 0.3 Hz (the low-frequency limit of the analyses) to the high-frequency limit of the data (100 and 25 Hz for response and Fourier amplitude spectra, respectively).

  16. Supporting Problem Solving with Case-Stories Learning Scenario and Video-based Collaborative Learning Technology

    Directory of Open Access Journals (Sweden)

    Chun Hu

    2004-04-01

    Full Text Available In this paper, we suggest that case-based resources, which are used for assisting cognition during problem solving, can be structured around the work on narratives in sociocultural psychology. Theories and related research have proposed structures within narratives and stories which may be useful to the design of case-based resources. Moreover, embedded within cases are stories which are contextually rich, supporting the epistemological groundings of situated cognition. The purposes of this paper are therefore to discuss possible frameworks of case-stories; to derive design principles as to what constitutes a good case-story or narrative; and to suggest how technology can support story-based learning. We adopt video-based Computer-Supported Collaborative Learning (CSCL) technology to support problem solving with case-stories learning scenarios. Our hypothesis is that well-designed case-based resources are able to aid the cognitive processes undergirding problem solving and meaning making. We also suggest the use of an emerging video-based collaborative learning technology to support such an instructional strategy.

  17. A methodology based on openEHR archetypes and software agents for developing e-health applications reusing legacy systems.

    Science.gov (United States)

    Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco

    2016-10-01

    In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at any time and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to those systems. This methodology was applied in the design of an agent-based system, which was used in a realistic healthcare scenario in which a medical staff meeting held to prepare a cardiac surgery was supported. We conducted experiments with this system in a distributed environment composed of three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil), and evaluated the system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of the system in particular, and with respect to task delegation to software agents in general. The case study also showed that a software agent-based interface and a tools-based alternative must be provided to end users, allowing them to perform tasks themselves or to delegate them to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers.
The proposed methodology allows designers to build communication systems for the message exchange among heterogeneous healthcare systems, and to shift from systems that rely on informal communication of actors to

  18. Site study plan for geochemical analytical requirements and methodologies: Revision 1

    International Nuclear Information System (INIS)

    1987-12-01

    This site study plan documents the analytical methodologies and procedures that will be used to geochemically analyze the rock and fluid samples collected during Site Characterization. Information relating to the quality aspects of these analyses is also provided, where available. Most of the proposed analytical procedures have been used previously on the program and are sufficiently sensitive to yield high-quality analyses. In a few cases, improvements in analytical methodology (e.g., greater sensitivity, fewer interferences) are desired; suggested improvements to these methodologies are discussed, and in most cases the method-development activities have already been initiated. The primary source of rock and fluid samples for geochemical analysis during Site Characterization will be the drilling program, as described in various SRP Site Study Plans. The Salt Repository Project (SRP) Networks specify the schedule under which the program will operate. Drilling will not begin until after site ground water baseline conditions have been established. The Technical Field Services Contractor (TFSC) is responsible for conducting the field program of drilling and testing. Samples and data will be handled and reported in accordance with established SRP procedures. A quality assurance program will be utilized to assure that activities affecting quality are performed correctly and that the appropriate documentation is maintained. 28 refs., 9 figs., 14 tabs

  19. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. The method is independent of cloud type, making it efficient for both low boundary-layer and high clouds, and the use of thermal imaging ensures extraction of cloud features during daytime as well as nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud-top measurements when deep convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary-layer clouds and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
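    One of the validation references above, the lifted condensation level, is often estimated with Espy's rule of thumb: cloud base rises roughly 125 m per degree Celsius of surface dewpoint depression. A sketch of that approximation (valid only for well-mixed boundary-layer clouds, which is exactly the limitation the abstract notes):

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Espy's approximation to the lifted condensation level: about
    125 m of cloud-base height per degree C of dewpoint depression."""
    return 125.0 * (temp_c - dewpoint_c)

# e.g. surface air at 25 C with a 15 C dewpoint puts cloud base near 1250 m
base = lcl_height_m(25.0, 15.0)
```

Saturated surface air (dewpoint equal to temperature) gives an LCL of zero, i.e. cloud at the ground, which is the expected limiting case.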

  20. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  1. Benefits of rotational ground motions for planetary seismology

    Science.gov (United States)

    Donner, S.; Joshi, R.; Hadziioannou, C.; Nunn, C.; van Driel, M.; Schmelzbach, C.; Wassermann, J. M.; Igel, H.

    2017-12-01

    Exploring the internal structure of planetary objects is fundamental to understand the evolution of our solar system. In contrast to Earth, planetary seismology is hampered by the limited number of stations available, often just a single one. Classic seismology is based on the measurement of three components of translational ground motion. Its methods are mainly developed for a larger number of available stations. Therefore, the application of classical seismological methods to other planets is very limited. Here, we show that the additional measurement of three components of rotational ground motion could substantially improve the situation. From sparse or single station networks measuring translational and rotational ground motions it is possible to obtain additional information on structure and source. This includes direct information on local subsurface seismic velocities, separation of seismic phases, propagation direction of seismic energy, crustal scattering properties, as well as moment tensor source parameters for regional sources. The potential of this methodology will be highlighted through synthetic forward and inverse modeling experiments.
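    One of the benefits listed above, direct information on local subsurface velocities, follows from the plane-wave relation between collocated transverse acceleration and vertical rotation rate: for an SH wave, a_t = -2c · rot_rate, so their amplitude ratio yields the apparent phase velocity c. A sketch with synthetic data; the wavelet, sampling rate, and velocity are illustrative assumptions, not values from the abstract:

```python
import numpy as np

fs = 100.0                           # sampling rate, Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
c_true = 3000.0                      # assumed apparent SH phase velocity, m/s

# Synthetic transverse acceleration: a 1 Hz Ricker wavelet centred at t = 5 s
arg = (np.pi * 1.0 * (t - 5.0)) ** 2
a_t = (1.0 - 2.0 * arg) * np.exp(-arg)

# Plane SH-wave relation: a_t = -2 * c * rot_rate
rot_rate = -a_t / (2.0 * c_true)     # collocated vertical rotation rate

def apparent_phase_velocity(accel, rotation_rate):
    """Least-squares fit of accel = -2c * rotation_rate for c."""
    return -np.dot(accel, rotation_rate) / (2.0 * np.dot(rotation_rate, rotation_rate))

c_est = apparent_phase_velocity(a_t, rot_rate)   # recovers c_true
```

Because the estimate needs only a single six-component station, it is exactly the kind of measurement that remains useful when a planetary mission can deploy just one instrument.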

  2. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined in the context of a case study involving evacuation within a commercial shopping mall. Pedestrian movement is based on the Cellular Automata and event-driven models. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model, we can reflect the behavioural characteristics of customers and clerks in both normal and emergency-evacuation situations. The distribution of individual evacuation time as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
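    The floor-field idea underlying such cellular-automaton models can be sketched minimally: each pedestrian moves, in sequence, to the free neighbouring cell with the lowest static field value (distance to the exit). The grid size, update rule, and agent positions below are illustrative only, not the authors' model, which also includes a dynamic field and the event-driven layers:

```python
import numpy as np

def static_field(shape, exit_cell):
    """Static floor field: Euclidean distance of every cell to the exit."""
    rows, cols = np.indices(shape)
    return np.hypot(rows - exit_cell[0], cols - exit_cell[1])

def step(positions, field, shape):
    """Sequential update: each agent moves to the unoccupied von Neumann
    neighbour (or stays put) with the lowest floor-field value."""
    occupied = set(positions)
    moved = []
    for r, c in positions:
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < shape[0] and 0 <= c + dc < shape[1]]
        options = [(r, c)] + [p for p in neighbours if p not in occupied]
        best = min(options, key=lambda p: field[p])
        occupied.discard((r, c))
        occupied.add(best)
        moved.append(best)
    return moved

shape, exit_cell = (10, 10), (0, 0)
field = static_field(shape, exit_cell)
crowd = [(9, 9), (9, 8), (5, 5)]
crowd = step(crowd, field, shape)    # every agent steps toward the exit
```

Iterating `step` until all agents reach the exit cell yields the per-agent evacuation times whose distribution the study analyses as a function of initial position.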

  3. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    Full Text Available In this article the author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process on any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. To date, national policy in the field of implementing the concept of community development does not take most theoretical work into account, which shows that the mechanism for its effective regulation has not yet been created in our country. The author therefore stresses the necessity of applying effective approaches to the government administration of community development in modern Ukrainian realities. As the subject of research, the author chose the analysis of the process of community development and the methodological bases for choosing options for managing this process; the system approach was chosen as the research methodology. The aim is the analysis of theoretical bases and the development of new approaches to the government administration of community development. The author divides the process of community development into social, economic and ecological components, and argues that the objective need to develop new conceptual approaches to the elaboration of tools for regulating community development must be taken into account. To address this task, the author suggests using the category of "dynamics", analyses different interpretations of the term, and offers his own interpretation in the context of community development. The research confirms that it is largely possible, methodologically, to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character. The author's research confirms that it is methodologically

  4. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ...technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... Agent-Based Modeling Methodology for Analyzing Weapons Systems (thesis), Casey D. Connors, Major, USA.

  5. Atmospheric effect on the ground-based measurements of broadband surface albedo

    Directory of Open Access Journals (Sweden)

    T. Manninen

    2012-11-01

    Full Text Available Ground-based pyranometer measurements of the clear-sky broadband surface albedo are affected by atmospheric conditions (mainly by aerosol particles, water vapour and ozone). A new semi-empirical method for estimating the magnitude of the effect of atmospheric conditions on surface albedo measurements in clear-sky conditions is presented. Global and reflected radiation and/or aerosol optical depth (AOD) at two wavelengths are needed to apply the method. Depending on the aerosol optical depth and solar zenith angle values, the effect can be as large as 20%. For the cases we tested using data from the Cabauw atmospheric test site in the Netherlands, the atmosphere typically caused up to 5% overestimation of surface albedo with respect to the corresponding black-sky surface albedo values.
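    The quantity being corrected is the simple two-pyranometer ratio. A sketch of the uncorrected measurement; the study's semi-empirical correction itself depends on AOD and solar zenith angle in a model-specific way that is not reproduced here:

```python
def broadband_albedo(reflected_wm2, global_wm2):
    """Uncorrected broadband surface albedo from a pyranometer pair:
    upwelling (reflected) over downwelling (global) shortwave irradiance."""
    return reflected_wm2 / global_wm2

# e.g. 150 W/m^2 reflected against 600 W/m^2 global gives an albedo of 0.25;
# per the study above, the corresponding black-sky value can differ by ~5-20 %.
albedo = broadband_albedo(150.0, 600.0)
```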

  6. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  7. Bipolar cloud-to-ground lightning flash observations

    Science.gov (United States)

    Saba, Marcelo M. F.; Schumann, Carina; Warner, Tom A.; Helsdon, John H.; Schulz, Wolfgang; Orville, Richard E.

    2013-10-01

    Bipolar lightning is usually defined as a lightning flash in which the current waveform exhibits a polarity reversal. There are very few reported cases in the literature of cloud-to-ground (CG) bipolar flashes using only one channel. Reports of this type of bipolar flash are not common because, in order to confirm that currents of both polarities follow the same channel to the ground, video records are necessarily needed. This study presents five clear observations of single-channel bipolar CG flashes. High-speed video and electric field measurements are used and analyzed. Based on the video images obtained, and on previous high-speed camera observations of positive CG flashes, we suggest that positive leader branches which do not participate in the initial return stroke of a positive cloud-to-ground flash later generate recoil leaders whose negative ends, upon reaching the branch point, traverse the return stroke channel path to the ground, resulting in a subsequent return stroke of opposite polarity.

  8. Preliminary Study Intended for the Application of the INPRO Methodology in the area of Infrastructure (Public Acceptance) for the Case of Mexico

    International Nuclear Information System (INIS)

    Heredia, R.; Puente Espel, F.

    2016-01-01

    possible expansion programme, public acceptance would play a key role and will be one of the most important aspects in the development of a sustainable nuclear project. This research considers different perspectives for attaining public acceptance in the case of Mexico, seen both as a newcomer country and as an experienced user, owing to the characteristics of the national programme. After a comprehensive study of the INPRO Methodology, an analysis of the information on nuclear energy available to the public was performed: 1) the institutions involved in nuclear energy were studied; 2) the available surveys reflect that the public is not aware of the benefits of nuclear energy; 3) the study also analyzed public opinion towards the existing NPP and the possible expansion programme. Results showed that the information currently given to the population does not fulfill the requirements of the INPRO Methodology. A comparative study of experienced countries is also included. This is a preliminary study to lay the groundwork for the implementation of a proper public acceptance strategy, based on the INPRO Methodology. (author).

  9. Improving Agricultural Water Resources Management Using Ground-based Infrared Thermometry

    Science.gov (United States)

    Taghvaeian, S.

    2014-12-01

    Irrigated agriculture is the largest user of freshwater resources in arid and semi-arid parts of the world. Meeting rapidly growing demands for food, feed, fiber, and fuel while minimizing environmental pollution under a changing climate requires significant improvements in agricultural water management and irrigation scheduling. Although recent advances in remote sensing techniques and hydrological modeling have provided valuable information on agricultural water resources and their management, real improvements will only occur if farmers, the decision makers on the ground, are provided with simple, affordable, and practical tools to schedule irrigation events. This presentation reviews efforts to develop methods based on ground-based infrared thermometry and thermography for the day-to-day management of irrigation systems. The results of research studies conducted in Colorado and Oklahoma show that ground-based remote sensing methods can be used effectively to quantify water stress and consequently trigger irrigation events. Crop water use estimates based on stress indices have also been shown to be in good agreement with estimates based on other methods (e.g., surface energy balance, root zone soil water balance, etc.). Major challenges to the adoption of this approach by agricultural producers include its reduced accuracy under cloudy and humid conditions and its inability to forecast the irrigation date, which is critical knowledge since many irrigators need to decide about irrigations a few days in advance.
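    A standard way to turn canopy temperature into the water-stress quantity described above is the Crop Water Stress Index (CWSI) of Idso and Jackson. A sketch; the baseline temperatures and the trigger threshold are illustrative, since in practice they come from wet/dry reference surfaces or empirical baselines for the specific crop and climate:

```python
def cwsi(canopy_t, wet_baseline_t, dry_baseline_t):
    """Crop Water Stress Index: 0 = well watered, 1 = fully stressed.
    Temperatures in deg C from infrared thermometry of the canopy."""
    return (canopy_t - wet_baseline_t) / (dry_baseline_t - wet_baseline_t)

# Trigger an irrigation event when the index crosses a management threshold
stress = cwsi(canopy_t=30.0, wet_baseline_t=26.0, dry_baseline_t=36.0)
irrigate = stress > 0.35   # assumed threshold for this illustration
```

The abstract's caveats apply directly here: under cloud cover or high humidity the baselines shift, degrading the index, and the index describes stress now rather than forecasting the irrigation date.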

  10. Prediction of strong ground motion based on scaling law of earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1991-01-01

    In order to predict strong ground motion more practically, it is important to study how to use a semi-empirical method when no appropriate observation records of actual small events are available as empirical Green's functions. We propose a prediction procedure that uses artificially simulated small ground motions as a substitute for the actual motions. First, we simulate the small-event motion by means of the stochastic simulation method proposed by Boore (1983), taking into account path effects such as attenuation and the broadening of the waveform envelope empirically in the region of interest. Finally, we attempt to predict the strong ground motion due to a future large earthquake (M 7, Δ = 13 km) using the same summation procedure as the empirical Green's function method. We find that the characteristics of the synthetic motion based on the simulated M 5 motion are in good agreement with those obtained by the empirical Green's function method. (author)
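    The stochastic method of Boore (1983) referenced above shapes windowed Gaussian noise to a target Fourier amplitude spectrum while keeping the noise phase. A heavily simplified sketch; the envelope shape, omega-squared target spectrum, corner frequency, and kappa below are illustrative placeholders, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur = 100.0, 20.0
n = int(fs * dur)
t = np.arange(n) / fs

# 1. Gaussian noise under a deterministic envelope (waveform-envelope broadening)
envelope = (t / 2.0) * np.exp(1.0 - t / 2.0)
noise = rng.standard_normal(n) * envelope

# 2. Keep the noise phase, impose an omega-squared amplitude spectrum
#    with a simple high-frequency (kappa) attenuation term
spec = np.fft.rfft(noise)
f = np.fft.rfftfreq(n, 1.0 / fs)
f0, kappa = 0.5, 0.04                                # Hz, s (assumed values)
target = f**2 / (1.0 + (f / f0) ** 2) * np.exp(-np.pi * kappa * f)
phase = spec / np.maximum(np.abs(spec), 1e-20)
accel = np.fft.irfft(phase * target, n)              # synthetic small-event motion
```

In the procedure described above, records like `accel` stand in for missing small-event observations and are then summed, with appropriate delays and scaling, to synthesize the target large event.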

  11. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  12. Space weather effects on ground based technology

    Science.gov (United States)

    Clark, T.

    Space weather can affect a variety of forms of ground-based technology, usually as a result of either the direct effects of the varying geomagnetic field or of the induced electric field that accompanies such variations. Technologies affected directly by geomagnetic variations include magnetic measurements made during geophysical surveys, and navigation relying on the geomagnetic field as a direction reference, a method that is particularly common in the surveying of well-bores in the oil industry. The most obvious technology affected by induced electric fields during magnetic storms is electric power transmission, where the example of the blackout in Quebec during the March 1989 magnetic storm is widely known. Additionally, space weather effects must be taken into account in the design of active cathodic protection systems on pipelines to protect them against corrosion. Long-distance telecommunication cables may also have to be designed to cope with space-weather-related effects. This paper reviews the effects of space weather in these different areas of ground-based technology and provides examples of how mitigation against hazards may be achieved. (The paper does not include the effects of space weather on radio communication or satellite navigation systems.)

  13. Measurement of ground motion in various sites

    International Nuclear Information System (INIS)

    Bialowons, W.; Amirikas, R.; Bertolini, A.; Kruecker, D.

    2007-04-01

    Ground vibrations may affect low-emittance beam transport in linear colliders, Free Electron Lasers (FELs) and synchrotron radiation facilities. This paper is an overview of a study program to measure ground vibrations at various sites, which can be used for site characterization in relation to accelerator design. Commercial broadband seismometers have been used to measure the ground vibrations, and the resultant database is available to the scientific community. The methodology employed uses the same equipment and data analysis tools throughout, for ease of comparison. This database of ground vibrations, taken at 19 sites around the world, is the first of its kind. (orig.)
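    Site comparisons of this kind typically reduce each seismometer record to a power spectral density. A numpy-only sketch of the Welch-style averaged periodogram such an analysis might use; the segment length, sampling rate, and synthetic signal are illustrative, not the study's actual processing parameters:

```python
import numpy as np

def averaged_psd(x, fs, seg_len):
    """Welch-style PSD: average periodograms of 50 %-overlapping,
    Hann-windowed segments. Returns (frequencies, PSD)."""
    win = np.hanning(seg_len)
    norm = fs * np.sum(win**2)
    starts = range(0, len(x) - seg_len + 1, seg_len // 2)
    periodograms = [np.abs(np.fft.rfft(x[i:i + seg_len] * win))**2 / norm
                    for i in starts]
    return np.fft.rfftfreq(seg_len, 1.0 / fs), np.mean(periodograms, axis=0)

# Synthetic "ground vibration": a 10 Hz cultural-noise line on white noise
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t) \
    + 0.1 * np.random.default_rng(1).standard_normal(t.size)
freqs, psd = averaged_psd(x, fs, seg_len=256)
```

Computing the same PSD for every site with identical windowing is what makes the resulting curves directly comparable, which is the stated point of using the same equipment and analysis tools everywhere.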

  14. Case-based Reasoning for Automotive Engine Performance Tune-up

    International Nuclear Information System (INIS)

    Vong, C. M.; Huang, H.; Wong, P. K.

    2010-01-01

    The automotive engine performance tune-up is greatly affected by the calibration of its electronic control unit (ECU). The ECU calibration is traditionally done by trial-and-error method. This traditional method consumes a large amount of time and money because of a large number of dynamometer tests. To resolve this problem, case based reasoning (CBR) is employed, so that an existing and effective ECU setup can be adapted to fit another similar class of engines. The adaptation procedure is done through a more sophisticated step called case-based adaptation (CBA)[1, 2]. CBA is an effective knowledge management tool, which can interactively learn the expert adaptation knowledge. The paper briefly reviews the methodologies of CBR and CBA. Then the application to ECU calibration is described via a case study. With CBR and CBA, the efficiency of calibrating an ECU can be enhanced. A prototype system has also been developed to verify the usefulness of CBR in ECU calibration.
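    The retrieval half of CBR can be sketched as nearest-neighbour matching over engine feature vectors. The engines, features, and calibration values below are entirely hypothetical illustrations, and a real system would follow retrieval with the CBA adaptation step described above:

```python
import numpy as np

# Hypothetical case base: engine descriptors -> known-good ECU calibration.
# Features: displacement (L), cylinders, redline (krpm) -- illustrative only.
case_base = {
    "engine_A": (np.array([1.6, 4.0, 7.0]), {"base_ignition_deg": 10.0}),
    "engine_B": (np.array([3.0, 6.0, 6.5]), {"base_ignition_deg": 8.0}),
    "engine_C": (np.array([2.0, 4.0, 8.5]), {"base_ignition_deg": 12.0}),
}

def retrieve(query, cases):
    """Return the most similar stored case (Euclidean distance on features)."""
    name = min(cases, key=lambda k: np.linalg.norm(cases[k][0] - query))
    return name, cases[name][1]

# A new 1.8 L four-cylinder engine retrieves engine_A's setup for adaptation
name, setup = retrieve(np.array([1.8, 4.0, 7.2]), case_base)
```

Retrieving a close match rather than calibrating from scratch is what saves the dynamometer time the abstract refers to; the retrieved setup is a starting point, not a final calibration.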

  15. The use of grounded theory in studies of nurses and midwives' coping processes: a systematic literature search.

    Science.gov (United States)

    Cheer, Karen; MacLaren, David; Tsey, Komla

    2015-01-01

    Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. To review common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified and assessed grounded theory characteristics from 16 studies. Study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.

  16. Design and optimization of a ground water monitoring system using GIS and multicriteria decision analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, D.; Gupta, A.D.; Ramnarong, V.

    1998-12-31

    A GIS-based methodology has been developed to design a ground water monitoring system and implemented for a selected area in Mae-Klong River Basin, Thailand. A multicriteria decision-making analysis has been performed to optimize the network system based on major criteria which govern the monitoring network design such as minimization of cost of construction, reduction of kriging standard deviations, etc. The methodology developed in this study is a new approach to designing monitoring networks which can be used for any site considering site-specific aspects. It makes it possible to choose the best monitoring network from various alternatives based on the prioritization of decision factors.
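    The multicriteria step can be sketched as a weighted-sum score over normalised criteria, here low construction cost and kriging standard-deviation reduction, both scaled so that higher is better. The candidate networks, scores, and weights are illustrative, not the paper's actual decision analysis:

```python
def weighted_scores(alternatives, weights):
    """Weighted-sum MCDA over criteria normalised to [0, 1], higher = better."""
    return {name: sum(w * v for w, v in zip(weights, scores))
            for name, scores in alternatives.items()}

# criteria per network: (low cost, kriging std. reduction), already normalised
networks = {
    "net_1": (0.9, 0.4),
    "net_2": (0.5, 0.8),
    "net_3": (0.7, 0.7),
}
weights = (0.4, 0.6)                  # assumed decision-maker priorities
ranked = weighted_scores(networks, weights)
best = max(ranked, key=ranked.get)
```

Changing the weight vector reorders the ranking, which is exactly how the prioritization of decision factors drives the choice among alternative networks.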

  17. The Temporal Sensitivity of Enforced Accelerated Work Pace: A grounded theory building approach

    Directory of Open Access Journals (Sweden)

    Graham John James Kenealy, BA (Hons, Ph.D. Candidate

    Full Text Available This research explores how a large national UK government organisation copes with radical structural change over time and provides an insight into the temporal effects of 'Enforced Accelerated Work Pace' on behaviour and receptivity within an organisational context. The stages of 'Acceptance', 'Reaction' and 'Withdrawal' capture the essence of the 'Coping Reflex Actions relating to Enforced Accelerated Work Pace', all sensitive to the effects of time. 'Temporal Sensitivity', the duration of the changes to work patterns, played a large part in the behavioural responses. The underlying logic of this research is grounded theory building, a general method that works well with qualitative data collection approaches and involves inducting insights from field-based case data (Glaser, 1998), a methodology discovered and developed by Glaser and Strauss (1967), negating all others.

  18. Human health risk assessment methodology for the UMTRA Ground Water Project

    International Nuclear Information System (INIS)

    1994-11-01

    This document presents the method used to evaluate human risks associated with ground water contamination at inactive uranium processing sites. The intent of these evaluations is to provide the public and remedial action decision-makers with information about the health risks that might be expected at each site in a manner that is easily understood. The method (1) develops probabilistic distributions for exposure variables where sufficient data exist, (2) simulates predicted exposure distributions using Monte Carlo techniques, and (3) develops toxicity ranges that reflect human data when available, animal data if human data are insufficient, regulatory levels, and uncertainties. Risk interpretation is based on comparison of the potential exposure distributions with the derived toxicity ranges. Graphic presentations are an essential element of the semiquantitative interpretation and are expected to increase understanding by the public and decision-makers
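    Step (2) of the method above, simulating predicted exposure distributions with Monte Carlo techniques, can be sketched as below. The ingestion-dose equation is the standard chronic-daily-intake form, but every distribution parameter is an illustrative placeholder, not a value from the UMTRA assessments:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed probabilistic exposure variables (illustrative parameters only)
conc = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)     # mg/L in ground water
intake = rng.lognormal(mean=np.log(1.4), sigma=0.3, size=n)    # L/day drinking water
body_wt = np.clip(rng.normal(70.0, 10.0, size=n), 40.0, None)  # kg body weight

# Chronic daily intake, mg/(kg*day)
dose = conc * intake / body_wt

# Summary statistics of the simulated distribution, as would be
# compared graphically against the derived toxicity ranges
median_dose = np.median(dose)
p95_dose = np.percentile(dose, 95)
```

Presenting the whole simulated distribution against the toxicity range, rather than a single point estimate, is what supports the semiquantitative, graphics-based interpretation the abstract describes.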

  19. Unified modeling language and design of a case-based retrieval system in medical imaging.

    Science.gov (United States)

    LeBozec, C; Jaulent, M C; Zapletal, E; Degoulet, P

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspects of this approach were selecting the relevant objects of the system according to user requirements and enabling visualization of the cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism for improving communication between developers and users.

  20. Contextual assessment of organisational culture - methodological development in two case studies

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.

    2002-01-01

    Despite the acknowledged significance of organisational culture in the nuclear field, previous cultural studies have concentrated on purely safety related matters, or been only descriptive in nature. New kinds of methods, taking into account the overall objectives of the organisation, were needed to assess culture and develop its working practices appropriately. VTT developed the Contextual Assessment of Organisational Culture (CAOC) methodology during the FINNUS programme. The methodology utilises two concepts, organisational culture and core task. The core task can be defined as the core demands and content of work that the organisation has to accomplish in order to be effective. The core task concept is used in assessing the central dimensions of the organisation's culture. Organisational culture is defined as a solution the company has generated in order to fulfil the perceived demands of its core task. The CAOC-methodology was applied in two case studies, in the Radiation and Nuclear Safety Authority of Finland and in the maintenance unit of Loviisa NPP. The aim of the studies was not only to assess the given culture, but also to give the personnel new concepts and new tools for reflecting on their organisation, their jobs and on appropriate working practices. The CAOC-methodology contributes to the design and redesign of work in complex sociotechnical systems. It strives to enhance organisations' capability to assess their current working practices and the meanings attached to them and compare these to the actual demands of their basic mission and so change unadaptive practices. (orig.)

  1. Synthetic strong ground motions for engineering design utilizing empirical Green's functions

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.J.; Jarpe, S.P.; Kasameyer, P.W.; Foxall, W.

    1996-04-11

    We present a methodology for developing realistic synthetic strong ground motions for specific sites from specific earthquakes. We analyzed the possible ground motion resulting from a M = 7.25 earthquake that ruptures 82 km of the Hayward fault for a site 1.4 km from the fault in the eastern San Francisco Bay area. We developed a suite of 100 rupture scenarios for the Hayward fault earthquake and computed the corresponding strong ground motion time histories. We synthesized strong ground motion with physics-based solutions of earthquake rupture and applied physical bounds on rupture parameters. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude and identifying the hazard to the site from the statistical distribution of engineering parameters, we introduce a probabilistic component into the deterministic hazard calculation. Engineering parameters of synthesized ground motions agree with those recorded from the 1995 Kobe, Japan and the 1992 Landers, California earthquakes at similar distances and site geologies.
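The probabilistic component described above, a suite of rupture scenarios reduced to a statistical distribution of an engineering parameter, can be sketched as follows. The scenario generator here is a stand-in stub with an invented lognormal scatter, not the empirical Green's function synthesis used in the study; only the suite-to-statistics reduction is illustrated.

```python
import random
import statistics

def scenario_pga(rng):
    """Stand-in for one physics-based rupture simulation: returns a
    hypothetical peak ground acceleration (in g) for one rupture scenario."""
    return 0.6 * rng.lognormvariate(0.0, 0.4)

def hazard_from_suite(n_scenarios=100, seed=7):
    """Reduce a suite of scenario ground motions to hazard statistics."""
    rng = random.Random(seed)
    pgas = sorted(scenario_pga(rng) for _ in range(n_scenarios))
    median = statistics.median(pgas)
    p84 = pgas[int(0.84 * n_scenarios) - 1]  # 84th percentile, a common design level
    return median, p84

median_pga, p84_pga = hazard_from_suite()
```

The site hazard is then stated as a distribution (e.g. median and 84th-percentile motion) rather than a single deterministic value.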

  2. Problems and Issues in Using Computer-Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper highlights some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention is paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper explores some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. The paper puts forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, while the benefits provided by computer-based technology are utilised in supporting and enhancing the more mundane and structured tasks.

  3. Mycological evaluation of a ground cocoa-based beverage ...

    African Journals Online (AJOL)

    Cocoa beans (Theobroma cacao) are processed into cocoa beverage through fermentation, drying, roasting and grinding of the seed to powder. The mycological quality of 39 samples of different brands of these cocoa-based beverages, referred to as 'eruku oshodi', collected from 3 different markets in south-west Nigeria ...

  4. Examining the Nexus between Grounded Theory and Symbolic Interactionism

    OpenAIRE

    P. Jane Milliken RN, PhD; Rita Schreiber RN, DNS

    2012-01-01

    Grounded theory is inherently symbolic interactionist; however, not all grounded theory researchers appreciate its importance or benefit from its influence. Elsewhere, we have written about the intrinsic relationship between grounded theory and symbolic interactionism, highlighting the silent, fundamental contribution of symbolic interactionism to the methodology. At the same time, there are significant insights to be had by bringing a conscious awareness of the philosophy of symbolic interac...

  5. A pattern-based methodology for optimizing stitches in double-patterning technology

    Science.gov (United States)

    Wang, Lynn T.; Madhavan, Sriram; Dai, Vito; Capodieci, Luigi

    2015-03-01

    A pattern-based methodology for optimizing stitches is developed based on identifying stitch topologies and replacing them with pre-characterized fixing solutions in decomposed layouts. A topology-based library of stitches with predetermined fixing solutions is built. A pattern-based engine searches for matching topologies in the decomposed layouts. When a match is found, the engine opportunistically replaces the predetermined fixing solution: only a design rule check error-free replacement is preserved. The methodology is demonstrated on a 20nm layout design that contains over 67 million, first metal layer stitches. Results show that a small library containing 3 stitch topologies improves the stitch area regularity by 4x.
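The matching-and-replacement flow described above can be sketched as follows. The topology names, fixing solutions, and the placeholder design-rule check are all invented for illustration; the point is the gated replacement: a predetermined fix is kept only when it is DRC error-free, otherwise the original stitch is preserved.

```python
# Hypothetical topology-based library of stitches with predetermined fixes.
STITCH_LIBRARY = {
    "L-bend": "L-bend-fix",
    "T-junction": "T-junction-fix",
    "straight-overlap": "straight-overlap-fix",
}

def drc_clean(candidate):
    """Placeholder DRC: a real flow would run a design rule check deck here."""
    return candidate.endswith("-fix")

def optimize_stitches(stitches):
    """Replace matched stitch topologies with fixes, keeping only DRC-clean ones."""
    fixed = []
    for topo in stitches:
        fix = STITCH_LIBRARY.get(topo)
        if fix is not None and drc_clean(fix):
            fixed.append(fix)    # replacement preserved: DRC error-free
        else:
            fixed.append(topo)   # no library match (or DRC violation): keep original
    return fixed

print(optimize_stitches(["L-bend", "unknown-topology"]))
# → ['L-bend-fix', 'unknown-topology']
```

A small library can still cover a large layout because the same few topologies recur millions of times, which is the regularity result the abstract reports.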

  6. An Efficient Power Estimation Methodology for Complex RISC Processor-based Platforms

    OpenAIRE

    Rethinagiri , Santhosh Kumar; Ben Atitallah , Rabie; Dekeyser , Jean-Luc; Niar , Smail; Senn , Eric

    2012-01-01

    International audience; In this contribution, we propose an efficient power estimation methodology for complex RISC processor-based platforms. In this methodology, the Functional Level Power Analysis (FLPA) is used to set up generic power models for the different parts of the system. Then, a simulation framework based on a virtual platform is developed to evaluate accurately the activities used in the related power models. The combination of the two parts above leads to a heterogeneou...

  7. Case Studies Approach in Tourism Destination Branding Research

    OpenAIRE

    Adeyinka-Ojo S.F.; Nair V.; Khoo-Lattimore C.

    2014-01-01

    A review of literature indicates that there are different types of qualitative research methods such as action research, content analysis, ethnography, grounded theory, historical analysis, phenomenology and case study. However, which approach is to be used depends on several factors such as the nature and objectives of the research. The aim of this paper is to focus on the research methodology aspects of applying case study as a research approach and its relevance in tourism destination bran...

  8. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz , J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  9. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU's test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...
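As a minimal illustration of such a calibration relation, the lidar 10-minute mean wind speeds can be regressed against the reference means; the wind speed values below are invented for illustration, not taken from the report, and a real calibration would also propagate the measurement uncertainties.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

reference = [4.0, 6.0, 8.0, 10.0, 12.0]   # m/s, 10-min means from the reference mast
lidar =     [4.1, 6.1, 8.0, 10.2, 12.1]   # m/s, corresponding lidar indications
gain, offset = linear_fit(reference, lidar)
print(round(gain, 3), round(offset, 3))
# → 1.005 0.06
```

A gain near 1 and an offset near 0 indicate that the lidar tracks the reference closely over the calibrated speed range.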

  10. Implementing case-based teaching strategies in a decentralised nursing management programme in South Africa

    Directory of Open Access Journals (Sweden)

    Zethu Nkosi

    2013-11-01

    Full Text Available Background: Case-based education has a long history in the disciplines of education, business, law and the health professions. Research suggests that students who learn via a case-based method have advanced critical thinking skills and a greater ability to apply knowledge in practice. In medical education, case-based methodology is widely used to facilitate knowledge transfer from theoretical knowledge to application in patient care. Nursing education has also adopted case-based methodology to enhance learner outcomes and critical thinking. Objectives: The objectives of the study were to describe a decentralised nursing management education programme located in Durban, South Africa, and to describe the perceptions of nursing faculty facilitators regarding implementation of this teaching method. Method: Data were collected through one-on-one interviews and focus groups amongst the fifteen facilitators who were using a case-based curriculum to teach the programme content. The average facilitator was female, between 41 and 50 years of age, working part-time, educated with a baccalaureate degree, and had worked as a professional nurse for between 11 and 20 years; slightly more than half had worked as a facilitator for three or more years. Results: The facilitators identified themes related to the student learners, the learning environment, and the strengths and challenges of using facilitation to teach the content through cases. Decentralised nursing management education programmes can meet the needs of nurses who are located in remote areas characterised by poor transportation patterns and limited resources and who have great need for quality healthcare services. Conclusion: Nursing faculty facilitators need knowledgeable and accessible contact with centrally based full-time nursing faculty in order to promote high-quality educational programmes.

  12. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, comprising over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. There is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: the heuristic knowledge-based qualitative WM decision analysis and support methodology, and the fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. These form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and
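The knowledge-based part of such an advisor can be sketched as strategies fired against plant operating data. The sketch below uses crisp rather than fuzzy rules for brevity, and every threshold, variable name, and advice string is invented; it only illustrates the rule-base-plus-inference structure, not the actual WMEP-Advisor knowledge base.

```python
# Hypothetical waste-minimization (WM) strategies encoded as (condition, advice) rules.
WM_RULES = [
    (lambda p: p["dragout_ml_per_rack"] > 20.0,
     "Increase drainage time over the plating tank to cut drag-out."),
    (lambda p: p["rinse_flow_l_per_min"] > 10.0,
     "Install counter-current rinsing to reduce wastewater volume."),
    (lambda p: p["bath_temperature_c"] > 60.0,
     "Cover idle tanks to limit evaporation losses."),
]

def advise(plant):
    """Return the advice strings whose conditions hold for this plant's data."""
    return [advice for condition, advice in WM_RULES if condition(plant)]

plant = {"dragout_ml_per_rack": 25.0, "rinse_flow_l_per_min": 8.0, "bath_temperature_c": 65.0}
for recommendation in advise(plant):
    print("-", recommendation)
```

In the dissertation's full methodology, the quantitative process models would then estimate how much waste each fired strategy actually saves.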

  13. The automated ground network system

    Science.gov (United States)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  14. Costs and profitability of renewable energies in metropolitan France - ground-based wind energy, biomass, solar photovoltaic. Analysis

    International Nuclear Information System (INIS)

    2014-04-01

    After a general presentation of the framework of support to renewable energies and co-generation (purchasing obligation, tendering, support funding), of the missions of the CRE (Commission for Energy Regulation) within the frame of the purchasing obligation, and of the methodology adopted for this analysis, this document reports an analysis of production costs for three different renewable energy sectors: ground-based wind energy, biomass energy, and solar photovoltaic energy. For each of them, the report recalls the context (conditions of purchasing obligation, winning bid installations, installed fleet in France at the end of 2012), indicates the installations taken into consideration in this study, analyses the installation costs and funding (investment costs, exploitation and maintenance costs, project funding, production costs), and assesses the profitability in terms of capital and for stakeholders

  15. Rigour and grounded theory.

    Science.gov (United States)

    Cooney, Adeline

    2011-01-01

    This paper explores ways to enhance and demonstrate rigour in a grounded theory study. Grounded theory is sometimes criticised for a lack of rigour. Beck (1993) identified credibility, auditability and fittingness as the main standards of rigour for qualitative research methods. These criteria were evaluated for applicability to a Straussian grounded theory study and expanded or refocused where necessary. The author uses a Straussian grounded theory study (Cooney, in press) to examine how the revised criteria can be applied when conducting a grounded theory study. Strauss and Corbin's (1998b) criteria for judging the adequacy of a grounded theory were examined in the context of the wider literature on rigour in qualitative research studies in general and grounded theory studies in particular. A literature search for 'rigour' and 'grounded theory' was carried out to support this analysis. Criteria are suggested for enhancing and demonstrating the rigour of a Straussian grounded theory study. These include: cross-checking emerging concepts against participants' meanings, asking experts if the theory 'fits' their experiences, and recording detailed memos outlining all analytical and sampling decisions. IMPLICATIONS FOR RESEARCH PRACTICE: The criteria identified have been expressed as questions to enable novice researchers to audit the extent to which they are demonstrating rigour when writing up their studies. However, it should not be forgotten that rigour is built into the grounded theory method through the inductive-deductive cycle of theory generation. Care in applying the grounded theory methodology correctly is the single most important factor in ensuring rigour.

  16. Ground rubber: Sorption media for ground water containing benzene and O-xylene

    International Nuclear Information System (INIS)

    Kershaw, D.S.; Pamukcu, S.

    1997-01-01

    The purpose of the current study is to examine the ability of ground rubber to sorb benzene and O-xylene from water contaminated with aromatic hydrocarbons. The study consisted of running both batch and packed bed column tests to determine the sorption capacity, the required sorption equilibration time, and the flow-through utilization efficiency of ground rubber under various contact times when exposed to water contaminated with various amounts of benzene or O-xylene. Initial batch test results indicate that ground rubber can attain equilibrium sorption capacities of up to 1.3 or 8.2 mg of benzene or O-xylene, respectively, per gram of tire rubber at solution equilibrium concentrations of 10 mg/L. Packed bed column tests indicate that ground tire rubber has, on average, a 40% utilization rate when a hydraulic residence time of 15 min is used. Possible future uses of ground rubber as a sorption medium could include, but are not limited to, its use as an aggregate in slurry cutoff walls that are in contact with petroleum products. Ground rubber could also be used as a sorption medium in pump-and-treat methodologies or in in-situ reactive permeable barriers.
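Batch-test capacities of the kind quoted above follow from the standard batch sorption mass balance, q_e = (C0 − Ce)·V / m. The numbers in the sketch below are illustrative, chosen only so the result lands near the reported 1.3 mg/g benzene capacity; they are not the study's actual test conditions.

```python
def sorption_capacity(c0_mg_l, ce_mg_l, volume_l, mass_g):
    """Equilibrium sorption capacity q_e (mg sorbed per g of sorbent) from a
    batch test: initial conc. C0, equilibrium conc. Ce, solution volume V, mass m."""
    return (c0_mg_l - ce_mg_l) * volume_l / mass_g

# Illustrative batch: 1 L of solution dropping from 20 to 10 mg/L over 7.7 g of rubber.
q = sorption_capacity(20.0, 10.0, 1.0, 7.7)
print(round(q, 2))  # → 1.3 (mg benzene per g of tire rubber)
```

Repeating the calculation across several equilibrium concentrations yields the sorption isotherm used to compare sorbents.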

  17. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities to enhance the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept of an autonomous operation system in which the role of operators is replaced with artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered in order to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: (1) reliability of input data, (2) diversification of knowledge models, algorithms and reasoning schemes, and (3) mutual complement and robustness. The diagnostic modules, utilizing the different approaches defined by the diversification strategy, were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, we confirmed that every abnormality could be precisely diagnosed by combining the modules. In other words, the legitimacy of the approach selected by the diversification strategy was shown, and methodology diversification attained clear efficiency for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even when the encountered scale of abnormality differs from the reference cases embedded in the knowledge base. (author)
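The diversification idea, several diagnostic modules that individually cover only part of the event space but succeed in combination, can be sketched as a simple vote. The module logic, symptom names, and event labels below are all hypothetical; the report's actual modules are expert-system and model-based diagnostics running against simulator data.

```python
def module_a(symptoms):
    """Stand-in for, e.g., a rule-based (expert system) diagnostic module."""
    return "pump-failure" if symptoms.get("flow_low") else None

def module_b(symptoms):
    """Stand-in for, e.g., a model-based diagnostic module with different coverage."""
    if symptoms.get("flow_low") and symptoms.get("pressure_low"):
        return "pump-failure"
    if symptoms.get("temperature_high"):
        return "cooling-loss"
    return None

def diagnose(symptoms, modules=(module_a, module_b)):
    """Combine module outputs by majority vote; abstaining modules return None."""
    votes = {}
    for module in modules:
        result = module(symptoms)
        if result is not None:
            votes[result] = votes.get(result, 0) + 1
    return max(votes, key=votes.get) if votes else "unknown"

print(diagnose({"flow_low": True, "pressure_low": True}))   # → pump-failure
print(diagnose({"temperature_high": True}))                 # → cooling-loss
```

No single module covers both events here, but the combination does, which mirrors the report's finding that only the diversified system met the accuracy criteria across all anomalies.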

  18. Comparison of case-based and lecture-based learning in dental education using the SOLO taxonomy.

    Science.gov (United States)

    Ilgüy, Mehmet; Ilgüy, Dilhan; Fişekçioğlu, Erdoğan; Oktay, Inci

    2014-11-01

    The aim of this study was to compare the impact of case-based learning (CBL) and lecture-based learning (LBL) on fourth-year dental students' clinical decision making by using the Structure of Observed Learning Outcome (SOLO) taxonomy. Participants in the study were fourth-year dental students (n=55) in academic year 2012-13 taught in a large-group LBL context and fourth-year dental students (n=54) in academic year 2013-14 taught with the CBL methodology; both took place in the oral diseases course at Yeditepe University Faculty of Dentistry, Istanbul, Turkey. All eligible students participated, for a 100 percent response rate. A real case was presented to the students in both groups to assess their clinical decision making on the topic of oral diseases. Their performance was evaluated with the SOLO taxonomy. Student's t-test was used for statistical evaluation, and significance was set at the p<0.05 level. Students taught with CBL reached higher levels of the SOLO taxonomy than students taught with LBL. These findings suggest that an integrated case-based curriculum may be effective in promoting students' deep learning and it holds promise for better integration of clinical cases likely to be encountered during independent practice.
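A group comparison of the kind reported above can be sketched with an independent two-sample t statistic (Welch form, which does not assume equal variances). The SOLO-level scores below are invented for illustration; they are not the study's data.

```python
import math
import statistics

def welch_t(sample1, sample2):
    """Welch's t statistic for two independent samples."""
    n1, n2 = len(sample1), len(sample2)
    v1, v2 = statistics.variance(sample1), statistics.variance(sample2)
    se = math.sqrt(v1 / n1 + v2 / n2)
    return (statistics.mean(sample1) - statistics.mean(sample2)) / se

cbl_scores = [4, 4, 5, 3, 4, 5, 4, 3]   # hypothetical SOLO levels, CBL group
lbl_scores = [3, 2, 3, 4, 3, 2, 3, 3]   # hypothetical SOLO levels, LBL group
t = welch_t(cbl_scores, lbl_scores)
```

A positive t here indicates the CBL group's mean SOLO level exceeds the LBL group's; in practice the statistic would be compared against the t distribution at the chosen significance level.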

  19. Graphene ground states

    Science.gov (United States)

    Friedrich, Manuel; Stefanelli, Ulisse

    2018-06-01

    Graphene is locally two-dimensional but not flat. Nanoscale ripples appear in suspended samples and rolling up often occurs when boundaries are not fixed. We address this variety of graphene geometries by classifying all ground-state deformations of the hexagonal lattice with respect to configurational energies including two- and three-body terms. As a consequence, we prove that all ground-state deformations are either periodic in one direction, as in the case of ripples, or rolled up, as in the case of nanotubes.
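Configurational energies of the kind described above combine two-body (bond length) and three-body (bond angle) contributions. A common schematic form, not necessarily the exact functional used by the authors, is:

```latex
E(y) \;=\; \frac{1}{2}\sum_{\substack{(i,j)\\ \text{bonded}}} v_2\bigl(|y_i - y_j|\bigr)
\;+\; \frac{1}{2}\sum_{\substack{(i,j,k)\\ \text{adjacent}}} v_3\bigl(\theta_{ijk}\bigr),
```

where $y_i$ is the deformed position of atom $i$, $v_2$ penalizes deviation of bond lengths from their equilibrium value, and $v_3$ penalizes deviation of the bond angles $\theta_{ijk}$ from $2\pi/3$, the angle of the flat hexagonal lattice. The classification result says that minimizers of such energies are either one-directionally periodic (rippled) or rolled-up configurations.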

  20. Integrated ground-water monitoring strategy for NRC-licensed facilities and sites: Case study applications

    Science.gov (United States)

    Price, V.; Temples, T.; Hodges, R.; Dai, Z.; Watkins, D.; Imrich, J.

    2007-01-01

    This document discusses results of applying the Integrated Ground-Water Monitoring Strategy (the Strategy) to actual waste sites using existing field characterization and monitoring data. The Strategy is a systematic approach to dealing with complex sites. Application of such a systematic approach will reduce uncertainty associated with site analysis, and therefore uncertainty associated with management decisions about a site. The Strategy can be used to guide the development of a ground-water monitoring program or to review an existing one. The sites selected for study fall within a wide range of geologic and climatic settings, waste compositions, and site design characteristics and represent realistic cases that might be encountered by the NRC. No one case study illustrates a comprehensive application of the Strategy using all available site data. Rather, within each case study we focus on certain aspects of the Strategy, to illustrate concepts that can be applied generically to all sites. The test sites selected include: the Charleston, South Carolina, Naval Weapons Station; Brookhaven National Laboratory on Long Island, New York; the USGS Amargosa Desert Research Site in Nevada; Rocky Flats in Colorado; C-Area at the Savannah River Site in South Carolina; and the Hanford 300 Area. A Data Analysis section provides examples of detailed data analysis of monitoring data.

  1. Typology of local government projects and its use for the methodological support of project justification

    Directory of Open Access Journals (Sweden)

    Y. P. Sharov

    2016-07-01

    Full Text Available Given the diversity of local government projects, traditional project-management technologies from business can be applied to them only with some constraints. Existing project classifications are imperfect, inadequate for the modern conditions of local government, and have a weak applied orientation. The article describes an approach to systematising the diversity of local government projects, realised as a typology of local government projects. The use of this typology for forming recommendations on the appropriate depth of a project's justification is shown for different subsets of projects, the main ones being investment projects and social-management projects. It is shown that the technical and economic justification of investment projects can rely on the traditional technologies of project management, whereas social-management projects can be justified on the basis of a project concept. The need to strengthen the complexity of the local government project concept is identified. A logical structure and content for such a complex concept are proposed, and approaches to deeper elaboration of some of the concept's positions are recommended. An approach to the systematic determination of project results and success indicators is also proposed. It is based on problem analysis and on describing the initial and final states of the project environment (the changes in the environment resulting from implementation of the project). The key words of the problem's symptoms are sought in the space of potential target effects of local government projects (financial, economic, social, managerial, political, ecological); these key words are then reworked and edited so that they become the essence of the success indicators. When forming the concept of a local government project, it is recommended to analyse opportunities to make the project partially commercial, for example by letting office rooms, advertising space or places for trade

  2. Methodology for systematic analysis and improvement of manufacturing unit process life cycle inventory (UPLCI) CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 2: case studies

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report presents two case studies, one for the screening approach and one for the in-depth approach, demonstrating the application of the life cycle assessment-oriented methodology for systematic inventory analysis of the machine tool use phase of manufacturing unit processes. The screening approach relies on representative industrial data and engineering calculations for energy use and material loss, and is illustrated by means of a case study of a drilling process. The in-depth approach leads to more accurate LCI data as well as the identification of potential for environmental improvement based on the in-depth analysis of individual manufacturing unit processes. Two case studies illustrate the applicability of the methodology.

  3. Methods for the performance enhancement and the error characterization of large diameter ground-based diffractive telescopes.

    Science.gov (United States)

    Zhang, Haolin; Liu, Hua; Lizana, Angel; Xu, Wenbin; Campos, Juan; Lu, Zhenwu

    2017-10-30

    This paper is devoted to the improvement of ground-based telescopes based on diffractive primary lenses, which provide larger apertures and more relaxed surface tolerances than non-diffractive telescopes. We performed two different studies devised to thoroughly characterize and improve the performance of ground-based diffractive telescopes. On the one hand, we experimentally validated the suitability of the stitching error theory, useful to characterize the error performance of subaperture diffractive telescopes. On the other hand, we proposed a novel ground-based telescope incorporated in a Cassegrain architecture, leading to a telescope with enhanced performance. To test the stitching error theory, a 300 mm diameter, 2000 mm focal length transmissive stitching diffractive telescope, based on a three-belt subaperture primary lens, was designed and implemented. The telescope achieves a 78 cy/mm resolution within a 0.15 degree field of view, with the working wavelength ranging from 582.8 nm to 682.8 nm, without any stitching error. However, the long optical track (35.49 m) introduces air turbulence that reduces the final image contrast in the ground-based test. To improve on this result, a compacted Cassegrain ground-based diffractive (CGD) telescope of the same diameter, with a total track distance of 1.267 m, was implemented for the same wavelength range. The ground-based CGD telescope provides higher resolution and better contrast than the transmissive configuration. Star and resolution tests were experimentally performed to compare the CGD and the transmissive configurations, demonstrating the suitability of the proposed ground-based CGD telescope.

  4. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches

    DEFF Research Database (Denmark)

    Bøttcher Berthelsen, Connie; Damsgaard, Tove Lindhardt; Frederiksen, Kirsten

    2017-01-01

    This paper presents a discussion of the differences in using participant observation as a data collection method, comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, participant observation as a data collection method can be carried out in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows how the differences between ...

  5. Sequential Ground Motion Effects on the Behavior of a Base-Isolated RCC Building

    Directory of Open Access Journals (Sweden)

    Zhi Zheng

    2017-01-01

    Full Text Available The sequential ground motion effects on the dynamic responses of reinforced concrete containment (RCC) buildings with typical isolators are studied in this paper. Although the base isolation technique was developed to guarantee the security and integrity of RCC buildings under single earthquakes, knowledge of the seismic behavior of base-isolated RCC buildings under sequential ground motions is deficient. Hence, an ensemble of as-recorded sequential ground motions is employed to study the effect of including aftershocks in the seismic evaluation of base-isolated RCC buildings. The results indicate that base isolation can significantly attenuate the earthquake shaking of the RCC building under not only single earthquakes but also seismic sequences. It is also found that the adverse aftershock effect on the RCC can be reduced by the base isolation applied to the RCC. More importantly, the study indicates that disregarding aftershocks can lead to significant underestimation of the isolator displacement for base-isolated RCC buildings.
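The aftershock effect described above can be illustrated numerically. The sketch below is not the paper's model: it uses an idealized linear single-degree-of-freedom oscillator with hypothetical isolation period and damping, and synthetic pulse-like records, purely to show why a mainshock-aftershock sequence can only raise the peak displacement demand.

```python
import numpy as np

def sdof_peak_disp(accel, dt, period=2.5, damping=0.15):
    """Peak relative displacement of a linear SDOF oscillator (an
    idealized base-isolated structure) under base acceleration `accel`
    (m/s^2), integrated with the semi-implicit Euler scheme."""
    w = 2.0 * np.pi / period        # natural circular frequency (rad/s)
    c = 2.0 * damping * w           # damping coefficient per unit mass
    u, v, peak = 0.0, 0.0, 0.0
    for ag in accel:
        a = -ag - c * v - w**2 * u  # equation of motion: u'' + c u' + w^2 u = -ag
        v += a * dt
        u += v * dt
        peak = max(peak, abs(u))
    return peak

# Synthetic, pulse-like mainshock followed by a weaker aftershock.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
main = 0.3 * 9.81 * np.sin(2 * np.pi * t) * np.exp(-0.3 * t)
quiet = np.zeros(int(20.0 / dt))             # quiescent gap between events
seq = np.concatenate([main, quiet, 0.6 * main])

d_main = sdof_peak_disp(main, dt)
d_seq = sdof_peak_disp(seq, dt)              # sequence peak >= mainshock peak
```

Since the sequence reproduces the mainshock exactly before the aftershock arrives, its peak demand can only equal or exceed the single-event value, mirroring the underestimation noted in the abstract.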

  6. "Slow-scanning" in Ground-based Mid-infrared Observations

    Science.gov (United States)

    Ohsawa, Ryou; Sako, Shigeyuki; Miyata, Takashi; Kamizuka, Takafumi; Okada, Kazushi; Mori, Kiyoshi; Uchiyama, Masahito S.; Yamaguchi, Junpei; Fujiyoshi, Takuya; Morii, Mikio; Ikeda, Shiro

    2018-04-01

    Chopping observations with a tip-tilt secondary mirror have conventionally been used in ground-based mid-infrared observations. However, it is not practical for next generation large telescopes to have a large tip-tilt mirror that moves at a frequency larger than a few hertz. We propose an alternative observing method, a "slow-scanning" observation. Images are continuously captured as movie data, while the field of view is slowly moved. The signal from an astronomical object is extracted from the movie data by a low-rank and sparse matrix decomposition. The performance of the "slow-scanning" observation was tested in an experimental observation with Subaru/COMICS. The quality of a resultant image in the "slow-scanning" observation was as good as in a conventional chopping observation with COMICS, at least for a bright point-source object. The observational efficiency in the "slow-scanning" observation was better than that in the chopping observation. The results suggest that the "slow-scanning" observation can be a competitive method for the Subaru telescope and be of potential interest to other ground-based facilities to avoid chopping.
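The low-rank and sparse matrix decomposition mentioned above can be sketched with a crude alternating scheme. This is a simplified stand-in for robust-PCA-style methods, not the algorithm used with Subaru/COMICS; the thresholding rule, rank and parameters are illustrative only.

```python
import numpy as np

def lowrank_sparse_split(D, rank=1, lam=0.5, n_iter=50):
    """Alternate between a truncated SVD (slowly varying background,
    low-rank part L) and soft-thresholding of the residual (compact
    moving source, sparse part S), so that D ~ L + S."""
    S = np.zeros_like(D)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]           # low-rank background
        R = D - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)  # sparse residual
    return L, S
```

With movie frames stacked as matrix rows, a point source drifting through the slowly moved field of view shows up in S, while the sky and detector background stay in L.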

  7. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    Full Text Available A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  8. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.
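The finite-volume approach underlying tools like GEO2D/GEO3D can be shown in miniature. The sketch below is a 1-D explicit finite-volume conduction solve with invented soil properties, not the actual GEO2D/GEO3D formulation: it marches the ground temperature profile in time between a borehole wall held at the working-fluid temperature and undisturbed far-field ground.

```python
import numpy as np

def ground_temperature_profile(t_end, nx=50, length=10.0, alpha=1e-6,
                               t_ground=12.0, t_wall=4.0):
    """March a 1-D explicit finite-volume conduction solve: `nx` cells of
    soil (diffusivity `alpha`, m^2/s) between a borehole wall held at
    `t_wall` (degC) and undisturbed ground at `t_ground` (degC)."""
    dx = length / nx
    dt = 0.4 * dx**2 / alpha            # explicit stability limit (r <= 0.5)
    T = np.full(nx, t_ground)
    T[0] = t_wall                       # borehole-wall cell, fixed
    for _ in range(int(t_end / dt)):
        flux = np.diff(T) / dx          # temperature gradient at cell faces
        T[1:-1] += alpha * dt * (flux[1:] - flux[:-1]) / dx
        T[0] = t_wall                   # re-impose boundary cells
        T[-1] = t_ground
    return T
```

After a month of simulated heat extraction the cooling front has moved roughly sqrt(alpha * t), about 1.6 m, into the soil, while the far field is still undisturbed.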

  9. A Ground-based validation of GOSAT-observed atmospheric CO2 in Inner-Mongolian grasslands

    International Nuclear Information System (INIS)

    Qin, X; Lei, L; Zeng, Z; Kawasaki, M; Oohasi, M

    2014-01-01

    Atmospheric carbon dioxide (CO2) is a long-lived greenhouse gas that significantly contributes to global warming. Long-term, continuous measurements of atmospheric CO2 that investigate its global distribution and concentration variations are important for accurately understanding its potential climatic effects. Satellite measurements from space can offer atmospheric CO2 data for climate change research; for that, ground-based measurements are required to validate and improve the precision of satellite-measured CO2. We carried out an observation experiment on CO2 column densities in the Xilinguole grasslands in Inner Mongolia, China, using a ground-based measurement system consisting mainly of an optical spectrum analyzer (OSA), a sun tracker and a notebook controller. Measurements from our ground-based system were analyzed and compared with those from the Greenhouse gases Observing SATellite (GOSAT). The ground-based measurements had an average value of 389.46 ppm, which was 2.4 ppm larger than that from GOSAT, with a standard deviation of 3.4 ppm. This difference is slightly larger than that between GOSAT and the Total Carbon Column Observing Network (TCCON). This study highlights the usefulness of the ground-based OSA measurement system for analyzing atmospheric CO2 column densities, which is expected to supplement the current TCCON network.

  10. A Methodology to Detect and Update Active Deformation Areas Based on Sentinel-1 SAR Images

    Directory of Open Access Journals (Sweden)

    Anna Barra

    2017-09-01

    Full Text Available This work is focused on deformation activity mapping and monitoring using Sentinel-1 (S-1) data and the DInSAR (Differential Interferometric Synthetic Aperture Radar) technique. The main goal is to present a procedure to periodically update and assess the geohazard activity (volcanic activity, landslides and ground subsidence) of a given area by exploiting the wide area coverage and the high coherence and temporal sampling (revisit time up to six days) provided by the S-1 satellites. The main products of the procedure are two updatable maps: the deformation activity map and the active deformation areas map. These maps present two different levels of information aimed at different levels of geohazard risk management, from a very simplified level of information to the classical deformation map based on SAR interferometry. The methodology has been successfully applied to the La Gomera, Tenerife and Gran Canaria Islands (Canary Islands archipelago). The main results obtained are discussed.

  11. Location of Bioelectricity Plants in the Madrid Community Based on Triticale Crop: A Multicriteria Methodology

    Directory of Open Access Journals (Sweden)

    L. Romero

    2015-01-01

    Full Text Available This paper presents a work whose objective is, first, to quantify the potential of the triticale biomass existing in each of the agricultural regions of the Madrid Community through a crop simulation model based on regression techniques and multiple correlation. Second, a methodology for defining which area has the best conditions for the installation of electricity plants fueled by biomass is described and applied. The study used a methodology based on compromise programming in a discrete multicriteria decision method (MDM) context. To construct the ranking, the following criteria were taken into account: biomass potential, electric power infrastructure, road networks, protected spaces, and urban nuclei surfaces. The results indicate that, in the case of the Madrid Community, the Campiña region is the most suitable for setting up plants powered by biomass. A minimum of 17,339.9 tons of triticale will be needed to satisfy the requirements of a 2.2 MW power plant. The minimum range of action for obtaining the necessary biomass in the Campiña region would be 6.6 km around the municipality of Algete, based on Geographic Information Systems. The total biomass which could be made available within this range in this region would be 18,430.68 t.
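Compromise programming, as used in the study, ranks discrete alternatives by their weighted L_p distance to the ideal point. A minimal sketch with toy criteria values and equal weights (not the paper's data):

```python
import numpy as np

def compromise_ranking(scores, weights, p=2):
    """Rank alternatives (rows) by weighted L_p distance to the ideal
    point built from the best value of each criterion (columns). All
    criteria are treated as benefit-type: higher raw score is better."""
    ideal = scores.max(axis=0)
    anti = scores.min(axis=0)
    span = np.where(ideal > anti, ideal - anti, 1.0)   # avoid 0-division
    norm = (ideal - scores) / span                     # 0 = ideal, 1 = worst
    dist = (weights * norm**p).sum(axis=1) ** (1.0 / p)
    return np.argsort(dist)                            # best alternative first
```

In a real siting study the rows would be candidate regions and the columns criteria such as biomass potential, grid and road access, protected areas and urban surfaces.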

  12. Light Water Reactor Sustainability Program: Digital Technology Business Case Methodology Guide

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Ken [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lawrie, Sean [ScottMadden, Inc., Raleigh, NC (United States); Hart, Adam [ScottMadden, Inc., Raleigh, NC (United States); Vlahoplus, Chris [ScottMadden, Inc., Raleigh, NC (United States)

    2014-09-01

    The Department of Energy’s (DOE’s) Light Water Reactor Sustainability Program aims to develop and deploy technologies that will make the existing U.S. nuclear fleet more efficient and competitive. The program has developed a standard methodology for determining the impact of new technologies in order to assist nuclear power plant (NPP) operators in building sound business cases. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies Pathway is part of the DOE’s Light Water Reactor Sustainability (LWRS) Program. It conducts targeted research and development (R&D) to address aging and reliability concerns with the legacy instrumentation and control and related information systems of the U.S. operating light water reactor (LWR) fleet. This work involves two major goals: (1) to ensure that legacy analog II&C systems are not life-limiting issues for the LWR fleet and (2) to implement digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Resolving long-term operational concerns with the II&C systems contributes to the long-term sustainability of the LWR fleet, which is vital to the nation’s energy and environmental security. The II&C Pathway is conducting a series of pilot projects that enable the development and deployment of new II&C technologies in existing nuclear plants. Through the LWRS program, individual utilities and plants are able to participate in these projects or otherwise leverage the results of projects conducted at demonstration plants. Performance advantages of the new pilot project technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on

  13. A transit timing analysis with combined ground- and space-based photometry

    Directory of Open Access Journals (Sweden)

    Raetz St.

    2015-01-01

    The CoRoT satellite looks back on six years of high-precision photometry of a very large number of stars. Thousands of transiting events have been detected, of which 27 have been confirmed as transiting planets so far. In my research I search for and analyze transit timing variations (TTVs) in the CoRoT sample and combine the unprecedented precision of the light curves with ground-based follow-up photometry. Because CoRoT can observe transiting planets only for a maximum duration of 150 days, ground-based follow-up can help to refine the ephemeris. Here we present first examples.
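Transit timing variations are usually diagnosed through observed-minus-calculated (O-C) residuals against a linear ephemeris. A minimal sketch, assuming a known reference epoch t0 and period (the values in the test are illustrative, not CoRoT results):

```python
def transit_oc(observed_times, t0, period):
    """Observed-minus-calculated residuals against a linear ephemeris:
    each mid-transit time minus t0 + E * period for the nearest epoch E.
    Systematic trends in these residuals are the TTV signal."""
    residuals = []
    for t in observed_times:
        epoch = round((t - t0) / period)
        residuals.append(t - (t0 + epoch * period))
    return residuals
```

Combining space- and ground-based mid-transit times in `observed_times` extends the epoch baseline and so tightens the fitted period.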

  14. Status of advanced ground-based laser interferometers for gravitational-wave detection

    Science.gov (United States)

    Dooley, K. L.; Akutsu, T.; Dwyer, S.; Puppo, P.

    2015-05-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA.

  15. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  16. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
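The Monte Carlo estimation step can be sketched as follows. Every distribution and parameter below is an invented placeholder chosen only to show the simulated event chain (tornado strike, missile injection, transport, impact); none of it is TORMIS data:

```python
import random

def tornado_missile_risk(n_trials=100_000, seed=1):
    """Monte Carlo estimate of the expected number of missile impacts on
    a target per simulated tornado season. Event chain: tornado strike
    -> number of injected missiles -> transport distance and bearing."""
    random.seed(seed)
    impacts = 0
    for _ in range(n_trials):
        if random.random() >= 0.1:               # no tornado this season
            continue
        for _ in range(random.randint(0, 5)):    # missiles picked up
            dist = random.expovariate(1.0 / 100.0)   # transport distance (m)
            on_bearing = random.random() < 0.05      # heads toward target
            if on_bearing and dist < 300.0:
                impacts += 1
    return impacts / n_trials
```

Sensitivity analysis then amounts to re-running the simulation while perturbing one model (for example the transport-distance distribution) at a time.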

  17. Simulation of olive grove gross primary production by the combination of ground and multi-sensor satellite data

    Science.gov (United States)

    Brilli, L.; Chiesi, M.; Maselli, F.; Moriondo, M.; Gioli, B.; Toscano, P.; Zaldei, A.; Bindi, M.

    2013-08-01

    We developed and tested a methodology to estimate olive (Olea europaea L.) gross primary production (GPP) combining ground and multi-sensor satellite data. An eddy-covariance station placed in an olive grove in central Italy provided carbon and water fluxes over two years (2010-2011), which were used as reference to evaluate the performance of a GPP estimation methodology based on a Monteith type model (modified C-Fix) and driven by meteorological and satellite (NDVI) data. A major issue was related to the consideration of the two main olive grove components, i.e. olive trees and inter-tree ground vegetation: this issue was addressed by the separate simulation of carbon fluxes within the two ecosystem layers, followed by their recombination. In this way the eddy covariance GPP measurements were successfully reproduced, with the exception of two periods that followed tillage operations. For these periods measured GPP could be approximated by considering synthetic NDVI values which simulated the expected response of inter-tree ground vegetation to tillages.
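A Monteith-type light-use-efficiency model of the kind described computes GPP as an efficiency times absorbed PAR, here split over the two ecosystem layers and recombined. The efficiencies and the linear NDVI-to-fAPAR relation below are illustrative assumptions, not the calibrated modified C-Fix parameters:

```python
def fapar_from_ndvi(ndvi):
    """Fraction of absorbed PAR from NDVI via a common linear relation
    (slope and offset here are generic literature-style values)."""
    return max(0.0, min(1.0, 1.24 * ndvi - 0.168))

def gpp_two_layer(par, ndvi_tree, ndvi_ground,
                  eps_tree=1.2, eps_ground=0.8):
    """Monteith-type GPP (gC m-2 day-1): radiation-use efficiency times
    absorbed PAR (MJ m-2 day-1), computed separately for the olive-tree
    layer and the inter-tree ground vegetation, then recombined."""
    gpp_tree = eps_tree * fapar_from_ndvi(ndvi_tree) * par
    gpp_ground = eps_ground * fapar_from_ndvi(ndvi_ground) * par
    return gpp_tree + gpp_ground
```

Replacing `ndvi_ground` with a synthetic post-tillage value is the kind of adjustment the abstract describes for the periods following tillage operations.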

  18. GEARS: An Enterprise Architecture Based On Common Ground Services

    Science.gov (United States)

    Petersen, S.

    2014-12-01

    Earth observation satellites collect a broad variety of data used in applications that range from weather forecasting to climate monitoring. Within NOAA the National Environmental Satellite Data and Information Service (NESDIS) supports these applications by operating satellites in both geosynchronous and polar orbits. Traditionally NESDIS has acquired and operated its satellites as stand-alone systems with their own command and control, mission management, processing, and distribution systems. As the volume, velocity, veracity, and variety of sensor data and products produced by these systems continues to increase, NESDIS is migrating to a new concept of operation in which it will operate and sustain the ground infrastructure as an integrated Enterprise. Based on a series of common ground services, the Ground Enterprise Architecture System (GEARS) approach promises greater agility, flexibility, and efficiency at reduced cost. This talk describes the new architecture and associated development activities, and presents the results of initial efforts to improve product processing and distribution.

  19. Reflecting on the challenges of choosing and using a grounded theory approach.

    Science.gov (United States)

    Markey, Kathleen; Tilki, Mary; Taylor, Georgina

    2014-11-01

    To explore three different approaches to grounded theory and consider some of the possible philosophical assumptions underpinning them. Grounded theory is a comprehensive yet complex methodology that offers a procedural structure to guide the researcher. However, divergent approaches to grounded theory present dilemmas for novice researchers seeking to choose a suitable research method. This is a reflexive methodology paper that explores some of the challenges experienced by a PhD student when choosing and operationalising a grounded theory approach. Before embarking on a study, novice grounded theory researchers should examine their research beliefs to assist them in selecting the most suitable approach. This requires an insight into the approaches' philosophical assumptions, such as those pertaining to ontology and epistemology. Researchers need to be clear about the philosophical assumptions underpinning their studies and the effects that different approaches will have on the research results. This paper presents a personal account of the journey of a novice grounded theory researcher who chose a grounded theory approach and worked within its theoretical parameters. Novice grounded theory researchers need to understand the different philosophical assumptions that influence the various grounded theory approaches before choosing one particular approach.

  20. A case-based reasoning tool for breast cancer knowledge management with data mining concepts and techniques

    Science.gov (United States)

    Demigha, Souâd.

    2016-03-01

    The paper presents a Case-Based Reasoning tool for breast cancer knowledge management intended to improve breast cancer screening. To develop this tool, we combine concepts and techniques from both Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise, that is, past experience with clinical cases. Case-Based Reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, which makes CBR well suited to medical use. On the other hand, existing traditional Hospital Information Systems (HIS), Radiological Information Systems (RIS) and Picture Archiving and Communication Systems (PACS) do not manage medical information efficiently because of its complexity and heterogeneity. Data Mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with Data Mining techniques will facilitate the diagnosis and decision-making of medical experts.
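The core CBR step, retrieving the past cases most similar to a new problem, can be sketched as a weighted nearest-neighbour search. The feature names and weights here are hypothetical, not taken from the paper's breast-cancer case base:

```python
import math

def retrieve_similar_cases(case_base, query, weights, k=3):
    """Weighted nearest-neighbour retrieval: score every stored case
    against the query problem description and return the k most similar.
    Similarity = 1 / (1 + weighted Euclidean distance)."""
    def similarity(feats):
        d2 = sum(w * (feats[f] - query[f]) ** 2 for f, w in weights.items())
        return 1.0 / (1.0 + math.sqrt(d2))
    ranked = sorted(case_base,
                    key=lambda c: similarity(c["features"]), reverse=True)
    return ranked[:k]
```

In a full CBR cycle the retrieved cases would then be reused, revised against the new findings, and retained back into the case base.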

  1. Ground acceleration in a nuclear power plant

    International Nuclear Information System (INIS)

    Pena G, P.; Balcazar, M.; Vega R, E.

    2015-09-01

    A methodology that adopts the recommendations of international organizations for determining the ground acceleration at a nuclear power plant is outlined. The systematic approach presented here emphasizes the types of geological, geophysical and geotechnical studies required in the different areas of influence, culminating in assessments of the Design Basis Earthquake and the Operating Basis Earthquake. In the regional area where the nuclear power plant site is located, faults in geological structures are identified and the seismic history of the region is documented. In the detailed site area, geophysical tools are used to determine subsurface propagation velocities and the spectra of the induced seismic waves. Mechanical analysis of drill cores allows estimation of the stresses that a postulated earthquake would generate. Studies show that the magnitude of the Fukushima earthquake did not affect the integrity of nuclear power plants, owing to the rocky foundations on which they were settled. (Author)

  2. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
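In the time-stratified CCO design highlighted above, each case day is compared with referent days in the same year-month stratum that share its day of week, so that day-of-week and seasonal confounding cancel by design. A minimal sketch of that referent-selection rule:

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    """Referent days for a time-stratified case-crossover design: every
    other day in the same calendar month and year that falls on the same
    day of the week as the event day."""
    referents = []
    d = event_day.replace(day=1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents
```

Pollutant exposure on the event day is then contrasted with exposure on these referent days, typically via conditional logistic regression.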

  3. Review: Günter Mey & Katja Mruck (Eds. (2007. Grounded Theory Reader

    Directory of Open Access Journals (Sweden)

    Adrian Schmidtke

    2009-07-01

    Full Text Available The volume was published to mark the 40th anniversary of the publication of "The Discovery of Grounded Theory." The first part describes the emergence and fundamental positions of grounded theory methodology (GTM) in methodological and theoretical terms; the second part focuses on research practices. The "Grounded Theory Reader" is an excellent compilation that doesn’t claim to be a standard textbook for newcomers to GTM. Rather, it is a reflection of the state of the art in GTM and enables insights into complex research practices. A basic understanding of GTM is recommended in order to get the most from the book. URN: urn:nbn:de:0114-fqs0903286

  4. Ground roll attenuation using a curvelet-SVD filter: a case study from the west of Iran

    International Nuclear Information System (INIS)

    Boustani, Bahareh; Javaherian, Abdorahim; Mortazavi, Seyed Ahmad; Torabi, Siyavash

    2013-01-01

    In reflection seismology, a ground roll is a low frequency, low velocity and high amplitude surface wave. It usually has stronger amplitude than reflections, and masks valuable information carried by signals. Many filters in different domains have been used for ground roll attenuation such as tau-p and f-k filters. Recently, in many studies, the curvelet transform has been used for ground roll attenuation. The curvelet transform creates a good separation between ground roll and reflections in dip and frequency, especially in high frequency subbands. In this paper, based on the adaptive curvelet filter, a new method is introduced through a combination of the adaptive curvelet and adaptive singular value decomposition (ASVD) filters and is called a curvelet-SVD filter. In this filter, the subbands in a curvelet domain are divided into three categories based on the ground roll energy in each subband. These categories are subbands (1) with high energy containing only ground roll, (2) with medium energy that contains both ground roll and reflections, and (3) with low energy containing only reflections. The category that contains only ground roll will be muted, as in common usage of the adaptive curvelet filter. If the category that contains both ground roll and reflections is unchanged, part of the ground roll will not be attenuated. If this category is muted, part of the reflections will be damaged. To overcome this problem, ASVD is applied to attenuate ground roll in the subbands of this category. The category that contains only reflections will not be touched. This filter was applied to a synthetic and to a real data set from the west of Iran. The synthetic data contained dispersed and aliased ground roll. A curvelet-SVD filter could attenuate dispersed ground roll but it could not completely attenuate aliased ground roll. Because of the damage to the reflections, the energy threshold for applying ASVD in the curvelet domain could not be selected any lower. 
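The SVD part of the filter exploits the fact that ground roll, once laterally aligned, is coherent from trace to trace and therefore concentrates in the largest singular values of the trace ensemble. A minimal sketch of this idea (plain SVD filtering of a gather, not the adaptive, subband-wise ASVD applied in the curvelet domain by the paper):

```python
import numpy as np

def svd_filter(gather, n_remove=1):
    """Attenuate the most laterally coherent, highest-energy component of
    a seismic gather (traces x samples) by zeroing its largest singular
    values; weaker, less coherent reflections remain."""
    U, s, Vt = np.linalg.svd(gather, full_matrices=False)
    s[:n_remove] = 0.0                  # drop the dominant component(s)
    return (U * s) @ Vt
```

In the paper's scheme this step is applied only inside the curvelet subbands of medium energy, where ground roll and reflections overlap.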

  5. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored
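The test-generation techniques mentioned (equivalence class partitioning and boundary value analysis) can be sketched for a single numeric parameter valid on a range [lo, hi]. The helpers below are a generic illustration, not the PFBR test procedure:

```python
def boundary_value_inputs(lo, hi):
    """Boundary value analysis for an integer parameter valid on
    [lo, hi]: both boundaries, their nearest neighbours, and a nominal
    mid-range value."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def equivalence_classes(lo, hi):
    """Equivalence class partitioning: one representative input from the
    below-range, in-range and above-range classes."""
    return {"invalid_low": lo - 10,
            "valid": (lo + hi) // 2,
            "invalid_high": hi + 10}
```

Each generated input would then be fed to the unit under test together with its expected accept/reject behaviour.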

  6. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  7. The crisis of informant authority in ethnographic research. Undercover methodologies and research on human rights and vulnerable populations: Two case studies in Mexico

    Directory of Open Access Journals (Sweden)

    Rubén Muñoz Martínez

    2018-02-01

    Full Text Available Drawing upon two case studies from two different research projects, both related to the field of sexual health, sexual citizenship and human rights, and both making different use of undercover investigation techniques, we discuss some of the reaches and limitations of methodological approaches not based on informed consent. Two questions constitute the core of this work: Is there any research that, from its beginning through its development and/or its products, always complies with informed consent? And, if the unfinished nature of informed consent is both a state of affairs and a research option, when do we make clear that a research project is partially or totally undercover? Who does it? What for? In this case, the methodologies and their specific practices need to make visible and problematize, by calling for their transformation, the ideologically configured fields of social relations that constitute them.

  8. Earthquake Scenarios Based Upon the Data and Methodologies of the U.S. Geological Survey's National Seismic Hazard Mapping Project

    Science.gov (United States)

    Rukstales, K. S.; Petersen, M. D.; Frankel, A. D.; Harmsen, S. C.; Wald, D. J.; Quitoriano, V. R.; Haller, K. M.

    2011-12-01

    The U.S. Geological Survey's (USGS) National Seismic Hazard Mapping Project (NSHMP) utilizes a database of over 500 faults across the conterminous United States to constrain earthquake source models for probabilistic seismic hazard maps. Additionally, the fault database is now being used to produce a suite of deterministic ground motions for earthquake scenarios that are based on the same fault source parameters and empirical ground motion prediction equations used for the probabilistic hazard maps. Unlike the calculated hazard map ground motions, local soil amplification is applied to the scenario calculations based on the best available Vs30 (average shear-wave velocity down to 30 meters) mapping, or in some cases using topographic slope as a proxy. Systematic outputs include all standard USGS ShakeMap products, including GIS, KML, XML, and HAZUS input files. These data are available from the ShakeMap web pages with a searchable archive. The scenarios are being produced within the framework of a geographic information system (GIS) so that alternative scenarios can readily be produced by altering fault source parameters, Vs30 soil amplification, as well as the weighting of ground motion prediction equations used in the calculations. The alternative scenarios can then be used for sensitivity analysis studies to better characterize uncertainty in the source model and convey this information to decision makers. By providing a comprehensive collection of earthquake scenarios based upon the established data and methods of the USGS NSHMP, we hope to provide a well-documented source of data which can be used for visualization, planning, mitigation, loss estimation, and research purposes.

  9. Large-Scale Demand Driven Design of a Customized Bus Network: A Methodological Framework and Beijing Case Study

    Directory of Open Access Journals (Sweden)

    Jihui Ma

    2017-01-01

    Full Text Available In recent years, an innovative public transportation (PT mode known as the customized bus (CB has been proposed and implemented in many cities in China to efficiently and effectively shift private car users to PT to alleviate traffic congestion and traffic-related environmental pollution. The route network design activity plays an important role in the CB operation planning process because it serves as the basis for other operation planning activities, for example, timetable development, vehicle scheduling, and crew scheduling. In this paper, according to the demand characteristics and operational purpose, a methodological framework that includes the elements of large-scale travel demand data processing and analysis, hierarchical clustering-based route origin-destination (OD region division, route OD region pairing, and a route selection model is proposed for CB network design. Considering the operating cost and social benefits, a route selection model is proposed and a branch-and-bound-based solution method is developed. In addition, a computer-aided program is developed to analyze a real-world Beijing CB route network design problem. The results of the case study demonstrate that the current CB network of Beijing can be significantly improved, thus demonstrating the effectiveness of the proposed methodology.
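
    The hierarchical clustering step described above, which groups travel-demand origin-destination points into candidate route regions, can be sketched with a minimal single-linkage agglomerative clustering. The coordinates and cluster count below are invented for illustration and do not come from the Beijing case study:

    ```python
    # Sketch: single-linkage agglomerative clustering of demand points into
    # route OD regions. Pure-Python stand-in for the paper's clustering step.
    import math

    def single_linkage(points, n_clusters):
        clusters = [[p] for p in points]
        def dist(a, b):
            # single linkage: distance between closest members of two clusters
            return min(math.dist(p, q) for p in a for q in b)
        while len(clusters) > n_clusters:
            # find and merge the closest pair of clusters
            i, j = min(((i, j) for i in range(len(clusters))
                        for j in range(i + 1, len(clusters))),
                       key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
            clusters[i] += clusters.pop(j)
        return clusters

    demand = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (20, 0)]
    regions = single_linkage(demand, 3)
    print([len(c) for c in regions])
    ```

    In a real CB design pipeline, each resulting region pair would then become a candidate route for the selection model.
    
    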

  10. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for the assessment of a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged to determine TRLs. Originally, TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology aimed at assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which provides a user-friendly and easily accessible way to monitor the progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.
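
    The questionnaire-to-metric idea can be sketched as follows. The dimensions and yes/no answers are entirely hypothetical; the point is only that per-dimension scores expose the weakest area of the development process rather than a single aggregate TRL:

    ```python
    # Sketch: reducing a maturity questionnaire to a multidimensional metric.
    # Dimensions, questions and answers are invented for illustration.
    answers = {
        "functionality": [True, True, True, False],
        "reliability":   [True, True, False, False],
        "cost":          [True, False, False, False],
    }

    # fraction of affirmative answers per dimension
    scores = {dim: sum(a) / len(a) for dim, a in answers.items()}
    weakest = min(scores, key=scores.get)
    print(scores, "weakest dimension:", weakest)
    ```
    
    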

  11. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...
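
    The core observation above, that summing raw per-memory soft error rates is pessimistic because only upsets striking live data matter, can be sketched like this. The access trace and raw SER are invented; the real methodology works from cycle-accurate simulation across the whole memory hierarchy:

    ```python
    # Sketch: derating a raw soft error rate (SER) by the fraction of time
    # memory words hold live data (written, then later read). Illustrative
    # numbers only; not the paper's actual simulation flow.

    def effective_ser(raw_ser, trace, total_time):
        """Scale raw SER by live-data residency.

        trace: list of (write_time, last_read_time) intervals; an upset
        outside these intervals is overwritten before it is ever read.
        """
        live = sum(read - write for write, read in trace)
        return raw_ser * live / total_time

    # one word is live for 30 of 100 time units, another for 20 of 100
    eff = effective_ser(1e-6, [(0, 30), (50, 70)], 100)
    print(eff)  # half the raw SER, since data is live only 50% of the time
    ```
    
    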

  12. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    An environmental investigation of ground water conditions has been undertaken at Wright-Patterson Air Force Base (WPAFB), Ohio to obtain data to assist in the evaluation of a potential removal action to prevent, to the extent practicable, migration of the contaminated ground water across Base boundaries. Field investigations were limited to the central section of the southwestern boundary of Area C and the Springfield Pike boundary of Area B. Further, the study was limited to a maximum depth of 150 feet below grade. The three primary activities of the field investigation were: (1) installation of 22 monitoring wells, (2) collection and analysis of ground water from 71 locations, and (3) measurement of ground water elevations at 69 locations. Volatile organic compounds including trichloroethylene, perchloroethylene, and/or vinyl chloride were detected in concentrations exceeding Maximum Contaminant Levels (MCL) at three locations within the Area C investigation area. Ground water at the Springfield Pike boundary of Area B occurs in two primary units, separated by a thicker-than-expected clay layer. One well within Area B was determined to exceed the MCL for trichloroethylene.

  13. Toward a Framework for Systematic Error Modeling of NASA Spaceborne Radar with NOAA/NSSL Ground Radar-Based National Mosaic QPE

    Science.gov (United States)

    Kirstetter, Pierre-Emmanuel; Hong, Y.; Gourley, J. J.; Chen, S.; Flamig, Z.; Zhang, J.; Howard, K.; Schwaller, M.; Petersen, W.; Amitai, E.

    2011-01-01

    Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements, for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. We focus here on the error structure of NASA's Tropical Rainfall Measurement Mission (TRMM) Precipitation Radar (PR) quantitative precipitation estimation (QPE) at the ground. The problem is addressed by comparison of PR QPEs with reference values derived from ground-based measurements using the NOAA/NSSL ground radar-based National Mosaic and QPE system (NMQ/Q2). A preliminary investigation of this subject has been carried out at the PR estimation scale (instantaneous and 5 km) using a three-month data sample in the southern part of the US. The primary contribution of this study is the presentation of the detailed steps required to derive a trustworthy reference rainfall dataset from Q2 at the PR pixel resolution. It relies on a bias correction and a radar quality index, both of which provide a basis to filter out the less trustworthy Q2 values. Several aspects of PR error are revealed and quantified, including sensitivity to the processing steps for the reference rainfall, comparisons of rainfall detectability and rainfall rate distributions, spatial representativeness of error, and separation of systematic biases and random errors. The methodology and framework developed herein apply more generally to rainfall rate estimates from other sensors onboard low-earth-orbiting satellites, such as microwave imagers and dual-wavelength radars like those of the Global Precipitation Measurement (GPM) mission.
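
    The two reference-building steps named above, quality-index filtering and bias correction, can be sketched as follows. The rain rates, quality values and 0.8 threshold are illustrative and not from the study:

    ```python
    # Sketch: build a reference rainfall set from ground-radar (Q2) rates by
    # (1) dropping values below a radar quality index threshold and
    # (2) removing the multiplicative mean-field bias against a gauge mean.
    # All numbers are invented for illustration.

    def build_reference(q2_rates, quality, threshold=0.8, gauge_mean=None):
        kept = [r for r, q in zip(q2_rates, quality) if q >= threshold]
        if gauge_mean is not None and kept:
            bias = (sum(kept) / len(kept)) / gauge_mean  # radar/gauge ratio
            kept = [r / bias for r in kept]
        return kept

    rates   = [2.0, 4.0, 6.0, 8.0]   # mm/h, hypothetical Q2 pixels
    quality = [0.9, 0.5, 0.95, 0.85]
    ref = build_reference(rates, quality, gauge_mean=4.0)
    print(ref)  # low-quality 4.0 mm/h pixel dropped, rest rescaled
    ```
    
    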

  14. An overview of farming system typology methodologies and its use in the study of pasture-based farming system: a review

    Energy Technology Data Exchange (ETDEWEB)

    Madry, W.; Mena, Y.; Roszkowska, B.; Gozdowski, D.; Hryniewski, R.; Castel, J. M.

    2013-06-01

    The main objective of the paper is to conduct a critical study of the use of typology methodologies within pasture-based farming systems (PBFS), especially those situated in less favoured areas, showing in each case the most relevant variables or indicators determining the farming system classification. Another objective is to provide an overview of the most widely used farming system typology methodologies in general. First, some considerations about the concept of a farming system and approaches to its study are presented. Next, farming system typology methodologies are reviewed for different farming systems, with an emphasis on PBFS. The different tools integrated in these methodologies are considered: sampling methods, sources of data, variables or indicators obtained from available data, and techniques of analysis (statistical or not). Methods for farming system classification are presented (expert methods, analytical methods, or a combination of both). Among the statistical methods, multivariate analysis is treated in most detail, including principal component analysis and cluster analysis. Finally, the use of farming system typology methodologies on different pasture-based farming systems is presented. The most important aspects considered are the following: the main objective of the typology, the main animal species, the employed classification methods, and the main variables involved in the classification. (Author) 56 refs.
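
    A heavily simplified stand-in for the multivariate classification discussed above is nearest-centroid assignment of farms on a couple of indicators. The farms, indicators (herd size, grazing share) and type centroids below are invented; a real typology would standardize many variables and derive the groups from the data via PCA and cluster analysis:

    ```python
    # Sketch: classify farms into types by nearest centroid on two indicators.
    # All values are hypothetical; unstandardized for brevity.
    import math

    farms = {"A": (20, 0.9), "B": (200, 0.2), "C": (30, 0.8)}
    centroids = {"extensive": (25, 0.85), "intensive": (180, 0.25)}

    def classify(farm):
        # assign the type whose centroid is closest in indicator space
        return min(centroids, key=lambda t: math.dist(centroids[t], farm))

    types = {name: classify(xy) for name, xy in farms.items()}
    print(types)
    ```
    
    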

  15. NO2 DOAS measurements from ground and space: comparison of ground based measurements and OMI data in Mexico City

    Science.gov (United States)

    Rivera, C.; Stremme, W.; Grutter, M.

    2012-04-01

    The combination of satellite data and ground-based measurements can provide valuable information about atmospheric chemistry and air quality. In this work we present a comparison between ground-based NO2 differential columns measured at the Universidad Nacional Autónoma de México (UNAM) in Mexico City using the Differential Optical Absorption Spectroscopy (DOAS) technique and NO2 total columns measured by the Ozone Monitoring Instrument (OMI) onboard the Aura satellite using the same measurement technique. From these data, distribution maps of average NO2 above the Mexico basin were constructed and hot spots inside the city could be identified. In addition, a clear footprint was detected from the Tula industrial area, ~50 km northwest of Mexico City, where a refinery, a power plant and other industries are located. A less defined footprint was identified in the Cuernavaca basin, south of Mexico City, and the nearby cities of Toluca and Puebla do not present strong enhancements in the NO2 total columns. With this study we expect to cross-validate space and ground measurements and provide useful information for future studies.

  16. Examining the Nexus between Grounded Theory and Symbolic Interactionism

    Directory of Open Access Journals (Sweden)

    P. Jane Milliken RN, PhD

    2012-12-01

    Full Text Available Grounded theory is inherently symbolic interactionist; however, not all grounded theory researchers appreciate its importance or benefit from its influence. Elsewhere, we have written about the intrinsic relationship between grounded theory and symbolic interactionism, highlighting the silent, fundamental contribution of symbolic interactionism to the methodology. At the same time, there are significant insights to be had by bringing a conscious awareness of the philosophy of symbolic interactionism to grounded theory research. In this article we discuss the symbolic interactionist concepts of mind, self, and society, and their applicability in grounded theorizing. Our purpose is to highlight foundational concepts of symbolic interactionism and their centrality in the processes of conducting grounded theory research.

  17. Obligatory inquiries concerning buried pipelines in case of intended excavations on private ground; Zur Erkundigungspflicht nach erdverlegten Versorgungsanlagen bei Tiefbauarbeiten auf privaten Grundstuecken

    Energy Technology Data Exchange (ETDEWEB)

    Schulze, O.

    2006-09-15

    If inquiries concerning buried pipelines are neglected before intended excavations on private ground, the consequences can be dramatic. For example, at Ghislenghien near Ath, Belgium, excavations damaged a gas pipeline on 30 July 2004, and 18 people were killed in the subsequent explosion. The contribution outlines the cases in which inquiries concerning buried pipelines are obligatory before intended excavations on private ground. It points out the regulations that come into force in case of neglect, and the potential contributory negligence of the utility if pipelines are not buried deep enough. (orig.)

  18. Communication grounding facility

    International Nuclear Information System (INIS)

    Lee, Gye Seong

    1998-06-01

    This book is about communication grounding facilities and is made up of twelve chapters. It covers general grounding (purpose, materials and thermal insulating material), construction of grounding, super-strength grounding methods, grounding facilities (grounding methods and building insulation), switched grounding (No. 1A and LCR), grounding facilities for transmission lines, wireless facility grounding, grounding facilities in wireless base stations, grounding of power facilities, grounding of low-tension interior power wires, communication facilities of railroads, installation of arresters in apartments and houses, and installation of arresters with respect to earth conductivity and the measurement of grounding resistance.

  19. Status of advanced ground-based laser interferometers for gravitational-wave detection

    International Nuclear Information System (INIS)

    Dooley, K L; Akutsu, T; Dwyer, S; Puppo, P

    2015-01-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA. (paper)

  20. Symbolic interactionism in grounded theory studies: women surviving with HIV/AIDS in rural northern Thailand.

    Science.gov (United States)

    Klunklin, Areewan; Greenwood, Jennifer

    2006-01-01

    Although it is generally acknowledged that symbolic interactionism and grounded theory are connected, the precise nature of their connection remains implicit and unexplained. As a result, many grounded theory studies are undertaken without an explanatory framework. This in turn results in the description rather than the explanation of data determined. In this report, the authors make explicit and explain the nature of the connections between symbolic interactionism and grounded theory research. Specifically, they make explicit the connection between Blumer's methodological principles and processes and grounded theory methodology. In addition, the authors illustrate the explanatory power of symbolic interactionism in grounded theory using data from a study of the HIV/AIDS experiences of married and widowed Thai women.

  1. Adaptation of Agile Project Management Methodology for Project Team

    Directory of Open Access Journals (Sweden)

    Rasnacis Arturs

    2015-12-01

    Full Text Available A project management methodology that defines basic processes, tools, techniques, methods, resources and procedures used to manage a project is necessary for effective and successful IT project management. Each company needs to define its own methodology or adapt some of the existing ones. The purpose of the research is to evaluate the possibilities of adapting IT project development methodology according to the company, company employee characteristics and their mutual relations. The adaptation process will be illustrated with a case study at an IT company in Latvia where the developed methodology is based on Agile Scrum, one of the most widespread Agile methods.

  2. Methodology of synchronization among strategy and operation. A standards-based modeling approach

    Directory of Open Access Journals (Sweden)

    VICTOR EDWIN COLLAZOS

    2017-05-01

    Full Text Available Enterprise Architecture (EA) has gained importance in recent years, mainly for its concept of “alignment” between the strategic and operational levels of organizations. Such alignment occurs when Information Technology (IT) is applied correctly and timely, working in synergy and harmony with strategy and the operation to achieve their goals mutually and satisfy the organizational needs. Both the strategic and operational levels have standards that help model the elements necessary to obtain the desired results. In this sense, BMM and BPMN were selected because both have the support of the OMG and are fairly well known for modelling the strategic level and the operational level, respectively. In addition, i* goal modeling can be used to reduce the gap between these two standards. This proposal may help both the high-level design of the information system and the appropriate identification of the business processes that will support it. This paper presents a methodology for aligning strategy and the operation based on standards and heuristics. We have made a classification of the elements of the models and, for some specific cases, an extension of the heuristics associated between them. This allows us to propose a methodology which uses the above-mentioned standards and combines mappings, transformations and actions to be considered in the alignment process.

  3. Grounding language in action and perception: from cognitive agents to humanoid robots.

    Science.gov (United States)

    Cangelosi, Angelo

    2010-06-01

    In this review we concentrate on a grounded approach to the modeling of cognition through the methodologies of cognitive agents and developmental robotics. This work will focus on the modeling of the evolutionary and developmental acquisition of linguistic capabilities based on the principles of symbol grounding. We review cognitive agent and developmental robotics models of the grounding of language to demonstrate their consistency with the empirical and theoretical evidence on language grounding and embodiment, and to reveal the benefits of such an approach in the design of linguistic capabilities in cognitive robotic agents. In particular, three different models will be discussed, where the complexity of the agent's sensorimotor and cognitive system gradually increases: from a multi-agent simulation of language evolution, to a simulated robotic agent model for symbol grounding transfer, to a model of language comprehension in the humanoid robot iCub. The review also discusses the benefits of the use of humanoid robotic platform, and specifically of the open source iCub platform, for the study of embodied cognition. Copyright 2010 Elsevier B.V. All rights reserved.

  4. The middle ground of the NDE R and D spectrum

    International Nuclear Information System (INIS)

    Burte, H.M.; Chimenti, D.E.; Thompson, D.O.; Thompson, R.B.

    1983-01-01

    This keynote talk attempts to call attention to the interdisciplinary nature of NDE (nondestructive evaluation) science and technology and to introduce some approaches for fostering R and D in such a situation. The objectives of the DARPA/Air Force core program for developing a science base for NDE are described. Finally, several exploratory development programs are investigated. The needs addressed by NDE include performance demands, safety, conservation, productivity with quality, and minimization of life cycle costs. The science base for electromagnetic techniques includes eddy-current flaw interactions, inversion techniques, and probe figures-of-merit. The problem of inspection reliability is addressed, and an accept-reject methodology is schematized. A methodology for approaching the middle ground of the NDE R and D spectrum is outlined. Finally, future possibilities such as the characterization of flaws in ceramics, transducer understanding, new electromagnetic probes, and thermal wave imaging are also discussed.

  5. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  6. An inexpensive, interdisciplinary, methodology to conduct an impact study of homeless persons on hospital based services.

    Science.gov (United States)

    Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen

    2015-02-01

    Homelessness is a primary concern for community health. The scientific literature on homelessness is wide ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of homelessness on the healthcare system. Such approaches rely on the strengths of a multidisciplinary team, including providers of both healthcare and homeless services as well as applied clinical researchers. This paper is a proof of concept for a methodology that is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries, the medical billing systems used by hospitals, and the research methods of applied researchers. Adaptation is independent of geographic region, budget constraints, specific agency skill sets, and many other factors that affect the application of a consistent, methodological, science-based approach to assessing and addressing homelessness. We conducted a secondary data analysis merging homeless-service utilization data with hospital case-based data. These data detailed care utilization among homeless persons in a small Appalachian city in the United States. In our sample of 269 persons who received at least one hospital-based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people accounting for more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention. This is proposed in a future study.
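
    The record-linkage idea at the heart of this methodology, joining a homeless-services roster to hospital billing on a shared client identifier and then measuring cost concentration, can be sketched as follows. All identifiers and dollar amounts are fabricated:

    ```python
    # Sketch: merge a homeless-services roster with hospital billing records
    # on a shared client ID, then measure how concentrated costs are.
    hmis = {"c1", "c2", "c3"}  # clients with at least one homeless service
    billing = {"c1": 90000, "c2": 4000, "c3": 6000, "c9": 500}  # billed $

    # keep only clients appearing in both systems (the study population)
    costs = {cid: amt for cid, amt in billing.items() if cid in hmis}
    total = sum(costs.values())
    top = max(costs.values())
    print(f"total ${total}, top client share {top / total:.0%}")
    ```

    In practice the join would run on de-identified keys under a data use agreement, but the analytic shape is the same.
    
    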

  7. MODELING ATMOSPHERIC EMISSION FOR CMB GROUND-BASED OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Errard, J.; Borrill, J. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Ade, P. A. R. [School of Physics and Astronomy, Cardiff University, Cardiff CF10 3XQ (United Kingdom); Akiba, Y.; Chinone, Y. [High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan); Arnold, K.; Atlas, M.; Barron, D.; Elleflot, T. [Department of Physics, University of California, San Diego, CA 92093-0424 (United States); Baccigalupi, C.; Fabbian, G. [International School for Advanced Studies (SISSA), Trieste I-34014 (Italy); Boettger, D. [Department of Astronomy, Pontifica Universidad Catolica de Chile (Chile); Chapman, S. [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Cukierman, A. [Department of Physics, University of California, Berkeley, CA 94720 (United States); Delabrouille, J. [AstroParticule et Cosmologie, Univ Paris Diderot, CNRS/IN2P3, CEA/Irfu, Obs de Paris, Sorbonne Paris Cité (France); Dobbs, M.; Gilbert, A. [Physics Department, McGill University, Montreal, QC H3A 0G4 (Canada); Ducout, A.; Feeney, S. [Department of Physics, Imperial College London, London SW7 2AZ (United Kingdom); Feng, C. [Department of Physics and Astronomy, University of California, Irvine (United States); and others

    2015-08-10

    Atmosphere is one of the most important noise sources for ground-based cosmic microwave background (CMB) experiments. By increasing optical loading on the detectors, it amplifies their effective noise, while its fluctuations introduce spatial and temporal correlations between detected signals. We present a physically motivated 3D model of the atmosphere's total intensity emission at millimeter and sub-millimeter wavelengths. We derive a new analytical estimate for the correlation between detectors' time-ordered data as a function of the instrument and survey design, as well as several atmospheric parameters such as wind, relative humidity, temperature and turbulence characteristics. Using an original numerical computation, we examine the effect of each physical parameter on the correlations in the time series of a given experiment. We then use a parametric-likelihood approach to validate the modeling and estimate atmosphere parameters from the POLARBEAR-I project's first-season data set. We derive a new 1.0% upper limit on the linear polarization fraction of atmospheric emission. We also compare our results to previous studies and weather station measurements. The proposed model can be used for realistic simulations of future ground-based CMB observations.

  8. Successful adaptation of three-dimensional inversion methodologies for archaeological-scale, total-field magnetic data sets

    Science.gov (United States)

    Cheyney, S.; Fishwick, S.; Hill, I. A.; Linford, N. T.

    2015-08-01

    Despite the development of advanced processing and interpretation tools for magnetic data sets in the fields of mineral and hydrocarbon industries, these methods have not achieved similar levels of adoption for archaeological or very near surface surveys. Using a synthetic data set we demonstrate that certain methodologies and assumptions used to successfully invert more regional-scale data can lead to large discrepancies between the true and recovered depths when applied to archaeological-type anomalies. We propose variations to the current approach, analysing the choice of the depth-weighting function, mesh design and parameter constraints, to develop an appropriate technique for the 3-D inversion of archaeological-scale data sets. The results show a successful recovery of a synthetic scenario, as well as a case study of a Romano-Celtic temple in the UK. For the case study, the final susceptibility model is compared with two coincident ground penetrating radar surveys, showing a high correlation with the comparative depth slices. The new approach takes interpretation of archaeological data sets beyond a simple 2-D visual interpretation based on pattern recognition.
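
    One of the inversion ingredients analysed above is the depth-weighting function, which counteracts the decay of the magnetic kernel with depth so that recovered susceptibility is not all smeared to the surface. A common form is w(z) = (z + z0)^(-beta/2); the z0 and beta values below are illustrative defaults, not the paper's archaeological-scale choices:

    ```python
    # Sketch: a depth-weighting function of the kind used in 3-D magnetic
    # inversion. z0 and beta here are illustrative, not from the study.

    def depth_weight(z, z0=0.1, beta=3.0):
        """Weight for a cell at depth z (metres below sensor)."""
        return (z + z0) ** (-beta / 2.0)

    # shallow cells get much larger weights than deep ones, compensating
    # the kernel's rapid fall-off with depth
    for z in (0.1, 0.5, 2.0):
        print(z, round(depth_weight(z), 3))
    ```

    The paper's point is precisely that regional-scale defaults for such parameters can misplace archaeological-depth sources, so they must be retuned for near-surface targets.
    
    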

  9. Solar energy prediction and verification using operational model forecasts and ground-based solar measurements

    International Nuclear Information System (INIS)

    Kosmopoulos, P.G.; Kazadzis, S.; Lagouvardos, K.; Kotroni, V.; Bais, A.

    2015-01-01

    The present study focuses on predictions of solar energy and their verification using ground-based solar measurements from the Hellenic Network for Solar Energy and the National Observatory of Athens network, as well as solar radiation operational forecasts provided by the MM5 mesoscale model. The evaluation was carried out independently for the different networks, for two forecast horizons (1 and 2 days ahead), for the seasons of the year, for varying solar elevation, for the indicative energy potential of the area, and for four classes of cloud cover based on the calculated clearness index (k_t): CS (clear sky), SC (scattered clouds), BC (broken clouds) and OC (overcast). The seasonal dependence presented relative RMSE (rRMSE, relative Root Mean Square Error) values ranging from 15% (summer) to 60% (winter), while the solar elevation dependence revealed high effectiveness and reliability near local noon (rRMSE ∼30%). An increase of the errors with cloudiness was also observed. For CS with mean GHI (global horizontal irradiance) ∼ 650 W/m² the errors are 8%, for SC 20%, and for BC and OC the errors were greater (>40%) but correspond to much lower radiation levels (<120 W/m²) and consequently lower energy potential impact. The total energy potential for each ground station ranges from 1.5 to 1.9 MWh/m², while the mean monthly forecast error was found to be consistently below 10%. - Highlights: • Long term measurements at different atmospheric cases are needed for energy forecasting model evaluations. • The total energy potential at the Greek sites presented ranges from 1.5 to 1.9 MWh/m². • Mean monthly energy forecast errors are within 10% for all cases analyzed. • Cloud presence results in an additional forecast error that varies with the cloud cover.
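
    The two verification ingredients described above, clearness-index sky classes and relative RMSE, can be sketched in a few lines. The k_t class boundaries below are hypothetical (the study defines CS/SC/BC/OC from k_t but the abstract does not give thresholds), and the irradiance values are invented:

    ```python
    # Sketch: classify sky condition by clearness index k_t and compute the
    # relative RMSE of forecast vs measured GHI. Thresholds are assumptions.
    import math

    def sky_class(kt):
        if kt >= 0.7: return "CS"  # clear sky
        if kt >= 0.5: return "SC"  # scattered clouds
        if kt >= 0.3: return "BC"  # broken clouds
        return "OC"                # overcast

    def rrmse(forecast, measured):
        mean = sum(measured) / len(measured)
        mse = sum((f - m) ** 2 for f, m in zip(forecast, measured)) / len(measured)
        return math.sqrt(mse) / mean  # RMSE relative to the measured mean

    ghi_meas = [650.0, 600.0, 700.0]  # W/m^2, hypothetical
    ghi_fcst = [600.0, 660.0, 680.0]
    print(sky_class(0.75), f"rRMSE {rrmse(ghi_fcst, ghi_meas):.1%}")
    ```
    
    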

  10. MERMOS. An EDF project to update the PHRA methodology

    International Nuclear Information System (INIS)

    Le Bot, P.; Desmares, E.; Bieder, C.; Bonnet, J.L.; Cara, F.

    1997-09-01

To account for successive evolutions in nuclear power plant emergency operation, EDF has had to review its PHRA methodologies several times, particularly when event-based procedures were abandoned in favor of state-based procedures. These are the main ambitions of the project named MERMOS, which started in 1996. The design effort for a new PHRA method is carried out by a multidisciplinary team involving reliability engineers, psychologists and ergonomists. The method, considered as the analysis tool dedicated to PHRA analysts, is one of the two outcomes of the project. The other is the formalization of the design approach for the method, aimed at a good appropriation of the method by the analysts. The PHRA method is based upon a model of emergency operation called the 'SAD model'. The formalization effort of the design approach led to its clarification and justification. The model describes and explains both the functioning and dysfunctioning of emergency operation in PSA scenarios. It combines a systemic approach with what cognitive science calls distributed cognition. Collective aspects are considered an important feature in explaining the phenomena under study in operational dysfunctioning. The PHRA method is to be operational early next year (1998). Preliminary validation tests were performed quite early in the design process. These tests will set the grounds for the presentation of examples of the application of the method. (author)

  11. Theoretical and methodological foundations of sustainable development of Geosystems

    Science.gov (United States)

    Mandryk, O. M.; Arkhypova, L. M.; Pukish, A. V.; Zelmanovych, A.; Yakovlyuk, Kh

    2017-05-01

The theoretical and methodological foundations of the sustainable development of Geosystems were further developed. A new scientific direction, "constructive Hydroecology", was grounded: the science that studies the Hydrosphere from the standpoint of natural and technogenic safety, based on a geosystematic approach. A structural separation of constructive Hydroecology based on objective, subjective, and application characteristics was set out. The main object of study of the new scientific field is the hydroecological environment, understood as the part of the Hydrosphere belonging to a multicomponent dynamic system that is influenced by engineering and economic human activities and, in turn, determines this activity to some extent.

  12. AGR core safety assessment methodologies

    International Nuclear Information System (INIS)

    McLachlan, N.; Reed, J.; Metcalfe, M.P.

    1996-01-01

    To demonstrate the safety of its gas-cooled graphite-moderated AGR reactors, nuclear safety assessments of the cores are based upon a methodology which demonstrates no component failures, geometrical stability of the structure and material properties bounded by a database. All AGRs continue to meet these three criteria. However, predictions of future core behaviour indicate that the safety case methodology will eventually need to be modified to deal with new phenomena. A new approach to the safety assessment of the cores is currently under development, which can take account of these factors while at the same time providing the same level of protection for the cores. This approach will be based on the functionality of the core: unhindered movement of control rods, continued adequate cooling of the fuel and the core, continued ability to charge and discharge fuel. (author). 5 figs

  13. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by the trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Research on conflict resolution of collaborative design with fuzzy case-based reasoning method

    Institute of Scientific and Technical Information of China (English)

    HOU Jun-ming; SU Chong; LIANG Shuang; WANG Wan-shan

    2009-01-01

Collaborative design is a new style of modern mechanical design intended to meet the requirements of increasing competition. Designers in different places work on the same task, but conflicts appear in the design process and may interfere with it. Case-based reasoning (CBR), a method from the field of artificial intelligence, is applied to the problem of conflict resolution. However, due to the uncertainties in knowledge representation, attribute description, and similarity measures of CBR, it is very difficult to find similar cases in the case database. A fuzzy CBR method was therefore proposed to solve the problem of conflict resolution in collaborative design, and the process of fuzzy CBR is introduced. Based on the feature attributes and their relative weights determined by a fuzzy technique, a fuzzy CBR retrieval mechanism was developed to retrieve conflict resolution cases, which tends to enhance the functions of the database. By indexing, weighting and defuzzifying the cases, the case similarity can be obtained. The case consistency is then measured to keep the result correct. Finally, the fuzzy CBR method for conflict resolution is demonstrated by means of a case study. A web-based prototype system was developed to illustrate the methodology.

  15. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    Science.gov (United States)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

This paper presents a GIS-based methodology to estimate the damage produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated with the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment and complemented with an analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each exposed element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. The results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
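The damage-assessment step, associating a qualitative damage rating with each hazard/vulnerability combination, amounts to a lookup over the overlaid attributes. A minimal sketch in Python, with a hypothetical rating matrix (the paper derives its own ratings per volcanic phenomenon):

```python
# Qualitative damage rating from hazard intensity x vulnerability class,
# mimicking the per-element GIS overlay step. The matrix below is a
# hypothetical illustration, not the ratings used in the study.
DAMAGE_MATRIX = {
    ("low", "low"): "none",
    ("low", "high"): "light",
    ("moderate", "low"): "light",
    ("moderate", "high"): "moderate",
    ("high", "low"): "moderate",
    ("high", "high"): "severe",
}

def damage_rating(hazard, vulnerability):
    """Return the qualitative rating for one exposed element."""
    return DAMAGE_MATRIX[(hazard, vulnerability)]
```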

  16. Helicopter-borne observations of the continental background aerosol in combination with remote sensing and ground-based measurements

    Science.gov (United States)

    Düsing, Sebastian; Wehner, Birgit; Seifert, Patric; Ansmann, Albert; Baars, Holger; Ditas, Florian; Henning, Silvia; Ma, Nan; Poulain, Laurent; Siebert, Holger; Wiedensohler, Alfred; Macke, Andreas

    2018-01-01

    This paper examines the representativeness of ground-based in situ measurements for the planetary boundary layer (PBL) and conducts a closure study between airborne in situ and ground-based lidar measurements up to an altitude of 2300 m. The related measurements were carried out in a field campaign within the framework of the High-Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) Observational Prototype Experiment (HOPE) in September 2013 in a rural background area of central Europe.The helicopter-borne probe ACTOS (Airborne Cloud and Turbulence Observation System) provided measurements of the aerosol particle number size distribution (PNSD), the aerosol particle number concentration (PNC), the number concentration of cloud condensation nuclei (CCN-NC), and meteorological atmospheric parameters (e.g., temperature and relative humidity). These measurements were supported by the ground-based 3+2 wavelength polarization lidar system PollyXT, which provided profiles of the particle backscatter coefficient (σbsc) for three wavelengths (355, 532, and 1064 nm). Particle extinction coefficient (σext) profiles were obtained by using a fixed backscatter-to-extinction ratio (also lidar ratio, LR). A new approach was used to determine profiles of CCN-NC for continental aerosol. The results of this new approach were consistent with the airborne in situ measurements within the uncertainties.In terms of representativeness, the PNSD measurements on the ground showed a good agreement with the measurements provided with ACTOS for lower altitudes. The ground-based measurements of PNC and CCN-NC are representative of the PBL when the PBL is well mixed. Locally isolated new particle formation events on the ground or at the top of the PBL led to vertical variability in the cases presented here and ground-based measurements are not entirely representative of the PBL. Based on Mie theory (Mie, 1908), optical aerosol properties under ambient conditions for

  17. Helicopter-borne observations of the continental background aerosol in combination with remote sensing and ground-based measurements

    Directory of Open Access Journals (Sweden)

    S. Düsing

    2018-01-01

Full Text Available This paper examines the representativeness of ground-based in situ measurements for the planetary boundary layer (PBL) and conducts a closure study between airborne in situ and ground-based lidar measurements up to an altitude of 2300 m. The related measurements were carried out in a field campaign within the framework of the High-Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) Observational Prototype Experiment (HOPE) in September 2013 in a rural background area of central Europe. The helicopter-borne probe ACTOS (Airborne Cloud and Turbulence Observation System) provided measurements of the aerosol particle number size distribution (PNSD), the aerosol particle number concentration (PNC), the number concentration of cloud condensation nuclei (CCN-NC), and meteorological atmospheric parameters (e.g., temperature and relative humidity). These measurements were supported by the ground-based 3+2 wavelength polarization lidar system PollyXT, which provided profiles of the particle backscatter coefficient (σbsc) for three wavelengths (355, 532, and 1064 nm). Particle extinction coefficient (σext) profiles were obtained by using a fixed backscatter-to-extinction ratio (also lidar ratio, LR). A new approach was used to determine profiles of CCN-NC for continental aerosol. The results of this new approach were consistent with the airborne in situ measurements within the uncertainties. In terms of representativeness, the PNSD measurements on the ground showed a good agreement with the measurements provided with ACTOS for lower altitudes. The ground-based measurements of PNC and CCN-NC are representative of the PBL when the PBL is well mixed. Locally isolated new particle formation events on the ground or at the top of the PBL led to vertical variability in the cases presented here and ground-based measurements are not entirely representative of the PBL. Based on Mie theory (Mie, 1908), optical aerosol properties under ambient

  18. Ground-based observation of emission lines from the corona of a red-dwarf star.

    Science.gov (United States)

    Schmitt, J H; Wichmann, R

    2001-08-02

All 'solar-like' stars are surrounded by coronae, which contain magnetically confined plasma at temperatures above 10⁶ K. (Until now, only the Sun's corona could be observed in the optical, as a shimmering envelope during a total solar eclipse.) As the underlying stellar 'surfaces', the photospheres, are much cooler, some non-radiative process must be responsible for heating the coronae. The heating mechanism is generally thought to be magnetic in origin, but is not yet understood even for the case of the Sun. Ultraviolet emission lines first led to the discovery of the enormous temperature of the Sun's corona, but thermal emission from the coronae of other stars has hitherto been detectable only from space, at X-ray wavelengths. Here we report the detection of emission from highly ionized iron (Fe XIII at 3,388.1 Å) in the corona of the red-dwarf star CN Leonis, using a ground-based telescope. The X-ray flux inferred from our data is consistent with previously measured X-ray fluxes, and the non-thermal line width of 18.4 km s⁻¹ indicates great similarities between solar and stellar coronal heating mechanisms. The accessibility and spectral resolution (45,000) of the ground-based instrument are much better than those of X-ray satellites, so a new window on the study of stellar coronae has been opened.

  19. Augmenting WFIRST Microlensing with a Ground-Based Telescope Network

    Science.gov (United States)

    Zhu, Wei; Gould, Andrew

    2016-06-01

    Augmenting the Wide Field Infrared Survey Telescope (WFIRST) microlensing campaigns with intensive observations from a ground-based network of wide-field survey telescopes would have several major advantages. First, it would enable full two-dimensional (2-D) vector microlens parallax measurements for a substantial fraction of low-mass lenses as well as planetary and binary events that show caustic crossing features. For a significant fraction of the free-floating planet (FFP) events and all caustic-crossing planetary/binary events, these 2-D parallax measurements directly lead to complete solutions (mass, distance, transverse velocity) of the lens object (or lens system). For even more events, the complementary ground-based observations will yield 1-D parallax measurements. Together with the 1-D parallaxes from WFIRST alone, they can probe the entire mass range M > M_Earth. For luminous lenses, such 1-D parallax measurements can be promoted to complete solutions (mass, distance, transverse velocity) by high-resolution imaging. This would provide crucial information not only about the hosts of planets and other lenses, but also enable a much more precise Galactic model. Other benefits of such a survey include improved understanding of binaries (particularly with low mass primaries), and sensitivity to distant ice-giant and gas-giant companions of WFIRST lenses that cannot be detected by WFIRST itself due to its restricted observing windows. Existing ground-based microlensing surveys can be employed if WFIRST is pointed at lower-extinction fields than is currently envisaged. This would come at some cost to the event rate. Therefore the benefits of improved characterization of lenses must be weighed against these costs.

  20. Linking Symbolic Interactionism and Grounded Theory Methods in a Research Design

    OpenAIRE

    Jennifer Chamberlain-Salaun; Jane Mills; Kim Usher

    2013-01-01

    This article focuses on Corbin and Strauss’ evolved version of grounded theory. In the third edition of their seminal text, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, the authors present 16 assumptions that underpin their conception of grounded theory methodology. The assumptions stem from a symbolic interactionism perspective of social life, including the themes of meanin...

  1. Automatic Barometric Updates from Ground-Based Navigational Aids

    Science.gov (United States)

    1990-03-12

Automatic Barometric Updates from Ground-Based Navigational Aids. US Department of Transportation, Federal Aviation Administration, Office of Safety... tighter vertical spacing controls, particularly for operations near Terminal Control Areas (TCAs), Airport Radar Service Areas (ARSAs), military climb and... E.F., Ruth, J.C., and Williges, B.H. (1987). Speech Controls and Displays. In Salvendy, G. (Ed.), Handbook of Human Factors/Ergonomics, New York: John...

  2. Technology-Enhanced Problem-Based Learning Methodology in Geographically Dispersed Learners of Tshwane University of Technology

    Directory of Open Access Journals (Sweden)

    Sibitse M. Tlhapane

    2010-03-01

Full Text Available Improving teaching and learning methodologies is not just a wish but something most educational institutions globally strive for. To attain this, the Adelaide Tambo School of Nursing Science implemented a Technology-Enhanced Problem-Based Learning methodology in the programme B Tech Occupational Nursing in 2006. This is a two-year post-basic nursing programme. The students are geographically dispersed and the curriculum design is typically student-centred, outcomes-based education. The research question posed by this paper is: How does technology-enhanced problem-based learning enhance student-centred learning, thinking skills, social skills and social space for learners? To answer this question, a case study with both qualitative and quantitative data was used. The participants consisted of all students registered for the subject Occupational Health level 4. The sample group was chosen from willing participants from the Pretoria, eMalahleni and Polokwane learning sites, using the snowball method. This method was seen as appropriate due to the timing of the study. Data were collected using a questionnaire with both open and closed-ended questions. An analysis of the students' end-of-year examination was also done, including a comparison of the performance of students on technology-enhanced problem-based learning with that of students on problem-based learning only. The findings revealed that with Technology-Enhanced Problem-Based Learning (PBL), students' critical thinking, problem solving, and social skills improved and social space was enhanced. This was supported by improved grades for students on Technology-Enhanced PBL compared with those on PBL only.

  3. State of the art in HGPT (Heuristically Based Generalized Perturbation) methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1993-01-01

A distinctive feature of the heuristically based generalized perturbation theory (HGPT) methodology is its systematic use of importance conservation concepts. As is well known, this use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived. The state of the art of the HGPT methodology is illustrated here, and its application to a number of specific nonlinear fields of interest is commented on. (author)

  4. Ergonomic problems regarding the interactive touch input via screens in onboard and ground-based flight control

    Science.gov (United States)

    Holzhausen, K. P.; Gaertner, K. P.

    1985-01-01

    A significant problem concerning the integration of display and switching functions is related to the fact that numerous informative data which have to be processed by man must be read from only a few display devices. A satisfactory ergonomic design of integrated display devices and keyboards is in many cases difficult, because not all functions which can be displayed and selected are simultaneously available. A technical solution which provides an integration of display and functional elements on the basis of the highest flexibility is obtained by using a cathode ray tube with a touch-sensitive screen. The employment of an integrated data input/output system is demonstrated for the cases of onboard and ground-based flight control. Ergonomic studies conducted to investigate the suitability of an employment of touch-sensitive screens are also discussed.

  5. State of the Art in Input Ground Motions for Seismic Fragility and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Han; Choi, In Kil; Kim, Min Kyu [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

The purpose of a Seismic Probabilistic Safety Analysis (SPSA) is to determine the probability distribution of core damage due to the potential effects of earthquakes. The SPSA is performed in four steps: a seismic hazard analysis, a component fragility evaluation, a plant system and accident sequence analysis, and a consequence analysis. Ground motions have very different spectral shapes, and the structural response and the seismic load applied to equipment are greatly influenced by the spectral shape of the input ground motion. Therefore the input ground motion needs to be determined under the same assumptions in the risk calculation. Several techniques for the determination of input ground motions have been developed and are reviewed in this study. In this research, the methodologies for determining input ground motions for seismic risk assessment are reviewed and discussed. They have been developed to reduce the uncertainty in fragility curves and to remove the conservatism in risk values.

  6. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs

    Directory of Open Access Journals (Sweden)

    Jing Tang

    2018-02-01

Full Text Available This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous methods for gait phase detection demonstrate no adaptability to different people and different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (i.e., high-threshold, middle-threshold and low-threshold) were used to search out the maximum and minimum GCFs for the self-adjustment of the thresholds. The high-threshold was the main threshold used to divide the GCFs into on-ground and off-ground statuses. Then, the gait phases were obtained through the gait phase detection algorithm (GPDA), which provides the rules that determine the calculations for the STTTA. Finally, the reliability of the STTTA was determined by comparing results between the STTTA and the Mariani method, referenced as the timing analysis module (TAM), and the Lopez-Meyer method. Experimental results show that the proposed method can be used to detect gait phases in real time and obtains high reliability compared with previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different speeds.
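The on-ground/off-ground split that the thresholds perform can be sketched as below. This is a simplified single-threshold stand-in for the paper's triple-threshold scheme, and the tuning ratio is an assumed constant; its only purpose is to show how a threshold can self-tune to a wearer's observed force range.

```python
def detect_on_ground(gcf_samples, ratio=0.3):
    """Label each ground-contact-force sample as on-ground (True) or
    off-ground (False) using a threshold re-tuned from the observed
    force range. Single adaptive threshold only: a simplified stand-in
    for the paper's high/middle/low scheme; ratio=0.3 is an assumption."""
    lo, hi = min(gcf_samples), max(gcf_samples)
    # Threshold self-tunes to this wearer's force range
    threshold = lo + ratio * (hi - lo)
    return [f > threshold for f in gcf_samples]
```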

  7. Response of base isolated structure during strong ground motions beyond design earthquakes

    International Nuclear Information System (INIS)

    Yabana, Shuichi; Ishida, Katsuhiko; Shiojiri, Hiroo

    1991-01-01

In Japan, designs of base-isolated structures for fast breeder reactors (FBRs) are being attempted. When a base-isolated structure is designed, the relative displacement of the isolators is generally limited so that they remain in a linear state during design earthquakes. But to estimate the safety margin of a base-isolated structure, its response up to failure must be obtained experimentally or analytically for strong ground motions beyond the design earthquake. The aim of this paper is to investigate the response of a base-isolated structure when the stiffness of the isolators hardens, and to simulate the response during strong ground motions beyond design earthquakes. The optimum characteristics of isolators, with which the margin of the structure is increased, are discussed. (author)

  8. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
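For readers unfamiliar with the PSO building block, a plain global-best PSO is sketched below. This is the textbook baseline only, not the consensus-based PSO-assisted Trust-Tech method of the paper, and the inertia and acceleration coefficients are common default assumptions.

```python
import random

def pso(f, dim, n=20, iters=200, seed=0):
    """Minimize f over [-5, 5]^dim with plain global-best PSO.
    Returns (best position, best value). A minimal baseline sketch."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm (global) best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia 0.7 and acceleration 1.4/1.4 are common defaults
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.4 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the paper's three-stage scheme, a consensus mechanism and Trust-Tech refinement are layered on top of this basic update rule.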

  9. Uncovering highly obfuscated plagiarism cases using fuzzy semantic-based similarity model

    Directory of Open Access Journals (Sweden)

    Salha M. Alzahrani

    2015-07-01

Full Text Available Highly obfuscated plagiarism cases contain unseen and obfuscated texts, which pose difficulties when using existing plagiarism detection methods. A fuzzy semantic-based similarity model for uncovering obfuscated plagiarism is presented and compared with five state-of-the-art baselines. Semantic relatedness between words is studied based on the part-of-speech (POS) tags and WordNet-based similarity measures. Fuzzy-based rules are introduced to assess the semantic distance between source and suspicious texts of short lengths, which implement the semantic relatedness between words as a membership function to a fuzzy set. In order to minimize the number of false positives and false negatives, a learning method that combines a permission threshold and a variation threshold is used to decide true plagiarism cases. The proposed model and the baselines are evaluated on 99,033 ground-truth annotated cases extracted from different datasets, including 11,621 (11.7%) handmade paraphrases, 54,815 (55.4%) artificial plagiarism cases, and 32,578 (32.9%) plagiarism-free cases. We conduct extensive experimental verifications, including the study of the effects of different segmentation schemes and parameter settings. Results are assessed using precision, recall, F-measure and granularity on stratified 10-fold cross-validation data. The statistical analysis using paired t-tests shows that the proposed approach is statistically significant in comparison with the baselines, which demonstrates the competence of the fuzzy semantic-based model to detect plagiarism cases beyond literal plagiarism. Additionally, the analysis of variance (ANOVA) statistical test shows the effectiveness of different segmentation schemes used with the proposed approach.
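The core idea, treating word relatedness as a degree of membership in a fuzzy set and thresholding the aggregated degree, can be sketched as follows. The toy word-similarity function and the permission threshold value here are assumptions standing in for the paper's WordNet-based measures and learned thresholds.

```python
def word_sim(w1, w2):
    """Toy word relatedness in [0, 1]; a stand-in for the paper's
    WordNet-based similarity measures."""
    return 1.0 if w1 == w2 else 0.0

def fuzzy_similarity(source, suspicious):
    """Degree of membership of the suspicious text in the fuzzy set of
    texts 'similar to' the source: each suspicious word takes its best
    match in the source, and the degrees are averaged."""
    source_words = source.split()
    degrees = [max(word_sim(w, s) for s in source_words)
               for w in suspicious.split()]
    return sum(degrees) / len(degrees)

def is_plagiarism(source, suspicious, permission=0.65):
    # The permission threshold value is an assumed illustration.
    return fuzzy_similarity(source, suspicious) >= permission
```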

  10. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): The generalised linear spatial model...... priors for Bayesian inference is discussed. Procedures for parameter estimation and prediction are studied. Theoretical properties of Markov chain Monte Carlo algorithms are investigated, and different algorithms are compared. In addition, the thesis contains a manual for an R-package, geoRglmm, which...

  11. Toward methodological emancipation in applied health research.

    Science.gov (United States)

    Thorne, Sally

    2011-04-01

    In this article, I trace the historical groundings of what have become methodological conventions in the use of qualitative approaches to answer questions arising from the applied health disciplines and advocate an alternative logic more strategically grounded in the epistemological orientations of the professional health disciplines. I argue for an increasing emphasis on the modification of conventional qualitative approaches to the particular knowledge demands of the applied practice domain, challenging the merits of what may have become unwarranted attachment to theorizing. Reorienting our methodological toolkits toward the questions arising within an evidence-dominated policy agenda, I encourage my applied health disciplinary colleagues to make themselves useful to that larger project by illuminating that which quantitative research renders invisible, problematizing the assumptions on which it generates conclusions, and filling in the gaps in knowledge needed to make decisions on behalf of people and populations.

  12. Case-based reasoning

    CERN Document Server

    Kolodner, Janet

    1993-01-01

    Case-based reasoning is one of the fastest growing areas in the field of knowledge-based systems and this book, authored by a leader in the field, is the first comprehensive text on the subject. Case-based reasoning systems are systems that store information about situations in their memory. As new problems arise, similar situations are searched out to help solve these problems. Problems are understood and inferences are made by finding the closest cases in memory, comparing and contrasting the problem with those cases, making inferences based on those comparisons, and asking questions whe

  13. Spectral Analysis of the Background in Ground-based, Long-slit ...

    Indian Academy of Sciences (India)

    1996-12-08

Dec 8, 1996 ... Spectral Analysis of the Background in Ground-based, Long-slit ... Figure 1 plots spectra from the 2-D array, after instrumental calibration and before correction for ... which would merit attention and a better understanding.

  14. Chasing Small Exoplanets with Ground-Based Near-Infrared Transit Photometry

    Science.gov (United States)

    Colon, K. D.; Barentsen, G.; Vinicius, Z.; Vanderburg, A.; Coughlin, J.; Thompson, S.; Mullally, F.; Barclay, T.; Quintana, E.

    2017-11-01

    I will present results from a ground-based survey to measure the infrared radius and other properties of small K2 exoplanets and candidates. The survey is preparation for upcoming discoveries from TESS and characterization with JWST.

  15. Methodological variation in economic evaluations conducted in low- and middle-income countries: information for reference case development.

    Science.gov (United States)

    Santatiwongchai, Benjarin; Chantarastapornchit, Varit; Wilkinson, Thomas; Thiboonboon, Kittiphong; Rattanavipapong, Waranya; Walker, Damian G; Chalkidou, Kalipso; Teerawattananon, Yot

    2015-01-01

Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle-income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in all three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and that wish to improve the ability of economic evaluations to inform resource allocation decisions.

  16. Applying Case-Based Method in Designing Self-Directed Online Instruction: A Formative Research Study

    Science.gov (United States)

    Luo, Heng; Koszalka, Tiffany A.; Arnone, Marilyn P.; Choi, Ikseon

    2018-01-01

    This study investigated the case-based method (CBM) instructional-design theory and its application in designing self-directed online instruction. The purpose of this study was to validate and refine the theory for a self-directed online instruction context. Guided by formative research methodology, this study first developed an online tutorial…

  17. Macrophysical and optical properties of midlatitude cirrus clouds from four ground-based lidars and collocated CALIOP observations

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, Jean-Charles; Haeffelin, M.; Morille, Y.; Noel, V.; Keckhut, P.; Winker, D.; Comstock, Jennifer M.; Chervet, P.; Roblin, A.

    2010-05-27

Ground-based lidar and CALIOP datasets gathered over four mid-latitude sites, two US and two French sites, are used to evaluate the consistency of cloud macrophysical and optical property climatologies that can be derived from such datasets. The consistency in average cloud height (both base and top height) between the CALIOP and ground datasets ranges from -0.4 km to +0.5 km. The cloud geometrical thickness distributions vary significantly between the different datasets, due in part to the original vertical resolutions of the lidar profiles. Average cloud geometrical thicknesses vary from 1.2 to 1.9 km, i.e. by more than 50%. Cloud optical thickness distributions in the subvisible, semi-transparent, and moderate intervals differ by more than 50% between the ground- and space-based datasets. Cirrus clouds with optical thickness below 0.1 (not included in historical cloud climatologies) represent 30-50% of the non-opaque cirrus class. The differences in average cloud base altitude between the ground and CALIOP datasets of 0.0-0.1 km, 0.0-0.2 km, and 0.0-0.2 km can be attributed to irregular sampling of seasonal variations in the ground-based data, to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in the ground-based data, respectively. The cloud geometrical thicknesses are not affected by irregular sampling of seasonal variations in the ground-based data, while up to 0.0-0.2 km and 0.1-0.3 km differences can be attributed to day-night differences in detection capabilities by CALIOP and to the restriction to situations without low-level clouds in the ground-based data, respectively.

  18. Mechanisms of time-based figure-ground segregation.

    Science.gov (United States)

    Kandil, Farid I; Fahle, Manfred

    2003-11-01

    Figure-ground segregation can rely on purely temporal information, that is, on short temporal delays between positional changes of elements in figure and ground (Kandil, F.I. & Fahle, M. (2001) Eur. J. Neurosci., 13, 2004-2008). Here, we investigate the underlying mechanisms by measuring temporal segregation thresholds for various kinds of motion cues. Segregation can rely on monocular first-order motion (based on luminance modulation) and second-order motion cues (contrast modulation) with a high temporal resolution of approximately 20 ms. The mechanism can also use isoluminant motion with a reduced temporal resolution of 60 ms. Figure-ground segregation can be achieved even at presentation frequencies too high for human subjects to inspect successive frames individually. In contrast, when stimuli are presented dichoptically, i.e. separately to both eyes, subjects are unable to perceive any segregation, irrespective of temporal frequency. We propose that segregation in these displays is detected by a mechanism consisting of at least two stages. On the first level, standard motion or flicker detectors signal local positional changes (flips). On the second level, a segregation mechanism combines the local activities of the low-level detectors with high temporal precision. Our findings suggest that the segregation mechanism can rely on monocular detectors but not on binocular mechanisms. Moreover, the results oppose the idea that segregation in these displays is achieved by motion detectors of a higher order (motion-from-motion), but favour mechanisms sensitive to short temporal delays even without activation of higher-order motion detectors.

  19. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

System in Package (SiP) plays an important role in portable, aerospace, and military electronics owing to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and fault localization as system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary-scan technology. Combining the characteristics of SiP systems with the boundary-scan theory of PCB circuits and embedded core testing, a specific testing methodology and process is proposed. The hardware requirements of the SiP system under test are specified, and the hardware platform for the testing has been constructed. The methodology offers high test efficiency and accurate fault localization.
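The boundary-scan principle the paper builds on can be illustrated with a toy shift-register model (a hypothetical sketch, not the paper's implementation; real boundary scan follows IEEE 1149.1 with a TAP controller and instruction register):

```python
# Hypothetical sketch of a boundary-scan shift: test bits are shifted
# serially through the chain of scan cells while the previously captured
# pin values are shifted out for comparison with the expected response.
def shift_and_capture(chain_state, pattern):
    """Shift `pattern` into the scan chain, returning the bits shifted out."""
    shifted_out = []
    for bit in pattern:
        shifted_out.append(chain_state[-1])     # last cell drives TDO
        chain_state = [bit] + chain_state[:-1]  # new bit enters at TDI
    return chain_state, shifted_out

# A 4-cell chain initially holding the captured pin values 1, 0, 1, 1.
state, out = shift_and_capture([1, 0, 1, 1], pattern=[0, 0, 0, 0])
# The captured values are read out serially, last cell first.
```

Comparing the shifted-out bits against the expected chain contents is what lets a boundary-scan tester localize a faulty cell or interconnect.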

  20. Structural health monitoring methodology for aircraft condition-based maintenance

    Science.gov (United States)

    Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre

    2001-06-01

Reducing maintenance costs while keeping a constant level of safety is a major issue for Air Forces and airlines. The long-term perspective is to implement condition-based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize damage and to assess its severity, with enough accuracy to allow low-cost corrective actions. The present paper describes an SHMS based on acoustic emission technology, a choice driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set on which the system relies. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of error in classical localization methods. Moreover, the system adapts to different structures because it relies not on any particular model but on measured data. The acquired data are processed, and each event's location and corrected amplitude are computed. The methodology has been demonstrated, and experimental tests on elementary samples showed an accuracy of 1 cm.
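The learning methodology described above, matching a measured sensor response against a knowledge set of calibrated artificial events, can be sketched as a nearest-fingerprint search (all positions and arrival times below are invented for illustration):

```python
# Sketch of the described learning approach: artificial calibration events
# give each structure location an arrival-time "fingerprint" across the
# sensor network; a real event is located by the closest fingerprint.
def locate_event(calibration_set, measured_arrivals):
    def mismatch(fingerprint):
        return sum((a - b) ** 2
                   for a, b in zip(fingerprint, measured_arrivals))
    location, _ = min(calibration_set.items(),
                      key=lambda item: mismatch(item[1]))
    return location

# Hypothetical arrival times (ms) at three sensors for two calibrated
# source positions (coordinates in meters on the panel).
calibration_set = {
    (0.10, 0.20): [0.0, 1.4, 2.1],
    (0.50, 0.80): [1.8, 0.3, 1.1],
}
where = locate_event(calibration_set, measured_arrivals=[0.1, 1.5, 2.0])
```

Because the fingerprints are measured rather than modeled, this scheme absorbs the anisotropy of composite structures automatically, which is the point the abstract makes against classical localization methods.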

  1. Organizing the Methodology Work at Higher School

    Directory of Open Access Journals (Sweden)

    O. A. Plaksina

    2012-01-01

Full Text Available The paper considers the components of organizing methodology work at a higher school. The authors' review and analysis of existing methodology systems reveal that their advantages and disadvantages are related to the type of the system-creating element of the methodology system's organizational structure. The optimal scheme of such a system has been developed in the context of Vocational School Reorganization, implying the specification and expansion of the set of basic design principles of any control system. Following the suggested organizational approach provides the grounds for teachers' self-development and professional growth. The methodology of the approach allows using the given structure in any higher educational institution, supporting the system's transition from simple functioning to a sustainable development mode.

  2. Nighttime Aerosol Optical Depth Measurements Using a Ground-based Lunar Photometer

    Science.gov (United States)

    Berkoff, Tim; Omar, Ali; Haggard, Charles; Pippin, Margaret; Tasaddaq, Aasam; Stone, Tom; Rodriguez, Jon; Slutsker, Ilya; Eck, Tom; Holben, Brent; hide

    2015-01-01

In recent years it was proposed to combine AERONET network photometer capabilities with a high-precision lunar model used for satellite calibration to retrieve columnar nighttime AODs. The USGS lunar model can continuously provide pre-atmosphere, high-precision lunar irradiance determinations for multiple wavelengths at ground sensor locations. When combined with measured irradiances from a ground-based AERONET photometer, atmospheric column transmissions can be determined, yielding nighttime column aerosol AOD and Angstrom coefficients. Additional demonstrations have utilized this approach to further develop calibration methods and to obtain data in polar regions where extended periods of darkness occur. This new capability enables more complete studies of the diurnal behavior of aerosols, and feedback for models and satellite retrievals on the nighttime behavior of aerosols. It is anticipated that the nighttime capability of these sensors will be useful for comparisons with satellite lidars such as CALIOP and CATS, in addition to ground-based lidars in MPLNET, at night, when the signal-to-noise ratio is higher than during daytime and more precise AOD comparisons can be made.
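The retrieval principle is Beer-Lambert extinction: comparing the lunar irradiance measured at the ground with the model's pre-atmosphere value gives the total optical depth, from which non-aerosol contributions are subtracted. A hedged numerical sketch (all input values are illustrative, and gas absorption is ignored for brevity):

```python
import math

# Sketch of the retrieval principle only: the lunar model supplies the
# pre-atmosphere irradiance, the ground photometer measures the attenuated
# irradiance, and Beer-Lambert gives the total optical depth along the
# slant path; subtracting the Rayleigh term leaves the aerosol part.
def aerosol_optical_depth(i_measured, i_top, airmass, tau_rayleigh):
    tau_total = math.log(i_top / i_measured) / airmass
    return tau_total - tau_rayleigh

# Illustrative numbers: 18% attenuation at an airmass of 1.5.
aod = aerosol_optical_depth(i_measured=0.82, i_top=1.00,
                            airmass=1.5, tau_rayleigh=0.09)
```

In practice the lunar irradiance varies strongly with phase, which is exactly why a high-precision lunar model is the enabling ingredient here.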

  3. Space debris removal using a high-power ground-based laser

    Energy Technology Data Exchange (ETDEWEB)

    Monroe, D.K.

    1993-12-31

The feasibility and practicality of using a ground-based laser (GBL) to remove artificial space debris is examined. Physical constraints indicate that a reactor-pumped laser (RPL) may be best suited for this mission because of its capabilities for multimegawatt output, long run times, and near-diffraction-limited initial beams. Simulations of a laser-powered debris removal system indicate that a 5-MW RPL with a 10-meter-diameter beam director and adaptive optics capabilities can deorbit 1-kg debris from space station altitudes. Larger debris can be deorbited or transferred to safer orbits after multiple laser engagements. A ground-based laser system may be the only realistic way to access and remove some 10,000 separate objects, having velocities in the neighborhood of 7 km/sec and being spatially distributed over some 10^10 km^3 of space.

  4. A revised ground-motion and intensity interpolation scheme for shakemap

    Science.gov (United States)

    Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.

    2010-01-01

We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities, and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
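The uncertainty-weighted combination at the core of this approach can be sketched as inverse-variance weighting (a simplification of the actual ShakeMap interpolation; the values below are illustrative):

```python
# Sketch of an uncertainty-weighted combination: each input (a direct
# observation, a converted value, or a model estimate) carries a variance;
# the combined estimate is the inverse-variance weighted mean, and the
# combined variance falls out as a natural by-product.
def weighted_estimate(values, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / total
    return mean, 1.0 / total   # combined estimate and its variance

# Illustrative inputs: a direct PGA observation (low variance) and a
# ground-motion prediction equation estimate (high variance).
pga, var = weighted_estimate(values=[0.30, 0.40], variances=[0.01, 0.04])
```

The low-variance observation dominates the result, and the combined variance is smaller than either input's, which is why the map's total-uncertainty surface tightens near stations.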

  5. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    Science.gov (United States)

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  6. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

Excavation stability in an underground nuclear waste repository is required during the construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety and to prevent the development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support and reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented, based on site conditions together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ, thermal, and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs

  7. Methodology for cloud-based design of robots

    Science.gov (United States)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

This paper presents some important results from the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM (Product Lifecycle Management) software and structured as a complex mechatronic product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the workspace of a milling machine for student research.

  8. Building theories from case study research: the progressive case study

    NARCIS (Netherlands)

    Steenhuis, H.J.; de Bruijn, E.J.

    2006-01-01

Meredith (1998) argues for more case and field research studies in the field of operations management. Based on a literature review, we discuss several existing approaches to case studies and their characteristics. These approaches include: the Grounded Theory approach, which proposes no prior

  9. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

Full Text Available Single-linear hyperspectral pushbroom imaging from a high-altitude airship (HAA) without a three-axis stabilized platform is far more complex than spaceborne or airborne imaging. Owing to the effects of air pressure, temperature, and airflow, large pitch and roll angles appear frequently, producing pushbroom images with severe geometric distortions. Thus, the in-flight calibration procedure is not appropriate for single-linear pushbroom sensors on an HAA without a three-axis stabilized platform. To address this problem, a new ground-based boresight calibration method is proposed. First, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and the linear error equation is derived from it using the Taylor expansion formula. Second, the boresight misalignments are worked out using an iterative least-squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) validation is done by applying the method to the orthorectification of real hyperspectral pushbroom images from an HAA Earth observation payload system developed by our research team, "LanTianHao". The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to a minimum.
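The iterative least-squares step can be illustrated on a toy one-parameter problem (purely a sketch: the actual calibration solves for three boresight angles in the full DG model, whereas here a single angle is fitted to hypothetical ground-control-point data):

```python
import math

# Toy Gauss-Newton fit of a single "misalignment" angle theta, chosen so
# that predicted GCP offsets y = tan(theta) * x match the surveyed ones.
def fit_angle(points, theta=0.0, iterations=20):
    for _ in range(iterations):
        # Residuals r_i = y_i - tan(theta) * x_i and their derivatives.
        residuals = [y - math.tan(theta) * x for x, y in points]
        jacobian = [-x / math.cos(theta) ** 2 for x, _ in points]
        step = (sum(j * r for j, r in zip(jacobian, residuals))
                / sum(j * j for j in jacobian))
        theta -= step                      # Gauss-Newton update
    return theta

# Synthetic GCPs generated with a hypothetical true misalignment of 0.1 rad.
points = [(x, math.tan(0.1) * x) for x in (1.0, 2.0, 3.0)]
angle = fit_angle(points)
```

Linearizing via a Taylor expansion and iterating the least-squares update is the same pattern the paper applies, just in three angular unknowns instead of one.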

  10. Investigation of the structure and lithology of bedrock concealed by basin fill, using ground-based magnetic-field-profile data acquired in the San Rafael Basin, southeastern Arizona

    Science.gov (United States)

    Bultman, Mark W.

    2013-01-01

Data on the Earth’s total-intensity magnetic field acquired near ground level and at measurement intervals as small as 1 m include information on the spatial distribution of near-surface magnetic dipoles that in many cases are unique to a specific lithology. Such spatial information is expressed in the texture (physical appearance or characteristics) of the data at scales of hundreds of meters to kilometers. These magnetic textures are characterized by several descriptive statistics, their power spectrum, and their multifractal spectrum. On the basis of a graphical comparison and textural characterization, ground-based magnetic-field profile data can be used to estimate bedrock lithology concealed by as much as 100 m of basin fill in some cases, information that is especially important in assessing and exploring for concealed mineral deposits. I demonstrate that multifractal spectra of ground-based magnetic-field-profile data can be used to differentiate exposed lithologies and that the shape and position of the multifractal spectrum of the ground-based magnetic-field profile of concealed lithologies can be matched to the upward-continued multifractal spectrum of an exposed lithology to help distinguish the concealed lithology. In addition, ground-based magnetic-field-profile data also detect minute differences in the magnetic susceptibility of rocks over small horizontal and vertical distances and so can be used for precise modeling of bedrock geometry and structure, even when that bedrock is concealed by 100 m or more of nonmagnetic basin fill. Such data contain valuable geologic information on the bedrock concealed by basin fill that may not be as visible in aeromagnetic data, including areas of hydrothermal alteration, faults, and other bedrock structures. Interpretation of these data in the San Rafael Basin, southeastern Arizona, has yielded results for estimating concealed lithologies, concealed structural geology, and a concealed potential mineral
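As a rough illustration of the spectral characterization step, the power spectrum of a magnetic-field profile can be computed directly (a plain DFT on synthetic data; real profiles would be detrended and windowed first, and long surveys would use an FFT):

```python
import cmath
import math

# Illustrative texture statistic: the power spectrum of a sampled profile,
# computed with a plain discrete Fourier transform.
def power_spectrum(profile):
    n = len(profile)
    spectrum = []
    for k in range(n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(profile))
        spectrum.append(abs(coeff) ** 2 / n)
    return spectrum

# A synthetic profile dominated by one spatial wavelength (3 cycles over
# 32 samples) concentrates its power in a single spectral bin.
profile = [math.sin(2 * math.pi * 3 * i / 32) for i in range(32)]
power = power_spectrum(profile)
```

A lithology with magnetic sources at characteristic spacings shows up as structure in this spectrum, which is the property the textural comparison exploits.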

  11. Grounded theory in music therapy research.

    Science.gov (United States)

    O'Callaghan, Clare

    2012-01-01

Grounded theory is one of the most common methodologies used in constructivist (qualitative) music therapy research. Researchers use the term "grounded theory" when denoting varying research designs and theoretical outcomes. This may be challenging for novice researchers when considering whether grounded theory is appropriate for their research phenomena. This paper examines grounded theory within music therapy research. Grounded theory is briefly described, including some of its "contested" ideas. A literature search was conducted using the descriptor "music therapy and grounded theory" in the PubMed, CINAHL, PsycINFO, SCOPUS, ERIC (CSA), and Web of Science databases, and a music therapy monograph series. A descriptive analysis was performed on the uncovered studies to examine the researched phenomena, the grounded theory methods used, and how findings were presented. Thirty music therapy research projects were found in refereed journals and monographs from 1993 to "in press." The Strauss and Corbin approach to grounded theory dominates the field. Descriptors used to signify grounded theory components in the studies varied greatly. Researchers have used partial or complete grounded theory methods to examine clients', family members', staff, music therapy "overhearers," music therapists', and students' experiences, as well as music therapy creative products and professional views, issues, and literature. Seven grounded theories were offered. It is suggested that grounded theory researchers clarify what and who inspired their design, why partial grounded theory methods were used (when relevant), and their ontology. By elucidating the assumptions underpinning the data collection, analysis, and findings' contribution, researchers will continue to improve music therapy research using grounded theory methods.

  12. Application of Binomial Model and Market Asset Declaimer Methodology for Valuation of Abandon and Expand Options. The Case Study

    Directory of Open Access Journals (Sweden)

    Paweł Mielcarz

    2007-06-01

Full Text Available The article presents a case study of the valuation of real options included in an investment project. The main goal of the article is to present the calculation and methodological issues of applying the methodology for real option valuation. To this end, the binomial model and the Market Asset Declaimer methodology are used. The project presented in the article concerns the introduction of a radio station to a new market. It includes two valuable real options: an option to abandon the project and an option to expand.
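The binomial valuation of an abandonment option can be sketched in a single risk-neutral step (a deliberately minimal, hypothetical example; the article's case uses a multi-step tree and the Market Asset Declaimer assumptions about the twin asset):

```python
# Minimal one-step binomial sketch of valuing an option to abandon: if the
# project value falls, the firm abandons and takes the salvage value instead.
# All figures (project value, salvage, up/down moves, rate) are hypothetical.
def abandon_option_value(v0, up, down, salvage, rate):
    """Risk-neutral value of the project *with* the abandonment option."""
    p = (1 + rate - down) / (up - down)      # risk-neutral probability
    v_up = v0 * up                           # project value in the up state
    v_down = max(v0 * down, salvage)         # abandon if salvage is higher
    return (p * v_up + (1 - p) * v_down) / (1 + rate)

value = abandon_option_value(v0=100.0, up=1.3, down=0.8,
                             salvage=90.0, rate=0.05)
```

Without the option the same tree discounts back to the initial value of 100, so the difference between the two figures is the value the flexibility adds.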

  13. A methodology for landfill location using geographic information systems: a Colombian regional case

    OpenAIRE

    Carlos Alfonso Zafra Mejía; Franklin Andrés Mendoza Castañeda; Paula Alejandra Montoya Varela

    2012-01-01

    The regions’ economic growth and accelerated development have created high solid waste production rates; such waste is disposed of in many localities in places without any technical and/or environmental measures having been taken. This paper presents guidelines for locating landfills by combining geographic information systems (GIS) with analytic hierarchy process (AHP) and simple additive weighting (SAW). The methodology so developed was applied to the regional case of Tame in the Arauca dep...
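The SAW step of such a methodology reduces to a weighted sum over normalized criteria (a toy sketch; the criteria, weights, and scores below are hypothetical, and the AHP pairwise-comparison stage that would produce the weights is omitted):

```python
# Sketch of simple additive weighting (SAW): candidate landfill sites are
# scored on normalized criteria using weights (here assumed to come from an
# AHP step); the highest weighted sum wins.
def saw_rank(candidates, weights):
    def score(criteria):
        return sum(w * c for w, c in zip(weights, criteria))
    return max(candidates, key=lambda name: score(candidates[name]))

candidates = {                      # normalized 0-1 scores per criterion:
    "site_A": (0.9, 0.4, 0.7),      # (distance to water, slope, road access)
    "site_B": (0.6, 0.8, 0.5),
}
best_site = saw_rank(candidates, weights=(0.5, 0.3, 0.2))
```

In the GIS setting the same weighted sum is evaluated per raster cell rather than per named candidate, producing a suitability surface for the whole region.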

  14. A methodology for landfill location using geographic information systems: a Colombian regional case

    OpenAIRE

    Zafra Mejía, Carlos Alfonso; Mendoza Castañeda, Franklin Andrés; Montoya Varela, Paula Alejandra

    2012-01-01

The regions' economic growth and accelerated development have created high solid waste production rates; such waste is disposed of in many localities in places without any technical and/or environmental measures having been taken. This paper presents guidelines for locating landfills by combining geographic information systems (GIS) with analytic hierarchy process (AHP) and simple additive weighting (SAW). The methodology so developed was applied to the regional case of Tame in the Arauca de...

  15. Risk-based prioritization of ground water threatening point sources at catchment and regional scales

    DEFF Research Database (Denmark)

    Overheu, Niels Døssing; Tuxen, Nina; Flyvbjerg, John

    2014-01-01

framework has been developed to enable a systematic and transparent risk assessment and prioritization of contaminant point sources, considering the local, catchment, or regional scales (Danish EPA, 2011, 2012). The framework has been tested in several catchments in Denmark with different challenges … and needs, and two of these are presented. Based on the lessons learned, the Danish EPA has prepared a handbook to guide the user through the steps in a risk-based prioritization (Danish EPA, 2012). It provides guidance on prioritization both in an administratively defined area such as a Danish Region … of the results are presented using the case studies as examples. The methodology was developed by a broad industry group including the Danish EPA, the Danish Regions, the Danish Nature Agency, the Technical University of Denmark, and consultants, and the framework has been widely accepted by the professional

  16. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will remain unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded; it can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  17. Proposal of an Embedded Methodology that uses Organizational Diagnosis and Reengineering: Case of bamboo panel company

    Directory of Open Access Journals (Sweden)

    Eva Selene Hernández Gress

    2017-08-01

Full Text Available This work is an extension of the Proceedings of the International Conference on Industrial Engineering, Management Science and Applications, which presented some of the phases of Reengineering applied to a bamboo panel company; the results were Strategic Planning, a Systemic Diagnosis, and Performance Indicators through the Balanced Scorecard. The main purpose of this article is to present a methodology that embeds Organizational Diagnosis and Reengineering and emphasizes the incorporation of culture, context, management style, and knowledge, as well as inner and outer actors. The results of the proposed methodology applied to the case study are included, up to the time of writing. Future work consists of developing strategies for Innovation, as planned in the Balanced Scorecard and derived from the embedded methodology.

  18. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: reactors should be built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the seismic hazard at a site, large numbers of earthquake ground motions can be predicted considering possible variability among the source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In determining design basis earthquake ground motions it is therefore important to represent the influences of the large numbers of earthquake ground motions derived from seismic ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed, providing insight into the ongoing modernization of the Examination Guide for Seismic Design of NPPs.

  19. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    Full Text Available A literature review and gap analysis identifies key limitations of industry best practice in the modelling of supply chains. To address these limitations, the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD, and SM techniques; their integration is achieved via common modelling concepts and decomposition principles. Thereby the methodology facilitates: (i) graphical representation and description of key “processing”, “resourcing” and “work flow” properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery, and return processes; (iii) predictive quantification of the relative performances of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described, along with recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case tested to quantify potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  20. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies, and hence further validates AOM in a qualitative manner.

  1. Electroencephalogram-based methodology for determining unconsciousness during depopulation.

    Science.gov (United States)

    Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L

    2012-12-01

    When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining
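
The frequency-domain criterion described above can be sketched in a few lines: unconsciousness suppresses alpha (8-12 Hz) power and raises delta (0.5-4 Hz) power, so the A/D ratio drops. The band edges come from the abstract; the synthetic signals, sampling rate, and naive DFT below are illustrative assumptions, not the study's actual processing pipeline.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Total power in the [f_lo, f_hi] Hz band via a naive DFT (fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def alpha_delta_ratio(signal, fs):
    # Bands from the study: alpha 8-12 Hz, delta 0.5-4 Hz
    return band_power(signal, fs, 8.0, 12.0) / band_power(signal, fs, 0.5, 4.0)

# Synthetic EEG-like traces (illustrative): awake = alpha-dominant, unconscious = delta-dominant
fs, n = 64, 256
t = [i / fs for i in range(n)]
awake = [math.sin(2 * math.pi * 10 * ti) + 0.3 * math.sin(2 * math.pi * 2 * ti) for ti in t]
unconscious = [0.3 * math.sin(2 * math.pi * 10 * ti) + math.sin(2 * math.pi * 2 * ti) for ti in t]
ratio_awake = alpha_delta_ratio(awake, fs)
ratio_unconscious = alpha_delta_ratio(unconscious, fs)
```

A practical implementation would track the ratio over sliding windows and mark the loss of consciousness when it falls below a calibrated threshold.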

  2. Communication skills: a mandatory competence for ground and airplane crew to reduce tension in extreme situations

    Directory of Open Access Journals (Sweden)

    Isabel Cristina dos Santos

    2010-09-01

    Full Text Available Communication skills have been considered a strategic asset for any kind of organization. However, technically oriented enterprises usually emphasize the virtues of a cluster of technical competences and the availability of technological resources. This paper therefore discusses communication skills development beyond technical communication in high-technology, technically based operations such as ground and flight operations. To do so, the article describes some tragic-ending cases in commercial aviation in which the poor quality of interpersonal communication was identified as one of the most influential causes of the accident, or at least was seen as a compelling force in creating the perfect backdrop for a disaster involving civilian aircraft. Methodological procedures followed a qualitative approach, supported by documentary research covering some of the best-documented cases of aircraft accidents reported by the Aviation System Safety Report, issued by the Federal Aviation Administration (FAA, USA), as well as accident reports provided by the National Transportation Safety Board (NTSB, USA) and by the Center for Aircraft Research and Prevention (Cenipa, Brazil).

  3. Long-range transport of dust aerosols over the Arabian Sea and Indian region – A case study using satellite data and ground-based measurements

    Digital Repository Service at National Institute of Oceanography (India)

    Badarinath, K.V.S.; Kharol, S.K.; Kaskaoutis, D.G.; Sharma, A; Ramaswamy, V.; Kambezidis, H.D.

    The present study addresses an intense dust storm event over the Persian Gulf and the Arabian Sea (AS) region and its transport over the Indian subcontinent using multi-satellite observations and ground-based measurements. A time series of Indian...

  4. Infrared stereo calibration for unmanned ground vehicle navigation

    Science.gov (United States)

    Harguess, Josh; Strange, Shawn

    2014-06-01

    The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
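
Calibration quality for either modality is typically summarized by the root-mean-square reprojection error over all detected pattern corners. A minimal sketch of that metric follows; the point coordinates are invented for illustration.

```python
def rms_reprojection_error(projected, detected):
    """RMS pixel distance between reprojected and detected calibration corners."""
    n = len(projected)
    sq_sum = sum((px - dx) ** 2 + (py - dy) ** 2
                 for (px, py), (dx, dy) in zip(projected, detected))
    return (sq_sum / n) ** 0.5
```

In practice the projected points come from applying the estimated intrinsics and extrinsics to the known board geometry, and a sub-pixel RMS error is usually taken as a sign of a good calibration.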

  5. Design-based research as a methodological approach to support participatory engagement of learners in the development of learning technologies

    OpenAIRE

    McDowell, James

    2015-01-01

    Following the origination of the design experiment as a mechanism to introduce learning interventions into the messy conditions of the classroom (Brown, 1992; Collins, 1992), design-based research (DBR) faced criticism from opposing paradigmatic camps before its acknowledgement as a promising methodology in which “formative evaluation plays a significant role” (Dede, Ketelhut, Whitehouse, Breit & McCloskey, 2009, p.16). This session presents a case study of a researcher-practitioner i...

  6. Studies of national research performance: A case of ‘methodological nationalism’ and ‘zombie science’?

    DEFF Research Database (Denmark)

    Sørensen, Mads P.; Schneider, Jesper Wiborg

    2017-01-01

    The analytical point of departure in this paper is the ongoing debate, initiated by Ulrich Beck, on methodological nationalism within the social sciences. Based on a comprehensive study of research collaboration and mobility of researchers this paper discusses possible traces of methodological...... with researchers in other countries. The national research institutions are increasingly transnationalised due to the growing mobility of researchers. Based on an examination of all the papers registered in the Thompson Reuter’s Web of Science database we follow the development in research collaboration...

  7. a Universal De-Noising Algorithm for Ground-Based LIDAR Signal

    Science.gov (United States)

    Ma, Xin; Xiang, Chengzhi; Gong, Wei

    2016-06-01

    Ground-based lidar, working as an effective remote sensing tool, plays an irreplaceable role in the study of the atmosphere, since it has the ability to provide the atmospheric vertical profile. However, the appearance of noise in a lidar signal is unavoidable, which leads to difficulties and complexities when searching for more information. Every de-noising method has its own characteristics but also certain limitations, since the lidar signal varies as the atmosphere changes. In this paper, a universal de-noising algorithm is proposed to enhance the SNR of a ground-based lidar signal, based on signal segmentation and reconstruction. The signal segmentation, serving as the keystone of the algorithm, divides the lidar signal into three different parts, which are processed by different de-noising methods according to their own characteristics. The signal reconstruction is a relatively simple procedure in which the signal sections are spliced end to end. Finally, a series of tests on simulated signals and a real dual field-of-view lidar signal shows the feasibility of the universal de-noising algorithm.
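
The segment-then-splice idea can be sketched as below. The segmentation boundaries, window sizes, and use of simple moving averages are illustrative assumptions; the paper applies different (unspecified here) de-noising methods per segment.

```python
import random

def moving_average(x, window):
    """Centered moving average; the window shrinks near the edges."""
    half = window // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def segment_and_denoise(signal, near_end, far_start):
    """Split into near/mid/far range segments, smooth each differently, splice end to end."""
    near = moving_average(signal[:near_end], 3)    # high SNR: light smoothing
    mid = moving_average(signal[near_end:far_start], 7)
    far = moving_average(signal[far_start:], 15)   # noise-dominated: heavy smoothing
    return near + mid + far

# Demo: noisy, range-decaying lidar-like return (synthetic)
random.seed(0)
signal = [1000.0 / (1 + i) + random.gauss(0.0, 5.0) for i in range(300)]
denoised = segment_and_denoise(signal, 100, 200)
```

A real implementation would choose the segment boundaries adaptively from the signal's SNR profile rather than using fixed indices.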

  8. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Based on the fact that tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between GO methodology and fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with fault tree methodology, GO methodology made the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
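
The quantitative step of such an analysis reduces to combining component unavailabilities along the success paths of the chart: components in series all have to work, while a tie breaker makes two trains redundant. A minimal sketch for a hypothetical two-train supply follows; the component unavailability values are invented for illustration, not taken from the paper.

```python
def series(*q):
    """Unavailability of a series chain: available only if every component is available."""
    p_avail = 1.0
    for qi in q:
        p_avail *= (1.0 - qi)
    return 1.0 - p_avail

def parallel(*q):
    """Unavailability of redundant paths: fails only if every path fails."""
    p_fail = 1.0
    for qi in q:
        p_fail *= qi
    return p_fail

# Hypothetical train: incoming breaker + transformer in series
q_train = series(1e-3, 5e-4)
# Tie breaker lets either of two identical trains feed the bus
q_bus = parallel(q_train, q_train)
```

The redundancy introduced by the tie breaker drops the bus unavailability from roughly 1.5e-3 per train to a few times 1e-6, which is the kind of conclusion the GO quantification supports.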

  9. Methodology to design a municipal solid waste pre-collection system. A case study

    International Nuclear Information System (INIS)

    Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2015-01-01

    Highlights: • MSW recovery starts at homes; therefore it is important to make it easy for people. • Additionally, to optimize MSW collection, the pre-collection stage must be planned beforehand. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a middle-sized Spanish town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to compile those factors beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology it has
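
Pre-collection design ultimately comes down to sizing the container stock of each zone from its generation pattern. The sketch below is a hedged illustration of that arithmetic; the sizing rule and every parameter value are assumptions for the example, not the paper's model.

```python
import math

def containers_needed(inhabitants, kg_per_person_day, days_between_collections,
                      container_kg, fill_factor=0.9):
    """Minimum containers for a zone: waste accumulated between collections
    divided by the usable capacity of one container (all values illustrative)."""
    waste_kg = inhabitants * kg_per_person_day * days_between_collections
    usable_kg = container_kg * fill_factor
    return math.ceil(waste_kg / usable_kg)
```

Run per zone with zone-specific generation rates, this kind of estimate is what would be mapped onto the GIS layers to plan container placement and collection routes.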

  10. Ground-based observations coordinated with Viking satellite measurements

    International Nuclear Information System (INIS)

    Opgenoorth, H.J.; Kirkwood, S.

    1989-01-01

    The instrumentation and the orbit of the Viking satellite made this first Swedish satellite mission ideally suited for coordinated observations with the dense network of ground-based stations in northern Scandinavia. Several arrays of complementary instruments, such as magnetometers, all-sky cameras, riometers and Doppler radars, routinely monitored the ionosphere beneath the magnetospheric regions passed by Viking. For a large number of orbits, the Viking passages close to Scandinavia were covered by specially designed observing programmes at the European incoherent-scatter facility (EISCAT). First results of coordinated observations on the ground and aboard Viking have shed new light on the most spectacular feature of substorm expansion, the westward-travelling surge. The end of a substorm and the associated decay of a westward-travelling surge have been analysed. EISCAT measurements of high spatial and temporal resolution indicate that the conductivities and electric fields associated with westward-travelling surges are not represented correctly by existing models. (author)

  11. Strong Sporadic E Occurrence Detected by Ground-Based GNSS

    Science.gov (United States)

    Sun, Wenjie; Ning, Baiqi; Yue, Xinan; Li, Guozhu; Hu, Lianhuan; Chang, Shoumin; Lan, Jiaping; Zhu, Zhengping; Zhao, Biqiang; Lin, Jian

    2018-04-01

    The ionospheric sporadic E (Es) layer has a significant impact on radio wave propagation. The traditional techniques employed for Es layer observation, for example ionosondes, are not dense enough to resolve the spatial morphology and dynamics of the Es layer. The ground-based Global Navigation Satellite Systems (GNSS) technique is expected to shed light on the understanding of regional strong Es occurrence, owing to the facts that the critical frequency (foEs) of a strong Es structure is usually high enough to cause pulse-like disturbances in GNSS total electron content (TEC), and that a large number of GNSS receivers have been deployed all over the world. Based on the Chinese ground-based GNSS networks, including the Crustal Movement Observation Network of China and the Beidou Ionospheric Observation Network, a large-scale strong Es event was observed at middle latitudes over China. The strong Es, appearing as a band-like structure aligned southwest-northeast, extended more than 1,000 km. A statistical comparison of Es occurrences identified simultaneously by ionosondes and GNSS TEC receivers over middle-latitude China showed that GNSS TEC can be employed to observe strong Es occurrences, with a threshold foEs value of 14 MHz.
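
Detecting strong Es in a TEC series amounts to flagging pulse-like departures from a slowly varying background. The running-mean detrending and the 0.5 TECU threshold below are illustrative choices, not the study's algorithm.

```python
def detect_tec_pulses(tec, window=11, threshold=0.5):
    """Flag samples whose deviation from a local running mean exceeds threshold (TECU)."""
    half = window // 2
    flags = []
    for i in range(len(tec)):
        seg = tec[max(0, i - half):i + half + 1]
        trend = sum(seg) / len(seg)
        flags.append(abs(tec[i] - trend) > threshold)
    return flags

# Demo: linear background trend plus a single pulse-like disturbance
tec = [10.0 + 0.01 * i for i in range(100)]
tec[50] += 2.0
flags = detect_tec_pulses(tec)
```

Mapping flagged line-of-sight arcs from many receivers onto ionospheric pierce points is what yields the band-like spatial picture of the Es structure.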

  12. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
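
Why correlation matters in the Monte Carlo step can be sketched numerically: when two component demands are correlated, the probability that at least one exceeds its capacity differs from the independent case. The lognormal demand model, the shared-factor correlation device, and all numbers below are illustrative assumptions, not the paper's procedure.

```python
import math
import random

def damage_probability(n_trials, ln_capacity, rho, seed=1):
    """Monte Carlo estimate of P(at least one of two components damaged).
    Log-demands are standard normal with correlation rho induced by a shared factor."""
    rng = random.Random(seed)
    a, b = math.sqrt(rho), math.sqrt(1.0 - rho)
    hits = 0
    for _ in range(n_trials):
        common = rng.gauss(0.0, 1.0)
        z1 = a * common + b * rng.gauss(0.0, 1.0)  # log-demand, component 1
        z2 = a * common + b * rng.gauss(0.0, 1.0)  # log-demand, component 2
        if z1 > ln_capacity or z2 > ln_capacity:
            hits += 1
    return hits / n_trials

p_independent = damage_probability(20000, ln_capacity=1.0, rho=0.0)
p_correlated = damage_probability(20000, ln_capacity=1.0, rho=0.9)
```

Strong positive correlation makes joint failures more likely but the union ("at least one damaged") less likely, which is exactly the effect the procedure captures by sampling correlated responses directly.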

  13. Modeling Nonlinear Site Response Uncertainty in Broadband Ground Motion Simulations for the Los Angeles Basin

    Science.gov (United States)

    Assimaki, D.; Li, W.; Steidl, J. M.; Schmedes, J.

    2007-12-01

    The assessment of strong motion site response is of great significance, both for mitigating seismic hazard and for performing detailed analyses of earthquake source characteristics. There currently exists, however, a large degree of uncertainty concerning the mathematical model to be employed for the computationally efficient evaluation of local site effects, and the site investigation program necessary to evaluate the nonlinear input model parameters and ensure cost-effective predictions; and while site response observations may provide critical constraints on interpretation methods, the lack of a statistically significant number of in-situ strong motion records prohibits statistical analyses from being conducted and uncertainties from being quantified based entirely on field data. In this paper, we combine downhole observations and broadband ground motion synthetics for characteristic site conditions in the Los Angeles Basin, and investigate the variability in ground motion estimation introduced by the site response assessment methodology. In particular, site-specific regional velocity and attenuation structures are initially compiled using near-surface geotechnical data collected at downhole geotechnical arrays, low-strain velocity and attenuation profiles at these sites obtained by inversion of weak motion records, and the crustal velocity structure at the corresponding locations obtained from the Southern California Earthquake Center Community Velocity Model. Successively, broadband ground motions are simulated by means of a hybrid low/high-frequency finite source model with correlated random parameters for rupture scenarios of weak, medium and large magnitude events (M = 3.5-7.5). Observed estimates of site response at the stations of interest are first compared to the ensemble of approximate and incremental nonlinear site response models.
Parametric studies are next conducted for each fixed magnitude (fault geometry) scenario by varying the source-to-site distance and

  14. Consistent interactive segmentation of pulmonary ground glass nodules identified in CT studies

    Science.gov (United States)

    Zhang, Li; Fang, Ming; Naidich, David P.; Novak, Carol L.

    2004-05-01

    Ground glass nodules (GGNs) have proved especially problematic in lung cancer diagnosis, as despite frequently being malignant they characteristically have extremely slow rates of growth. This problem is further magnified by the small size of many of these lesions now being routinely detected following the introduction of multislice CT scanners capable of acquiring contiguous high resolution 1 to 1.25 mm sections throughout the thorax in a single breathhold period. Although segmentation of solid nodules can be used clinically to determine volume doubling times quantitatively, reliable methods for segmentation of pure ground glass nodules have yet to be introduced. Our purpose is to evaluate a newly developed computer-based segmentation method for rapid and reproducible measurements of pure ground glass nodules. 23 pure or mixed ground glass nodules were identified in a total of 8 patients by a radiologist and subsequently segmented by our computer-based method using Markov random field and shape analysis. The computer-based segmentation was initialized by a click point. Methodological consistency was assessed using the overlap ratio between 3 segmentations initialized by 3 different click points for each nodule. The 95% confidence interval on the mean of the overlap ratios proved to be [0.984, 0.998]. The computer-based method failed on two nodules that were difficult to segment even manually either due to especially low contrast or markedly irregular margins. While achieving consistent manual segmentation of ground glass nodules has proven problematic most often due to indistinct boundaries and interobserver variability, our proposed method introduces a powerful new tool for obtaining reproducible quantitative measurements of these lesions. It is our intention to further document the value of this approach with a still larger set of ground glass nodules.
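
Segmentation consistency across click points can be quantified with an overlap ratio between the resulting voxel sets. A simple sketch using intersection-over-union follows; the paper's exact ratio definition may differ, so treat this as an illustrative stand-in.

```python
def overlap_ratio(seg_a, seg_b):
    """Intersection over union of two segmentations given as sets of voxel indices."""
    a, b = set(seg_a), set(seg_b)
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def mean_pairwise_overlap(segmentations):
    """Average overlap across all pairs, e.g. the 3 click-point segmentations per nodule."""
    pairs = [(i, j) for i in range(len(segmentations))
             for j in range(i + 1, len(segmentations))]
    return sum(overlap_ratio(segmentations[i], segmentations[j])
               for i, j in pairs) / len(pairs)
```

Averaging the per-nodule pairwise overlaps over all nodules, and putting a confidence interval on that mean, gives figures directly comparable to the [0.984, 0.998] interval reported above.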

  15. A long-term study of aerosol–cloud interactions and their radiative effect at the Southern Great Plains using ground-based measurements

    Directory of Open Access Journals (Sweden)

    E. T. Sena

    2016-09-01

    Full Text Available Empirical estimates of the microphysical response of cloud droplet size distribution to aerosol perturbations are commonly used to constrain aerosol–cloud interactions in climate models. Instead of empirical microphysical estimates, here macroscopic variables are analyzed to address the influence of aerosol particles and meteorological descriptors on instantaneous cloud albedo and the radiative effect of shallow liquid water clouds. Long-term ground-based measurements from the Atmospheric Radiation Measurement (ARM) program over the Southern Great Plains are used. A broad statistical analysis was performed on 14 years of coincident measurements of low clouds, aerosol, and meteorological properties. Two cases representing conflicting results regarding the relationship between the aerosol and the cloud radiative effect were selected and studied in greater detail. Microphysical estimates are shown to be very uncertain and to depend strongly on the methodology, retrieval technique and averaging scale. For this continental site, the results indicate that the influence of the aerosol on the shallow cloud radiative effect and albedo is weak and that macroscopic cloud properties and dynamics play a much larger role in determining the instantaneous cloud radiative effect compared to microphysical effects. On a daily basis, aerosol shows no correlation with cloud radiative properties (correlation = −0.01 ± 0.03), whereas the liquid water path shows a clear signal (correlation = 0.56 ± 0.02).
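
The daily-basis correlations quoted above are ordinary linear (Pearson) correlations between a candidate driver and the cloud radiative quantity. A minimal sketch follows; the sample values standing in for liquid water path and cloud radiative effect are invented for illustration.

```python
def pearson_r(x, y):
    """Pearson linear correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Illustrative daily averages: thicker clouds cool more (more negative SW effect)
lwp = [20.0, 45.0, 60.0, 80.0, 110.0, 150.0]        # liquid water path, g m^-2
cre = [-15.0, -40.0, -55.0, -70.0, -95.0, -130.0]   # SW cloud radiative effect, W m^-2
r_lwp = pearson_r(lwp, cre)
```

Repeating this against aerosol loading instead of liquid water path, over the full 14-year record, is the kind of comparison behind the −0.01 versus 0.56 contrast reported in the abstract.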

  16. UAV-Borne photogrammetry: a low cost 3D surveying methodology for cartographic update

    Directory of Open Access Journals (Sweden)

    Caroti Gabriella

    2017-01-01

    Full Text Available Territorial management requires mapping support of the status quo that is as up to date as possible. The regional-scale cartography update cycle is in the medium term (10 to 15 years): therefore, in the intervening time between updates, relevant Authorities must provide timely updates for new works or territorial changes. The required surveys can exploit several technologies: ground-based GPS, Terrestrial Laser Scanning (TLS), traditional topography, or, in the case of wider areas, airborne photogrammetry or laser scanning. In recent years UAV-based photogrammetry has become increasingly widespread as a versatile, low-cost surveying system for small to medium areas. This surveying methodology was used to generate, in order, a dense point cloud, a high-resolution Digital Surface Model (DSM) and an orthophotograph of a newly built marina by the mouth of the Arno river in Pisa, Italy, which is not yet included in cartography. Surveying activities took place while the construction site was in operation. Case study issues that surfaced in the course of the survey are presented and discussed, suggesting ‘good practice’ rules which, if followed in the survey planning step, can lessen unwanted effects due to criticalities. Results of the quality analysis of orthophotographs generated from UAV-borne images are also presented and discussed in view of a possible use of orthophotographs in updating medium- to large-scale cartography, checked against existing blueprints.

  17. Estimation of the aerosol radiative forcing at ground level, over land, and in cloudless atmosphere, from METEOSAT-7 observation: method and case study

    Directory of Open Access Journals (Sweden)

    T. Elias

    2008-02-01

    Full Text Available A new method is proposed to estimate the spatial and temporal variability of the solar radiative flux reaching the surface over land (DSSF), as well as the Aerosol Radiative Forcing (ARF), in a cloud-free atmosphere. Regional application of the method is made possible by using the visible broadband channel of the METEOSAT-7 satellite instrument, which scans Europe and Africa on a half-hourly basis. The method relies on selecting the best correspondence between METEOSAT-7 radiances and radiative transfer computations.

    The validation of DSSF is performed by comparing retrievals with ground-based measurements acquired in two contrasting environments: an urban site near Paris and a continental background site located in the South East of France. The study concentrates on aerosol episodes occurring around the 2003 summer heat wave, providing 42 cases of comparison for variable solar zenith angle (from 59° to 69°), variable aerosol type (biomass burning emissions and urban pollution), and variable aerosol optical thickness (a factor of 6 in magnitude). The method reproduces measurements of DSSF within an accuracy of 20 W m−2 (5% in relative terms) in 70% of the situations, and within 40 W m−2 in 90% of the situations, for the two case studies considered here.

    Considering that aerosol is the main contributor to changes in the measured radiance at the top of the atmosphere, the temporal variability of DSSF is assumed to be caused only by aerosols, and consequently the ARF at ground level and over land is also retrieved: ARF is computed as the difference between DSSF and a parameterised aerosol-free reference level. Retrievals are linearly correlated with ground-based measurements of the aerosol optical thickness (AOT): the sensitivity is between 120 and 160 W m−2 per unit of AOT at 440 nm. AOT being an instantaneous measure indicative of the columnar aerosol amount, we prove the feasibility to infer instantaneous
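
The retrieval logic above reduces to two simple relations: ARF as the difference between retrieved DSSF and the aerosol-free reference, and an approximately linear dependence of that forcing on AOT. The sketch below uses a mid-range sensitivity of 140 W m−2 per unit AOT (within the 120-160 range quoted) and invented flux values.

```python
# Mid-range sensitivity from the study's quoted 120-160 W m^-2 per unit AOT at 440 nm
SENSITIVITY_W_M2_PER_AOT = 140.0

def aerosol_radiative_forcing(dssf, dssf_aerosol_free):
    """ARF at ground level: negative when aerosols dim the surface flux."""
    return dssf - dssf_aerosol_free

def estimated_aot(arf, sensitivity=SENSITIVITY_W_M2_PER_AOT):
    """Invert the linear ARF-AOT relation to get an AOT estimate (illustrative)."""
    return -arf / sensitivity
```

For example, a retrieved DSSF of 820 W m−2 against an aerosol-free reference of 890 W m−2 would imply an ARF of −70 W m−2 and, at the assumed sensitivity, an AOT of about 0.5.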

  18. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence.

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D; Chait, Maria

    2016-09-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence (the coincidence of sound elements in and across time) is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals ("stochastic figure-ground": SFG) comprised a sequence of brief broadband chords containing random pure tone components that varied from one chord to another. Occasional tone repetitions across chords are perceived as "figures" popping out of a stochastic "ground." Magnetoencephalography (MEG) measurements in naïve, distracted human subjects revealed robust evoked responses, commencing about 150 ms after figure onset, that reflect the emergence of the "figure" from the randomly varying "ground." Neural sources underlying this bottom-up driven figure-ground segregation were localized to the planum temporale and the intraparietal sulcus, demonstrating that this area, outside the "classic" auditory system, is also involved in the early stages of auditory scene analysis. © The Author 2016. Published by Oxford University Press.
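
The SFG stimulus structure can be sketched as follows: each chord draws random pure-tone components, and from some chord onward a fixed subset of tones repeats across chords, forming the "figure" embedded in the changing "ground". The chord count, tone pool, and figure frequencies below are illustrative assumptions, not the study's exact parameters.

```python
import random

def sfg_stimulus(n_chords=40, tones_per_chord=10,
                 figure_tones=(440.0, 880.0, 1320.0), figure_start=20, seed=0):
    """Return a list of chords (sets of tone frequencies in Hz). Ground tones are
    resampled every chord; figure tones repeat from figure_start onward."""
    random.seed(seed)
    # Log-spaced tone pool (semitone steps from 200 Hz); figure tones are kept outside it
    pool = [200.0 * 2 ** (k / 12) for k in range(60)]
    chords = []
    for c in range(n_chords):
        chord = set(random.sample(pool, tones_per_chord))
        if c >= figure_start:
            chord |= set(figure_tones)
        chords.append(chord)
    return chords

chords = sfg_stimulus()
```

Because only the figure tones are coherent across chords, any segregation of them from the ground must rely on temporal coherence rather than on spectral cues alone, which is the point of the stimulus design.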

  19. Study of the unknown hemisphere of mercury by ground-based astronomical facilities

    Science.gov (United States)

    Ksanfomality, L. V.

    2011-08-01

    The short-exposure method has proved very productive in ground-based observations of Mercury. Telescopic observations with short exposures, together with computer codes for processing data arrays of many thousands of original electronic photos, make it possible to improve the resolution of images from ground-based instruments to almost the diffraction limit. The resulting composite images are comparable with images from spacecraft approaching from a distance of about 1 million km. This paper presents images of the hemisphere of Mercury in longitude sectors 90°-180°W, 215°-350°W, and 50°-90°W, including, among others, areas not covered by spacecraft cameras. For the first time a giant S basin was discovered in the longitude sector 250°-290°W, the largest formation of this type on the terrestrial planets. Mercury shows strong phase effects, so the view of the surface changes completely with the planetary phase; however, the phases that can be studied from a spacecraft are limited by the orbital characteristics of the mission. Ground-based observations of the planet therefore provide valuable support.

  20. Clarification of the Blurred Boundaries between Grounded Theory and Ethnography: Differences and Similarities

    Science.gov (United States)

    Aldiabat, Khaldoun; Le Navenec, Carol-Lynne

    2011-01-01

    There is confusion among graduate students about how to select the qualitative methodology that best fits their research question. Often this confusion arises in regard to making a choice between a grounded theory methodology and an ethnographic methodology. This difficulty may stem from the fact that these students do not have a clear…

  1. Chair Report Consultancy Meeting on Nuclear Security Assessment Methodologies (NUSAM) Transport Case Study Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Shull, Doug [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-19

    The purpose of the consultancy assignment was to (i) apply the NUSAM assessment methods to hypothetical transport security table top exercise (TTX) analyses and (ii) document the results in the working materials of the NUSAM case study on transport. A number of working group observations, drawing on the results of the TTX methodologies, are noted in the report.

  2. Traveling magnetopause distortion related to a large-scale magnetosheath plasma jet: THEMIS and ground-based observations

    Science.gov (United States)

    Dmitriev, A. V.; Suvorova, A. V.

    2012-08-01

    Here, we present a case study of THEMIS and ground-based observations of the perturbed dayside magnetopause and the geomagnetic field in relation to the interaction of an interplanetary directional discontinuity (DD) with the magnetosphere on 16 June 2007. The interaction resulted in a large-scale local magnetopause distortion of an "expansion - compression - expansion" (ECE) sequence that lasted for ˜15 min. The compression was caused by a very dense, cold, and fast high-β magnetosheath plasma flow, a so-called plasma jet, whose kinetic energy was approximately three times higher than the energy of the incident solar wind. The plasma jet resulted in the effective penetration of magnetosheath plasma inside the magnetosphere. A strong distortion of the Chapman-Ferraro current in the ECE sequence generated a tripolar magnetic pulse "decrease - peak - decrease" (DPD) that was observed at low and middle latitudes by some ground-based magnetometers of the INTERMAGNET network. The characteristics of the ECE sequence and the spatial-temporal dynamics of the DPD pulse were found to be very different from any reported patterns of DD interactions with the magnetosphere. The observed features only partially resembled structures such as FTEs, hot flow anomalies, and transient density events. Thus, it is difficult to explain them in the context of existing models.

  3. DEA (data envelopment analysis)-assisted supporting measures for ground coupled heat pumps implementing in Italy: A case study

    International Nuclear Information System (INIS)

    Longo, L.; Colantoni, A.; Castellucci, S.; Carlini, M.; Vecchione, L.; Savuto, E.; Pallozzi, V.; Di Carlo, A.; Bocci, E.; Moneti, M.; Cocchi, S.; Boubaker, K.

    2015-01-01

    Nowadays, increasing energy consumption is producing serious global warming issues, and most greenhouse gas emissions in developed countries come from building equipment. In this context, GCHPs (ground coupled heat pumps) are a candidate solution for air conditioning systems in buildings due to their higher efficiency compared to conventional devices. Indeed, ground coupled heat pump systems are widely recognized among the most efficient and comfortable systems in use. Nevertheless, the economic efficiency of ground coupled heat pumps has to be proved. In this study, the DEA (data envelopment analysis) method is applied to a real case in Italy. - Highlights: • Original investigation in terms of energy demands in buildings. • Gathering conjoint classical and scientific analyses. • Presenting an original DEA (data envelopment analysis) economic optimization scheme. • Outlining the economic feasibility of an efficient low enthalpy-geothermal plant with GCHP (ground coupled heat pump) exchangers.
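In the simplest setting, the DEA idea reduces to a ratio benchmark. The sketch below is a single-input, single-output CCR-style efficiency score, where each unit's output/input ratio is normalized by the best ratio in the sample; real DEA studies (including, presumably, this one) solve a linear program over multiple inputs and outputs. The plant names and figures are hypothetical.

```python
def dea_efficiency(units):
    """Single-input, single-output CCR-style DEA efficiency scores.

    With one input and one output, the CCR efficiency of each
    decision-making unit (DMU) reduces to its output/input ratio
    divided by the best ratio in the sample.
    `units` maps a DMU name to (input, output).
    """
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical GCHP installations: (annual cost, units of heat delivered)
plants = {"A": (10.0, 50.0), "B": (8.0, 48.0), "C": (12.0, 48.0)}
scores = dea_efficiency(plants)  # plant B defines the efficient frontier
```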

  4. Methodological variation in economic evaluations conducted in low- and middle-income countries: information for reference case development.

    Directory of Open Access Journals (Sweden)

    Benjarin Santatiwongchai

    Full Text Available Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle-income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in the three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and who have a desire to improve the ability of economic evaluations to inform resource allocation decisions.

  5. Ground and satellite-based remote sensing of mineral dust using AERI spectra and MODIS thermal infrared window brightness temperatures

    Science.gov (United States)

    Hansell, Richard Allen, Jr.

    coverage for the Persian Gulf case compare reasonably well to those from the "Deep Blue" algorithm developed at NASA-GSFC. The nighttime dust/cloud detection for the cases surrounding Cape Verde and Niger, West Africa has been validated by comparing to coincident and collocated ground-based micro-pulse lidar measurements.

  6. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
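The core survival-analysis ingredient here can be sketched with a Kaplan-Meier estimator over session times, where sessions still running at the end of the measurement window are treated as censored. This is a generic product-limit estimator on invented data, not the paper's actual models (which fit parametric survival models to performance factors).

```python
from collections import Counter

def kaplan_meier(samples):
    """Kaplan-Meier survival estimate from (session_time, observed) pairs.

    `observed` is True when the session ended within the measurement
    window and False when it was censored (still running at the end).
    Returns a list of (t, S(t)) at each distinct event time.
    """
    events = Counter(t for t, obs in samples if obs)
    survival, curve = 1.0, []
    for t in sorted(events):
        at_risk = sum(1 for u, _ in samples if u >= t)   # sessions alive at t
        survival *= 1.0 - events[t] / at_risk            # product-limit step
        curve.append((t, survival))
    return curve

# Hypothetical session times (minutes); the 12-minute session is censored
sessions = [(5, True), (8, True), (8, True), (12, False), (15, True)]
curve = kaplan_meier(sessions)
```

A longer median survival time under one network condition than another would then indicate higher user satisfaction under that condition.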

  7. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes, the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing...... the plant along functional lines is that of chemical unit operations and transport processes, plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly......

  8. Methodology to design a municipal solid waste generation and composition map: A case study

    International Nuclear Information System (INIS)

    Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2014-01-01

    Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches depending on the available data, combined with geographical information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way, when detailed data are available, and an indirect way, when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes including the

  9. Incremental Dynamic Analysis of Koyna Dam under Repeated Ground Motions

    Science.gov (United States)

    Zainab Nik Azizan, Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar; Abdullah, Junaidah

    2018-03-01

    This paper explores the incremental dynamic analysis (IDA) of a concrete gravity dam under single and repeated earthquake loadings to identify the limit state of the dam. Seven ground motions with horizontal and vertical components, based on real repeated earthquakes worldwide, were considered as seismic input in the nonlinear dynamic analysis. All ground motions were converted to response spectra and scaled according to the developed elastic response spectrum in order to match the ground-motion characteristics to the soil type. The scaling depends on the fundamental period, T1, of the dam. The Koyna dam was selected as a case study, assuming no sliding and a rigid foundation. IDA curves were developed for Koyna dam for single and repeated ground motions, and the performance level of the dam was identified. The IDA curve for repeated ground motions appears stiffer than that for a single ground motion. The ultimate-state displacement for a single event is 45.59 mm, decreasing to 39.33 mm under repeated events, a reduction of about 14%. This shows that the performance level of the dam under seismic loading depends on the ground-motion pattern.
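The IDA loop itself is simple: scale a ground-motion record to increasing intensities, run a time-history analysis at each scale, and record the peak response. The sketch below uses a damped linear single-degree-of-freedom oscillator integrated by central differences as a stand-in for the nonlinear dam model, so its IDA "curve" comes out proportional; the period, damping ratio, and toy record are assumptions, not the Koyna model.

```python
import math

def peak_sdof_response(accel, dt, period=0.33, zeta=0.05):
    """Peak displacement of a damped linear SDOF oscillator under base
    acceleration `accel` (m/s^2), via central-difference integration of
    u'' + 2*zeta*w*u' + w^2*u = -a_g.  Requires dt < period/pi."""
    w = 2 * math.pi / period
    u_prev, u, peak = 0.0, 0.0, 0.0
    for a_g in accel:
        # solve the discretized equation of motion for u_next
        lhs = 1 / dt**2 + zeta * w / dt
        rhs = (-a_g - w**2 * u + (2 * u - u_prev) / dt**2
               + zeta * w * u_prev / dt)
        u_prev, u = u, rhs / lhs
        peak = max(peak, abs(u))
    return peak

def ida_curve(accel, dt, scales):
    """One IDA branch: (intensity scale factor, peak displacement) pairs."""
    return [(s, peak_sdof_response([s * a for a in accel], dt)) for s in scales]

dt = 0.01
record = [math.sin(2 * math.pi * 1.5 * i * dt) for i in range(400)]  # toy record
curve = ida_curve(record, dt, [0.5, 1.0, 1.5, 2.0])
```

With a nonlinear structural model in place of the SDOF oscillator, the curve flattens as the limit state is approached, which is what the dam study reads off the IDA plots.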

  10. Developing a Guideline for Reporting and Evaluating Grounded Theory Research Studies (GUREGT)

    DEFF Research Database (Denmark)

    Berthelsen, Connie Bøttcher; Grimshaw-Aagaard, Søsserr Lone Smilla; Hansen, Carrinna

    2018-01-01

    theory research studies. The study was conducted in three phases. Phase 1: A structured literature review in PubMed, CINAHL, Cochrane Libraries, PsycInfo and SCOPUS to identify recommendations for reporting and evaluating grounded theory. Phase 2: A selective review of the methodological grounded theory...

  11. Enhanced static ground power unit based on flying capacitor based h-bridge hybrid active-neutral-point-clamped converter

    DEFF Research Database (Denmark)

    Abarzadeh, Mostafa; Madadi Kojabadi, Hossein; Deng, Fujin

    2016-01-01

    Static power converters have various applications, such as static ground power units (GPUs) for airplanes. This study proposes a new configuration of a static GPU based on a novel nine-level flying capacitor h-bridge active-neutral-point-clamped (FCHB_ANPC) converter. The main advantages......

  12. An Internet of Things Approach for Extracting Featured Data Using AIS Database: An Application Based on the Viewpoint of Connected Ships

    Directory of Open Access Journals (Sweden)

    Wei He

    2017-09-01

    Full Text Available Automatic Identification System (AIS), as a major source of navigational data, is widely used in connected-ship applications for the purpose of implementing maritime situation awareness and evaluating maritime transportation. Efficiently extracting featured data from an AIS database is always a challenging and time-consuming task for maritime administrators and researchers. In this paper, a novel approach is proposed to extract massive featured data from an AIS database. An Evidential Reasoning rule based methodology is proposed to emulate the manual procedure of extracting routes from the AIS database. First, the frequency distributions of ship dynamic attributes, such as the mean and variance of Speed over Ground and Course over Ground, are obtained according to verified AIS data samples. Subsequently, the correlations between the attributes and the belief degrees of the categories are established based on likelihood modelling. In this way, the attributes are characterized as several pieces of evidence, and the evidence can be combined with the Evidential Reasoning rule. In addition, the weight coefficients are trained in a nonlinear optimization model to extract the AIS data more accurately. A real-life case study was conducted at an intersection waterway of the Yangtze River in Wuhan, China. The results show that the proposed methodology is able to extract data very precisely.
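The evidence-combination step can be sketched as a weighted likelihood fusion. This is a simplified stand-in for the full Evidential Reasoning rule (which also handles reliabilities and unassigned belief); with unit weights it reduces to a normalized product of likelihoods. The route categories and attribute likelihoods below are invented for illustration.

```python
def fuse_evidence(pieces, weights=None):
    """Combine belief distributions over categories by weighted
    likelihood fusion, a simplified stand-in for the ER rule.

    `pieces` is a list of dicts mapping category -> belief degree;
    `weights` discounts each piece of evidence via an exponent.
    """
    cats = pieces[0].keys()
    weights = weights or [1.0] * len(pieces)
    fused = {c: 1.0 for c in cats}
    for dist, w in zip(pieces, weights):
        for c in cats:
            fused[c] *= dist[c] ** w   # weight acts as a discount exponent
    total = sum(fused.values())
    return {c: v / total for c, v in fused.items()}

# Hypothetical evidence from two AIS attributes for one track segment
sog = {"cargo_route": 0.7, "ferry_route": 0.3}  # from Speed over Ground
cog = {"cargo_route": 0.6, "ferry_route": 0.4}  # from Course over Ground
belief = fuse_evidence([sog, cog])
```

Training the weights in a nonlinear optimization, as the paper describes, would amount to choosing the exponents that best reproduce the manually labelled routes.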

  13. Atomic oxygen effects on boron nitride and silicon nitride: A comparison of ground based and space flight data

    Science.gov (United States)

    Cross, J. B.; Lan, E. H.; Smith, C. A.; Whatley, W. J.

    1990-01-01

    The effects of atomic oxygen on boron nitride (BN) and silicon nitride (Si3N4) were evaluated in a low Earth orbit (LEO) flight experiment and in a ground based simulation facility. In both the inflight and ground based experiments, these materials were coated on thin (approx. 250 Å) silver films, and the electrical resistance of the silver was measured in situ to detect any penetration of atomic oxygen through the BN and Si3N4 materials. In the presence of atomic oxygen, silver oxidizes to form silver oxide, which has a much higher electrical resistance than pure silver. Permeation of atomic oxygen through BN, as indicated by an increase in the electrical resistance of the silver underneath, was observed in both the inflight and ground based experiments. In contrast, no permeation of atomic oxygen through Si3N4 was observed in either the inflight or ground based experiments. The ground based results show good qualitative correlation with the LEO flight results, indicating that ground based facilities such as the one at Los Alamos National Lab can reproduce space flight data from LEO.

  14. MS-based analytical methodologies to characterize genetically modified crops.

    Science.gov (United States)

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  15. Traditional behaviour and fidelity to caribou calving grounds by barren-ground caribou

    Directory of Open Access Journals (Sweden)

    Anne Gunn

    1986-06-01

    Full Text Available Evidence for the fidelity of female barren-ground caribou (Rangifer tarandus spp.) of each herd to specific calving grounds is convincing. Involvement of learned behaviour in the annual return of those cows to the same calving grounds implies such actions are a form of «traditional» behaviour. Even wide variations in population size have not yet knowingly led to marked changes in the size or location of calving grounds or prolonged abandonment of established ones. Rarely is the adoption of new calving grounds reported, and emigration to another herd's calving ground or interchange between calving grounds has not yet been unequivocally documented. The calving experience of individual caribou and environmental pressures may modify the cow's use patterns of her calving grounds. The current definition of herds based on traditional calving grounds may require modification if increasing caribou numbers result in changes in traditions. However, current data do not contradict either the fidelity to traditional calving grounds or the concept of herd identity based on that fidelity.

  16. Finite Element Based Response Surface Methodology to Optimize Segmental Tunnel Lining

    Directory of Open Access Journals (Sweden)

    A. Rastbood

    2017-04-01

    Full Text Available The main objective of this paper is to optimize the geometrical and engineering characteristics of concrete segments of tunnel lining using Finite Element (FE) based Response Surface Methodology (RSM). Input data for the RSM statistical analysis were obtained using FEM. In the RSM analysis, thickness (t) and elasticity modulus (E) of concrete segments, tunnel height (H), horizontal-to-vertical stress ratio (K) and position of the key segment in the tunnel lining ring (θ) were considered as independent input variables. Maximum values of Mises and Tresca stresses and tunnel ring displacement (UMAX) were set as responses. Analysis of variance (ANOVA) was carried out to investigate the influence of each input variable on the responses. Second-order polynomial equations in terms of the influencing input variables were obtained for each response. It was found that the elasticity modulus and key segment position variables were not included in the yield stress and ring displacement equations, and only the tunnel height and stress ratio variables were included in the ring displacement equation. Finally, optimization analysis of the tunnel lining ring was performed. Owing to the absence of the elasticity modulus and key segment position variables in the equations, their values were kept at the average level and the other variables were floated within their related ranges. The response parameters were set to minimum. It was concluded that, to obtain optimum values for the responses, the ring thickness and tunnel height must be near their maximum and minimum values, respectively, and the ground state must be similar to hydrostatic conditions.
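The RSM fitting step, obtaining a second-order polynomial in the input variables by least squares over a designed set of runs, can be sketched as follows. This is a generic two-variable fit on a synthetic 3x3 factorial design in coded units; the variables, design, and coefficients are illustrative, not the paper's FE results.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_rsm(xs, ys):
    """Least-squares fit of the second-order response surface
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    rows = [[1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in xs]
    # normal equations: (X^T X) b = X^T y
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(6)] for i in range(6)]
    Xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(6)]
    return solve(XtX, Xty)

# 3x3 factorial design in coded units (e.g. thickness t and height H)
design = [(x1, x2) for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)]
def true_response(x1, x2):            # synthetic 'FE' response
    return 2.0 + 0.5 * x1 - 1.0 * x2 + 0.3 * x1**2 + 0.2 * x1 * x2
coeffs = fit_rsm(design, [true_response(x1, x2) for x1, x2 in design])
```

ANOVA then tests which of the fitted coefficients are significant, which is how terms such as E and θ get dropped from the final equations.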

  17. A Diagnostic Model for Dementia in Clinical Practice-Case Methodology Assisting Dementia Diagnosis.

    Science.gov (United States)

    Londos, Elisabet

    2015-04-02

    Dementia diagnosis is important for many different reasons. Firstly, to separate dementia, or major neurocognitive disorder, from MCI (mild cognitive impairment), mild neurocognitive disorder. Secondly, to define the specific underlying brain disorder to aid treatment, prognosis and decisions regarding care needs and assistance. The diagnostic method for dementias is a puzzle of different data pieces to be fitted together in the best possible way to reach a clinical diagnosis. Using a modified case methodology concept, risk factors affecting cognitive reserve and symptoms constituting the basis of the brain damage hypothesis can be visualized, balanced and reflected against test results as well as structural and biochemical markers. The model's origin is the case method initially described at Harvard Business School, here modified to serve dementia diagnostics.

  18. Assessment of NASA airborne laser altimetry data using ground-based GPS data near Summit Station, Greenland

    Science.gov (United States)

    Brunt, Kelly M.; Hawley, Robert L.; Lutz, Eric R.; Studinger, Michael; Sonntag, John G.; Hofton, Michelle A.; Andrews, Lauren C.; Neumann, Thomas A.

    2017-03-01

    A series of NASA airborne lidars have been used in support of satellite laser altimetry missions. These airborne laser altimeters have been deployed for satellite instrument development, for spaceborne data validation, and to bridge the data gap between satellite missions. We used data from ground-based Global Positioning System (GPS) surveys of an 11 km long track near Summit Station, Greenland, to assess the surface-elevation bias and measurement precision of three airborne laser altimeters including the Airborne Topographic Mapper (ATM), the Land, Vegetation, and Ice Sensor (LVIS), and the Multiple Altimeter Beam Experimental Lidar (MABEL). Ground-based GPS data from the monthly ground-based traverses, which commenced in 2006, allowed for the assessment of nine airborne lidar surveys associated with ATM and LVIS between 2007 and 2016. Surface-elevation biases for these altimeters - over the flat, ice-sheet interior - are less than 0.12 m, while assessments of measurement precision are 0.09 m or better. Ground-based GPS positions determined both with and without differential post-processing techniques provided internally consistent solutions. Results from the analyses of ground-based and airborne data provide validation strategy guidance for the Ice, Cloud, and land Elevation Satellite 2 (ICESat-2) elevation and elevation-change data products.
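At its core, the assessment reduces to differencing collocated airborne and ground-based elevations: the mean difference gives the surface-elevation bias and the standard deviation of the differences gives the measurement precision. The sketch below uses invented elevations along a hypothetical traverse segment, not the study's data.

```python
from statistics import mean, stdev

def assess_lidar(airborne, ground):
    """Surface-elevation bias (mean difference) and measurement
    precision (standard deviation of differences) of airborne lidar
    elevations against collocated ground-based GPS elevations (metres)."""
    diffs = [a - g for a, g in zip(airborne, ground)]
    return mean(diffs), stdev(diffs)

# Hypothetical collocated elevations along the traverse (m)
gps   = [3210.42, 3210.55, 3210.61, 3210.48, 3210.50]
lidar = [3210.50, 3210.61, 3210.72, 3210.57, 3210.60]
bias, precision = assess_lidar(lidar, gps)
```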

  19. Approach to developing a ground-motion design basis for facilities important to safety at Yucca Mountain

    International Nuclear Information System (INIS)

    King, J.L.

    1990-01-01

    This paper discusses a methodology for developing a ground-motion design basis for prospective facilities at Yucca Mountain that are important to safety. The methodology utilizes a quasi-deterministic construct called the 10,000-year cumulative-slip earthquake that is designed to provide a conservative, robust, and reproducible estimate of ground motion that has a one-in-ten chance of occurring during the preclosure period. This estimate is intended to define a ground-motion level for which the seismic design would ensure minimal disruption to operations; engineering analyses to ensure safe performance are included.

  20. A Near-Term Concept for Trajectory Based Operations with Air/Ground Data Link Communication

    Science.gov (United States)

    McNally, David; Mueller, Eric; Thipphavong, David; Paielli, Russell; Cheng, Jinn-Hwei; Lee, Chuhan; Sahlman, Scott; Walton, Joe

    2010-01-01

    An operating concept and required system components for trajectory-based operations with air/ground data link for today's en route and transition airspace is proposed. Controllers are fully responsible for separation as they are today, and no new aircraft equipage is required. Trajectory automation computes integrated solutions to problems like metering, weather avoidance, traffic conflicts and the desire to find and fly more time/fuel-efficient flight trajectories. A common ground-based system supports all levels of aircraft equipage and performance, including those equipped and not equipped for data link. User interface functions for the radar controller's display make trajectory-based clearance advisories easy to visualize, modify if necessary, and implement. Laboratory simulations (without human operators) were conducted to test integrated operation of selected system components with uncertainty modeling. Results are based on 102 hours of Fort Worth Center traffic recordings involving over 37,000 individual flights. The presence of uncertainty had a marginal effect (5%) on minimum-delay conflict resolution performance, and wind-favorable routes had no effect on detection and resolution metrics. Flight plan amendments and clearances were substantially reduced compared to today's operations. Top-of-descent prediction errors are the largest cause of failure, indicating that better descent predictions are needed to reliably achieve fuel-efficient descent profiles in medium to heavy traffic. Improved conflict detection for climbing flights could enable substantially more continuous climbs to cruise altitude. Unlike today's Conflict Alert, tactical automation must alert when an altitude amendment is entered, but before the aircraft starts the maneuver. In every other failure case, tactical automation prevented losses of separation.
A real-time prototype trajectory-automation system is running now and could be made ready for operational testing at an en route