WorldWideScience

Sample records for methodology direct experience

  1. RESOURCE TRAINING AND METHODOLOGICAL CENTER FOR THE TRAINING OF PEOPLE WITH DISABILITIES: EXPERIENCE AND DIRECTION OF DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    A. A. Fedorov

    2018-01-01

    Full Text Available Introduction: The article is devoted to a new and topical direction in the system of higher education: the development of inclusive education. The experience of creating a resource training and methodological center (RTMC) at Minin University in 2017 is presented, and the directions of its activity in 2017 and the results achieved are described. The article outlines the role of the RTMC in the development of an inclusive culture. Materials and methods: The article is based on an analysis of the literature by domestic and foreign authors, as well as on monitoring data on the state of inclusive higher education collected within the framework of State Contract No. 05.020.11 007 of 07.06.2016 on the project «Monitoring and Information-Analytical Support of the Activities of Regional Resource Centers of Higher Education for Disabled People». Results: Analyzing the results of the RTMC's activity, the authors identify the problems that arose during project implementation and suggest ways of solving them. They see the development of the RTMC's activity in the development of forms and mechanisms of interdepartmental, interregional and inter-institutional cooperation, in order to achieve coherence and effectiveness among all participants supporting inclusion in higher education, taking into account the educational needs of entrants and the needs of the labor market throughout the assigned territory. As a special mission of the RTMC, the authors see the management of the development of an inclusive culture at the university. The system of higher education is considered an instrument for fulfilling the social mandate to form a generation of people who accept inclusion naturally in all spheres of life. Discussion and conclusion: The role of the resource training and methodological center in the development of inclusive higher education is determined by the identification

  2. Applicability of the Directed Graph Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Huszti, Jozsef [Institute of Isotope of the Hungarian Academy of Sciences, Budapest (Hungary); Nemeth, Andras [ESRI Hungary, Budapest (Hungary); Vincze, Arpad [Hungarian Atomic Energy Authority, Budapest (Hungary)

    2012-06-15

    Possible methods to construct, visualize and analyse a 'map' of a State's nuclear infrastructure based on different directed-graph approaches are proposed. The transportation and flow-network models are described in detail. Possible evaluation methodologies, and available software tools for constructing and maintaining the nuclear 'map' from pre-defined standard building blocks (nuclear facilities), are introduced and discussed.
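    The flow-network view described in this record treats facilities as nodes and material flows as directed edges, so that reachability queries reveal which parts of the infrastructure a given stream can feed. A minimal sketch of that idea (the facility names and edges below are hypothetical illustrations, not taken from the paper):

```python
from collections import deque

# Hypothetical fuel-cycle "map": each facility points to the facilities
# its output material can flow to.
edges = {
    "mine": ["conversion"],
    "conversion": ["enrichment"],
    "enrichment": ["fuel_fabrication"],
    "fuel_fabrication": ["reactor"],
    "reactor": ["spent_fuel_storage"],
    "spent_fuel_storage": [],
}

def reachable(graph, start):
    """Return the set of facilities reachable from `start` via material flow."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

On such a graph, `reachable(edges, "enrichment")` answers the kind of question the 'map' is built for: where can material from this facility end up?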

  3. Pipeline external corrosion direct assessment methodology: lessons learned - part 1

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, Angel R. [DNV Columbus, Inc., OH (United States)

    2009-07-01

    DNV Columbus (formerly CC Technologies) played a key role in the development of Direct Assessment (DA) methodologies, providing leadership in the NACE technical committees charged with developing DA standards. Since the first publication of NACE Standard RP-0502-2002, External Corrosion Direct Assessment (ECDA) has been successfully applied to a large number of pipelines to evaluate the impact of external corrosion on pipeline integrity. This paper summarizes the results of applying ECDA to a selected number of underground pipelines and presents interesting facts about the methodology. (author)

  4. Mechatronics methodology: 15 years of experience

    Directory of Open Access Journals (Sweden)

    Efren Gorrostieta

    2015-09-01

    Full Text Available This article presents a methodology for teaching students to develop mechatronic projects. It has been taught in higher education institutions at different universities in Mexico, in courses such as Robotics, Control Systems, Mechatronic Systems and Artificial Intelligence. The intention of the methodology is not only to achieve the integration of different subjects but also to accomplish synergy between them, so that the final result is the best possible in quality, time and robustness. After its introduction into the educational arena, the methodology was evaluated and modified for approximately five years, during which substantial characteristics were adopted. For the next ten years, only minor alterations were made. Fifteen years of experience have shown that the methodology is useful not only for training but also for real projects. In this article, we first explain the methodology and its main characteristics, along with a brief history of its teaching in different educational programs. Then, we present two cases where the methodology was successfully applied. The first project consisted of the design, construction and evaluation of a mobile robotic manipulator intended for use as an explosive ordnance disposal device. In the second case, we document the results of a project assignment for robotics tasks carried out by students who had previously been taught the methodology.

  5. Priority research directions in the area of qualitative methodology

    OpenAIRE

    Melnikova, Olga; Khoroshilov, Dmitry

    2010-01-01

    The basic directions of modern theoretical and practical research in the area of qualitative methodology in Russia are discussed in this article. The complexity of the research is considered from three points of view: the development of the methodology of qualitative analysis, qualitative methods, and verbal and nonverbal projective techniques. The authors present an integrative model of qualitative analysis, research on the specificity of the use of the discourse-analysis method, and projective techniques...

  6. Researching experiences of cancer: the importance of methodology.

    Science.gov (United States)

    Entwistle, V; Tritter, J Q; Calnan, M

    2002-09-01

    This paper draws on contributions to and discussions at a recent MRC HSRC-sponsored workshop 'Researching users' experiences of health care: the case of cancer'. We focus on the methodological and ethical challenges that currently face researchers who use self-report methods to investigate experiences of cancer and cancer care. These challenges relate to: the theoretical and conceptual underpinnings of research; participation rates and participant profiles; data collection methods (the retrospective nature of accounts, description and measurement, and data collection as intervention); social desirability considerations; relationship considerations; the experiences of contributing to research; and the synthesis and presentation of findings. We suggest that methodological research to tackle these challenges should be integrated into substantive research projects to promote the development of a strong knowledge base about experiences of cancer and cancer care.

  7. Investigating patients' experiences: methodological usefulness of interpretive interactionism.

    Science.gov (United States)

    Tower, Marion; Rowe, Jennifer; Wallis, Marianne

    2012-01-01

    To demonstrate the methodological usefulness of interpretive interactionism by applying it to the example of a study investigating the healthcare experiences of women affected by domestic violence. Understanding patients' experiences of health, illness and health care is important to nurses. For many years, biomedical discourse has prevailed in healthcare language and research, and has influenced healthcare responses. Contemporary nursing scholarship can be developed by engaging with new ways of understanding therapeutic interactions with patients. Research that uses qualitative methods of inquiry is an important paradigm for nurses who seek to explain, understand or describe experiences rather than predict outcomes. Interpretive interactionism is an interpretive form of inquiry for conducting studies of social or personal problems that have healthcare policy implications. It puts the patient at the centre of the research process and makes visible the experiences of patients as they interact with the healthcare and social systems that surround them. Interpretive interactionism draws on concepts of symbolic interactionism, phenomenology and hermeneutics. It is a patient-centred methodology that provides an alternative way of understanding patients' experiences. It has methodological utility because it can contribute to policy and practice development by drawing on the perspectives and experiences of patients, who are central to the research process, and because it allows research findings to be situated in and linked to healthcare policy, professional ethics and organisational approaches to care.

  8. Power-sharing Partnerships: Teachers' Experiences of Participatory Methodology.

    Science.gov (United States)

    Ferreira, Ronél; Ebersöhn, Liesel; Mbongwe, Bathsheba B

    2015-01-01

    This article reports on the experiences of teachers as coresearchers in a long-term partnership with university researchers, who participated in an asset-based intervention project known as Supportive Teachers, Assets and Resilience (STAR). In an attempt to inform participatory research methodology, the study investigated how coresearchers (teachers) experienced power relations. We utilized Gaventa's power cube as a theoretical framework and participatory research as our methodological paradigm. Ten teachers of a primary school in the Eastern Cape and five teachers of a secondary school in a remote area of the Mpumalanga Province in South Africa participated (n=15). We employed multiple data generation techniques, namely Participatory Reflection and Action (PRA) activities, observation, focus group discussions, and semistructured interviews, using thematic analysis and categorical aggregation for data analysis. We identified three themes, related to (1) the nature of power in participatory partnerships, (2) coresearchers' meaning-making of power and partnerships, and (3) their role in taking agency. Based on these findings, we developed a framework of power-sharing partnerships to extend Gaventa's power cube theory. This framework, and its five interrelated elements (leadership as power, identifying vision and mission, synergy, interdependent role of partners, and determination), provide insight into the way coresearchers shared their experiences of participatory research methodology. We theorise power-sharing partnerships as a complementary platform hosting partners' shared strengths, skills, and experience, creating synergy in collaborative projects.

  9. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Full Text Available Robust design methodology aims at reducing the variability of product performance in the presence of noise factors. Experiments involving the simultaneous optimization of more than one quality characteristic are known as multiresponse experiments, which are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of a rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of the types Nominal-the-best, Smaller-the-better and Fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using an L9 orthogonal array.
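    The three response types named in this abstract are conventionally optimized in robust design via Taguchi signal-to-noise (S/N) ratios, which the experimenter maximizes across the L9 runs. A sketch of the standard textbook formulas (these are the generic Taguchi definitions, not formulas taken from the paper; the fraction-defective case uses the omega transform):

```python
import math

def sn_smaller_the_better(y):
    """S/N = -10 log10( (1/n) * sum(y_i^2) ): larger is better (smaller responses)."""
    return -10 * math.log10(sum(v * v for v in y) / len(y))

def sn_nominal_the_best(y):
    """S/N = 10 log10( ybar^2 / s^2 ): rewards hitting the target with low variance."""
    n = len(y)
    ybar = sum(y) / n
    s2 = sum((v - ybar) ** 2 for v in y) / (n - 1)  # sample variance
    return 10 * math.log10(ybar * ybar / s2)

def sn_fraction_defective(p):
    """Omega transform for a defect fraction p: S/N = -10 log10( p / (1 - p) )."""
    return -10 * math.log10(p / (1 - p))
```

In an L9 study, each of the nine runs yields one S/N value per response; the control-factor levels with the highest average S/N are then selected.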

  10. Directional Track Selection Technique in CR39 SSNTD for low-yield reaction experiments

    Science.gov (United States)

    Ingenito, Francesco; Andreoli, Pierluigi; Batani, Dimitri; Bonasera, Aldo; Boutoux, Guillaume; Burgy, Frederic; Cipriani, Mattia; Consoli, Fabrizio; Cristofari, Giuseppe; De Angelis, Riccardo; Di Giorgio, Giorgio; Ducret, Jean Eric; Giulietti, Danilo; Jakubowska, Katarzyna

    2018-01-01

    There is great interest in the study of p-11B aneutronic nuclear fusion reactions, both for energy production and for the determination of fusion cross-sections at low energies. In this context we performed experiments at CELIA in which energetic protons, accelerated by the ECLIPSE laser, were directed toward a solid boron target. Because of the small cross-sections at these energies, the number of expected reactions is low. CR39 Solid-State Nuclear Track Detectors (SSNTD) were used to detect the alpha particles produced. Because of the low expected yield, it is difficult to discriminate the tracks due to true fusion products from those due to natural background in the CR39. For this purpose we developed a methodology that recognizes particles according to their direction with respect to the detector normal and is able to determine the position of their source. We applied it to the specific geometry of the experiment, so as to select, from all the tracks, those due to particles coming from the region of interaction between the accelerated protons and the solid boron target. This technique can be of great help in the analysis of SSNTD data in low-yield reaction experiments, and can also be applied generally to any experiment where particles reach the track detector from known directions, for example to improve the detection limit of particle spectrometers using CR39.
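    The selection principle (keep only tracks whose measured incidence angle matches the angle expected for a particle arriving from the assumed source position) can be sketched in a few lines. The geometry, field names and the tolerance value below are illustrative assumptions, not values from the paper:

```python
import math

def expected_polar_angle(track_xy, source_xyz, detector_z=0.0):
    """Angle (degrees) between the detector normal (z-axis) and the line
    from the assumed source position to a track's (x, y) spot on the CR39 plane."""
    dx = track_xy[0] - source_xyz[0]
    dy = track_xy[1] - source_xyz[1]
    dz = detector_z - source_xyz[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    return math.degrees(math.acos(abs(dz) / r))

def select_tracks(tracks, source_xyz, tolerance_deg=5.0):
    """Keep tracks whose measured polar angle is consistent, within a tolerance,
    with a particle coming from the assumed source position."""
    return [t for t in tracks
            if abs(t["theta_deg"] - expected_polar_angle(t["xy"], source_xyz))
            <= tolerance_deg]
```

Background tracks, which arrive with random orientations, mostly fail the angular cut, while true fusion products from the interaction region pass it.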

  11. Configuring NIF for direct drive experiments

    International Nuclear Information System (INIS)

    Eimerl, D.; Rothenberg, J.; Key, M.

    1995-01-01

    The National Ignition Facility (NIF) is a proposed 1.8 MJ laser facility for carrying out experiments in inertial confinement fusion, currently designed for indirect drive experiments. The direct drive approach is being pursued at the 30 kJ Omega facility at the University of Rochester. In this paper we discuss the modifications to the NIF laser that would be required for both indirect and direct drive experiments. A primary concern is the additional cost of adding direct drive capability to the facility.

  12. Sensitivity analysis of critical experiment with direct perturbation compared to TSUNAMI-3D sensitivity analysis

    International Nuclear Information System (INIS)

    Barber, A. D.; Busch, R.

    2009-01-01

    The goal of this work is to obtain sensitivities from direct uncertainty analysis calculations and to correlate those values with the sensitivities produced by TSUNAMI-3D (Tools for Sensitivity and Uncertainty Analysis Methodology Implementation in Three Dimensions). A full sensitivity analysis is performed on a critical experiment to determine its overall uncertainty: small perturbation calculations are performed for all known uncertainties to obtain the total uncertainty of the experiment. The results of a critical experiment are only known as well as its geometric and material properties are. The goal of establishing this correlation is to simplify the uncertainty quantification process in assessing a critical experiment, while still considering all of the important parameters. (authors)
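    A direct-perturbation sensitivity of the kind compared here is conventionally defined as the relative change in the multiplication factor per relative change in an input parameter. A minimal sketch of that finite-difference estimate (the definition is the standard one; the function name and example numbers are ours):

```python
def relative_sensitivity(k_nominal, k_perturbed, p_nominal, p_perturbed):
    """S = (dk/k) / (dp/p): percent change in k-eff per percent change
    in a geometric or material parameter p, from two criticality runs."""
    dk_rel = (k_perturbed - k_nominal) / k_nominal
    dp_rel = (p_perturbed - p_nominal) / p_nominal
    return dk_rel / dp_rel
```

For example, if perturbing a density from 10.0 to 10.1 (a 1% change) moves k-eff from 1.000 to 1.001 (a 0.1% change), the relative sensitivity is 0.1; it is these directly computed values that are compared against the TSUNAMI-3D adjoint-based sensitivities.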

  13. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is 95% confidence that the POD is greater than 90% (90/95 POD). DOEPOD is a methodology, implemented in software, that serves as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on directly observed occurrences. It has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed logarithmic or similar POD functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generating a POD curve are not required. Applications of DOEPOD to support inspector qualification are also included.
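    The 90/95 criterion can be checked from observed hits alone, without fitting a POD curve, by computing a one-sided binomial (Clopper-Pearson) lower confidence bound on the hit rate. A sketch of that standard statistical calculation (this is the generic bound, not the DOEPOD code itself):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def pod_lower_bound(hits, n, confidence=0.95, tol=1e-9):
    """One-sided Clopper-Pearson lower confidence bound on POD:
    the smallest p with P(X >= hits | n, p) >= 1 - confidence,
    found by bisection on p."""
    if hits == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # P(X >= hits) = 1 - P(X <= hits - 1); it increases with p
        if 1 - binom_cdf(hits - 1, n, mid) >= 1 - confidence:
            hi = mid
        else:
            lo = mid
    return lo
```

For example, `pod_lower_bound(29, 29)` is about 0.902: 29 hits in 29 trials is the classic smallest all-hit sample that demonstrates 90% POD at 95% confidence.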

  14. Experience feedback from incidents: methodological and cultural aspects

    International Nuclear Information System (INIS)

    Perinet, R.

    2007-01-01

    EdF has designed provisions to improve the reliability of human interventions: an increased number of training simulators, management of the quality of interventions, the presence of human factors consultants on each site, improved user documentation, the development of communication practices, etc. However, despite these efforts in the right direction, the complexity of human behaviour and of organisations makes it necessary to follow up the efficacy of these provisions over time, in order to ensure that they produce the expected effects on work practices. IRSN's in-depth analysis of events that are significant for safety shows that experience feedback from incidents constitutes a real opportunity to ensure this follow-up. It also highlights how difficult it is for licensees to define the scope of the investigations to be carried out, to analyse the errors committed in greater depth, and to identify the ensuing problems. This article shows that these difficulties result from inappropriate methodologies and from a lack of skills and availability to carry out the analyses. Finally, it shows that an incident provokes defensive behaviour among those participating in the system, which blocks the compilation of information and limits the relevance of the analyses. (author)

  15. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS), based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to complete the ERPS implementation successfully. The development of a proper MAI methodology is deemed necessary because it simplifies the installation process of an ERPS. The goal directed project management method was chosen because it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are promotion and preparation, the proposal, the contract, the implementation and the completion. The methodology was applied in a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance through all installation and analysis stages, faster installation of the ERPS, and control and cost reduction for the installation in terms of time, manpower, technological equipment and other resources.

  16. Affective methodologies and experimenting with affirmative critiques of educational leadership

    DEFF Research Database (Denmark)

    Staunæs, Dorthe

    This paper will focus upon the identification of suitable and experimental methodologies for interrogating 'the affective turn' in European educational leadership. As an answer to the global GERM and the plea for improving learning outcomes, educational leadership in countries like Denmark seems... The paper concerns suitable methodologies for researching and experimenting with affirmative critique of these new forms of educational leadership. In order not just to 'quote', celebrate or reject this affective agenda, I ask how post-human, intra-active and performative approaches developed in Nordic feminist and anti-racist education studies may assist in experimenting with concepts and research formats, interrogate the (unforeseen) effects of affects and affective economies intertwined with new forms of educational leadership, and thereby formulate affirmative critiques of these new types of psy-leadership.

  17. Hungarian experience in using the IAEA planning methodologies

    International Nuclear Information System (INIS)

    Bacsko, M.

    1997-01-01

    The Hungarian Power Companies Ltd. has been using the IAEA planning methodologies since 1985 when it acquired the WASP model. Since then this model has been applied on a regular basis to determine the least cost expansion plan of the power generating system of the country. This report describes this experience as well as the application of the WASP model for other types of studies. (author)

  18. Hungarian experience in using the IAEA planning methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bacsko, M [Hungarian Power Companies Ltd, Budapest (Hungary)

    1997-09-01

    The Hungarian Power Companies Ltd. has been using the IAEA planning methodologies since 1985 when it acquired the WASP model. Since then this model has been applied on a regular basis to determine the least cost expansion plan of the power generating system of the country. This report describes this experience as well as the application of the WASP model for other types of studies. (author).

  19. Critical dialogical approach: A methodological direction for occupation-based social transformative work.

    Science.gov (United States)

    Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke

    2018-05-03

    Calls to embrace the potential and responsibility of occupational therapy to address the socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond the traditional frameworks informing practice, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking, or frameworks, used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire's and Bakhtin's work. Integrating our shared experience of taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing the approach are presented, based on its epistemological and theoretical stance and our shared experience of engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and the examination of taken-for-granted understandings that shape individuals' assumptions and actions.

  20. Direct photon experiments

    International Nuclear Information System (INIS)

    Boeggild, H.

    1986-11-01

    The author reviews experiments on direct photon production in hadronic collisions. After a description of the experimental methods for the study of such processes, he presents results on differential cross sections and the γ/π⁰ ratio in π⁻p, π⁺p, pp, and anti-pp processes, as well as in reactions of π⁻, π⁺, and p on carbon. (HSI)

  1. A Methodology to Institutionalise User Experience in Provincial Government

    Directory of Open Access Journals (Sweden)

    Marco Cobus Pretorius

    2014-12-01

    Full Text Available Problems experienced with website usability can prevent users from accessing and adopting technology such as e-Government. At present, a number of guidelines exist for e-Government website user experience (UX) design; however, the effectiveness of their implementation depends on the expertise of the website development team and on an organisation's understanding of UX. Despite the highlighted importance of UX, guidelines are rarely applied in South African e-Government website designs. UX guidelines cannot be implemented if there is a lack of executive support, trained staff, budget and user-centred design processes. The goal of this research is to propose and evaluate a methodology (called the "Institutionalise UX in Government" (IUXG) methodology) to institutionalise UX in South African Provincial Governments (SAPGs). The Western Cape Government in South Africa was used as a case study to evaluate the proposed IUXG methodology. The results show that the IUXG methodology can assist SAPGs to establish UX as standard practice and improve their UX maturity levels.

  2. Implementation and adaptation of a macro-scale methodology to calculate direct economic losses

    Science.gov (United States)

    Natho, Stephanie; Thieken, Annegret

    2017-04-01

    As one of the 195 member countries of the United Nations, Germany signed the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR). With this, though it is voluntary and non-binding, Germany agreed to report on achievements in reducing disaster impacts. Among other targets, the SFDRR aims at reducing direct economic losses in relation to global gross domestic product by 2030 - but how can this be measured without a standardized approach? The United Nations Office for Disaster Risk Reduction (UNISDR) has therefore proposed a methodology to estimate direct economic losses per event and country on the basis of the number of damaged or destroyed items in different sectors. The method is based on experience from developing countries; however, its applicability in industrialized countries has not been investigated so far. This study therefore presents the first implementation of the approach in Germany, testing its applicability to the costliest natural hazards and suggesting adaptations. The approach proposed by UNISDR considers assets in the sectors agriculture, industry, commerce, housing and infrastructure, the last by considering roads and medical and educational facilities. Asset values are estimated on the basis of sector- and event-specific numbers of affected items, sector-specific mean sizes per item, their standardized construction costs per square meter, and a loss ratio of 25%. The methodology was tested for the three costliest natural hazard types in Germany, i.e. floods, storms and hail storms, considering 13 case studies on the federal or state scale between 1984 and 2016. No complete calculation of all the sectors needed to describe the total direct economic loss was possible, owing to incomplete documentation; the method was therefore tested sector-wise. Three new modules were developed to better adapt the methodology to German conditions, covering private transport (cars), forestry and paved roads. Unpaved roads, in contrast, were integrated into the agricultural and
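    The UNISDR-style sector estimate described in this abstract multiplies the number of affected items by a mean item size, a standardized construction cost per square meter, and the 25% loss ratio. A minimal sketch of that arithmetic (the example figures are hypothetical, not from the study):

```python
def sector_direct_loss(n_items, mean_size_m2, construction_cost_per_m2,
                       loss_ratio=0.25):
    """UNISDR-style direct loss estimate for one sector:
    affected items x mean item size x construction cost per m2 x loss ratio."""
    return n_items * mean_size_m2 * construction_cost_per_m2 * loss_ratio

# Hypothetical housing-sector example: 1,000 damaged dwellings,
# 100 m2 each, 1,000 EUR/m2 construction cost -> 25 million EUR
housing_loss = sector_direct_loss(1000, 100, 1000)
```

Summing such sector estimates over agriculture, industry, commerce, housing and infrastructure yields the per-event direct economic loss that the SFDRR reporting requires.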

  3. Laboratory experiments in innovation research: A methodological overview and a review of the current literature

    OpenAIRE

    Brüggemann, Julia; Bizer, Kilian

    2016-01-01

    Innovation research has developed a broad set of methodological approaches in recent decades. In this paper, we propose laboratory experiments as a fruitful methodological addition to the existing methods in innovation research. Therefore, we provide an overview of the existing methods, discuss the advantages and limitations of laboratory experiments, and review experimental studies dealing with different fields of innovation policy, namely intellectual property rights, financi...

  4. Current Direct Neutrino Mass Experiments

    Directory of Open Access Journals (Sweden)

    G. Drexlin

    2013-01-01

    Full Text Available In this contribution, we review the status and perspectives of direct neutrino mass experiments, which investigate the kinematics of β-decays of specific isotopes (³H, ¹⁸⁷Re, ¹⁶³Ho) to derive model-independent information on the averaged electron (anti)neutrino mass. After discussing the kinematics of β-decay and the determination of the neutrino mass, we give a brief overview of past neutrino mass measurements (SN1987a time-of-flight studies, the Mainz and Troitsk experiments for ³H, cryobolometers for ¹⁸⁷Re). We then describe the Karlsruhe Tritium Neutrino (KATRIN) experiment currently under construction at Karlsruhe Institute of Technology, which will use the MAC-E-Filter principle to push the sensitivity down to 200 meV (90% C.L.). To do so, many technological challenges have to be solved relating to source intensity and stability, as well as to precision energy analysis and a low background rate close to the kinematic endpoint of tritium β-decay at 18.6 keV. We then review new approaches such as the MARE, ECHO, and Project8 experiments, which promise an independent measurement of the neutrino mass in the sub-eV region. Altogether, the novel methods developed in direct neutrino mass experiments will provide vital information on the absolute mass scale of neutrinos.

  5. Effort Flow Analysis: A Methodology for Directed Product Evolution Using Rigid Body and Compliant Mechanisms

    National Research Council Canada - National Science Library

    Greer, James

    2002-01-01

    This dissertation presents a systematic design methodology for directed product evolution that uses both rigid body and compliant mechanisms to facilitate component combination in the domain of mechanical products...

  6. Improving the Quality of Experience Journals: Training Educational Psychology Students in Basic Qualitative Methodology

    Science.gov (United States)

    Reynolds-Keefer, Laura

    2010-01-01

    This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course in the quality of observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…

  7. Leaders' mental health at work: Empirical, methodological, and policy directions.

    Science.gov (United States)

    Barling, Julian; Cloutier, Anika

    2017-07-01

    While employees' mental health is the focus of considerable attention from researchers, the public, and policymakers, leaders' mental health has almost escaped attention. We start by considering several reasons for this, followed by discussions of the effects of leaders' mental health on their own leadership behaviors, the emotional toll of high-quality leadership, and interventions to enhance leaders' mental health. We offer 8 possible directions for future research on leaders' mental health. Finally, we discuss methodological obstacles encountered when investigating leaders' mental health, and policy dilemmas raised by leaders' mental health. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. The effect of sensory brand experience and involvement on brand equity directly and indirectly through consumer brand engagement

    OpenAIRE

    Hepola, Janne; Karjaluoto, Heikki; Hintikka, Anni

    2017-01-01

    Purpose This study aims to examine the effect of sensory brand experience and involvement on brand equity directly and indirectly through cognitive, emotional and behavioral consumer brand engagement (CBE). Design/methodology/approach A survey was administered to the customers of a Finnish tableware brand using relevant Facebook channels. A total of 1,390 responses were analyzed using partial least squares structural equation modeling. Findings The empirical findings suggest ...

  9. Methodological issues involved in conducting qualitative research on support for nurses directly involved with women who chose to terminate their pregnancy

    Directory of Open Access Journals (Sweden)

    Antoinette Gmeiner

    2001-11-01

    Full Text Available The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses’ experience of being directly involved with termination of pregnancies and developing guidelines for support for these nurses. Summary: The purpose of this article is to describe the methodological issues surrounding the conduct of qualitative research in which nurses’ experience of their direct involvement in termination of pregnancy was explored and described. *Please note: This is a reduced version of the abstract. Please refer to PDF for full text.

  10. Current status of direct dark matter detection experiments

    Science.gov (United States)

    Liu, Jianglai; Chen, Xun; Ji, Xiangdong

    2017-03-01

    Much like ordinary matter, dark matter might consist of elementary particles, and weakly interacting massive particles are one of the prime suspects. During the past decade, the sensitivity of experiments trying to directly detect them has improved by three to four orders of magnitude, but solid evidence for their existence is yet to come. We overview the recent progress in direct dark matter detection experiments and discuss future directions.

  11. Methodological issues involved in conducting qualitative research ...

    African Journals Online (AJOL)

    The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses' experience of being directly involved with termination of pregnancies and developing guidelines for support for these nurses. The article points out the sensitivity and responsibility ...

  12. A new methodology for modeling of direct landslide costs for transportation infrastructures

    Science.gov (United States)

    Klose, Martin; Terhorst, Birgit

    2014-05-01

    The world's transportation infrastructure is at risk of landslides in many areas across the globe. Safety and affordability of traffic route operation are the two main criteria for transportation planning in landslide-prone areas. Balancing these often conflicting priorities requires, among other things, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs for transportation infrastructures. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation for representative case study areas. On the local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index which describes the costs per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, for the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard area. Annual average costs of €52,000 per km of highway at risk of landslides are identified as the cost index for a local case study area in this region.
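    The regionalization step described above reduces to simple arithmetic: the cost index (annual cost per km of route at risk) is scaled by the exposed route length obtained from the susceptibility model. A minimal sketch, using the figures quoted in the abstract (€52,000 per km per year, 77 km of exposed highway) as illustrative inputs; the function name is hypothetical:

    ```python
    def regional_cost_estimate(cost_index_eur_per_km_yr, exposed_km):
        """Extrapolate local landslide losses to a region by scaling the
        cost index with the route length at risk of landslides."""
        return cost_index_eur_per_km_yr * exposed_km

    # figures quoted for the Lower Saxon Uplands case study (1980-2010)
    annual_regional_cost = regional_cost_estimate(52_000, 77)
    # → roughly 4.0 million EUR per year for the region's exposed highways
    ```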

  13. Graduate students' teaching experiences improve their methodological research skills.

    Science.gov (United States)

    Feldon, David F; Peugh, James; Timmerman, Briana E; Maher, Michelle A; Hurst, Melissa; Strickland, Denise; Gilmore, Joanna A; Stiegelmeyer, Cindy

    2011-08-19

    Science, technology, engineering, and mathematics (STEM) graduate students are often encouraged to maximize their engagement with supervised research and minimize teaching obligations. However, the process of teaching students engaged in inquiry provides practice in the application of important research skills. Using a performance rubric, we compared the quality of methodological skills demonstrated in written research proposals for two groups of early career graduate students (those with both teaching and research responsibilities and those with only research responsibilities) at the beginning and end of an academic year. After statistically controlling for preexisting differences between groups, students who both taught and conducted research demonstrated significantly greater improvement in their abilities to generate testable hypotheses and design valid experiments. These results indicate that teaching experience can contribute substantially to the improvement of essential research skills.

  14. Experiências de pesquisa: entre escolhas metodológicas e percursos individuais Research experiences: between methodological choices and individual paths

    Directory of Open Access Journals (Sweden)

    Ana Paula Serrata Malfitano

    2011-06-01

    Full Text Available This text presents a research report based on the methodological experience developed in a doctoral dissertation. The aim is to discuss the possibilities and limits of research conducted by actors involved with the object under study. Specifically, the author departs from the situation of occupying the place of a professional participating in the proposition, implementation, and technical intervention of a social policy directed at children and adolescents living on the street, and subsequently carries out research on that ongoing experience. Drawing on the discussion of historical-materialist methodology, participant observation, and "participant objectivation" (that is, the centrality of the method in the act of objectifying one's own participation in order to understand and pursue changes in reality), the text defends the potential and richness of investigations carried out by the very actors involved in the work process. It is also important to recognize the absence of impartiality in this process, inherent in understanding the researcher's standpoint, and to stress the need for theoretical grounding to enable reflections that offer an in-depth apprehension of the reality in which one is directly involved.

  15. Direct cost analysis of intensive care unit stay in four European countries: applying a standardized costing methodology.

    Science.gov (United States)

    Tan, Siok Swan; Bakker, Jan; Hoogendoorn, Marga E; Kapila, Atul; Martin, Joerg; Pezzi, Angelo; Pittoni, Giovanni; Spronk, Peter E; Welte, Robert; Hakkaart-van Roijen, Leona

    2012-01-01

    The objective of the present study was to measure and compare the direct costs of intensive care unit (ICU) days at seven ICU departments in Germany, Italy, the Netherlands, and the United Kingdom by means of a standardized costing methodology. A retrospective cost analysis of ICU patients was performed from the hospital's perspective. The standardized costing methodology was developed on the basis of the availability of data at the seven ICU departments. It entailed the application of the bottom-up approach for "hotel and nutrition" and the top-down approach for "diagnostics," "consumables," and "labor." Direct costs per ICU day ranged from €1168 to €2025. Even though the distribution of costs varied by cost component, labor was the most important cost driver at all departments. The costs for "labor" amounted to €1629 at department G but were fairly similar at the other departments (€711 ± 115). Direct costs of ICU days vary widely between the seven departments. Our standardized costing methodology could serve as a valuable instrument to compare actual cost differences, such as those resulting from differences in patient case-mix. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  16. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
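    The screening step (2) can be illustrated with a crude variance-based measure: for each input, estimate how much of the output variance is explained by conditioning on that input alone, and keep only the inputs whose share exceeds a threshold. A minimal sketch under an invented toy model; the double-loop estimator below is far less efficient than the intelligent sampling strategies the report describes:

    ```python
    import random
    import statistics

    def model(x1, x2, x3):
        # hypothetical response: x1 dominates, x3 is nearly inert
        return 4.0 * x1 + 0.5 * x2 + 0.01 * x3

    def first_order_index(i, n_outer=200, n_inner=200, seed=1):
        """Crude estimate of Var(E[Y|X_i]) / Var(Y) by double-loop sampling,
        with all inputs drawn uniformly from [0, 1)."""
        rng = random.Random(seed)
        cond_means, all_y = [], []
        for _ in range(n_outer):
            xi = rng.random()                      # fix X_i at a sampled value
            ys = []
            for _ in range(n_inner):
                x = [rng.random() for _ in range(3)]
                x[i] = xi                          # condition on X_i
                ys.append(model(*x))
            cond_means.append(statistics.fmean(ys))
            all_y.extend(ys)
        return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

    scores = [first_order_index(i) for i in range(3)]
    # screening: keep only parameters whose variance share exceeds a threshold
    kept = [i for i, s in enumerate(scores) if s > 0.05]
    ```

    With the toy model above, only the first parameter survives screening, so the quantitative step (3) would proceed on that reduced set.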

  17. Analysis of Kyoto University reactor physics critical experiments using NCNSRC calculation methodology

    International Nuclear Information System (INIS)

    Amin, E.; Hathout, A.M.; Shouman, S.

    1997-01-01

    The Kyoto University reactor physics experiments on the university critical assembly are used to benchmark and validate the NCNSRC calculation methodology. This methodology has two lines, diffusion and Monte Carlo. The diffusion line includes the code WIMSD4 for cell calculations and the two-dimensional diffusion code DIXY2 for core calculations. The transport line uses the MULTIKENO code (VAX version). Analysis is performed for the criticality and the temperature coefficients of reactivity (TCR) of the different light water moderated and reflected cores utilized in the experiments. The results for both the eigenvalue and the TCR approximately reproduced the experimental and theoretical Kyoto results. However, some conclusions are drawn about the adequacy of the standard WIMSD4 library. This paper is an extension of the NCNSRC efforts to assess and validate computer tools and methods for the ET-RR-1 and ET-MMPR-2 research reactors. 7 figs., 1 tab

  18. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given of how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors could be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
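    The idea of attaching a confidence level to a safety factor can be sketched numerically: approximate the NDF of calculated-to-experimental (C/E) ratios by a normal distribution and take the quantile matching the designer's chosen confidence. The C/E values below are invented for illustration and do not come from the paper:

    ```python
    from statistics import NormalDist, fmean, stdev

    # hypothetical C/E (calculated-over-experimental) ratios for one
    # response R, collected from several integral experiments
    ce_ratios = [0.92, 0.97, 1.01, 0.95, 0.99, 0.94, 1.03, 0.96]

    mu, sigma = fmean(ce_ratios), stdev(ce_ratios)
    dist = NormalDist(mu, sigma)

    def safety_factor(confidence):
        """Multiplier on the calculated response so that, at the given
        confidence level, the adjusted prediction does not fall below the
        measured value (normal approximation of the NDF)."""
        worst_ce = dist.inv_cdf(1.0 - confidence)  # low quantile of C/E
        return 1.0 / worst_ce

    sf_95 = safety_factor(0.95)   # ≈ 1.1 for this invented sample
    ```

    A designer demanding higher confidence would read off a lower C/E quantile and hence a larger safety factor, which mirrors the trade-off described in the abstract.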

  19. Methodology for measurement in schools and kindergartens: experiences

    International Nuclear Information System (INIS)

    Fotjikova, I.; Navratilova Rovenska, K.

    2015-01-01

    In more than 1500 schools and preschool facilities, long-term radon measurement was carried out in the last 3 y. The negative effect of thermal retrofitting on the resulting long-term radon averages is evident. In some of the facilities, low ventilation rates and correspondingly high radon levels were found, so it was recommended to change ventilation habits. However, some of the facilities had high radon levels due to its ingress from soil gas. Technical measures should be undertaken to reduce radon exposure in this case. The paper presents the long-term experiences with the two-stage measurement methodology for investigation of radon levels in school and preschool facilities and its possible improvements. (authors)

  20. Bridging Minds: A Mixed Methodology to Assess Networked Flow.

    Science.gov (United States)

    Galimberti, Carlo; Chirico, Alice; Brivio, Eleonora; Mazzoni, Elvis; Riva, Giuseppe; Milani, Luca; Gaggioli, Andrea

    2015-01-01

    The main goal of this contribution is to present a methodological framework to study Networked Flow, a bio-psycho-social theory of collective creativity, applying it to creative processes occurring via a computer network. First, we draw on the definition of Networked Flow to identify the key methodological requirements of this model. Next, we present the rationale of a mixed methodology, which aims at combining qualitative, quantitative and structural analysis of group dynamics to obtain a rich longitudinal dataset. We argue that this integrated strategy holds potential for describing the complex dynamics of creative collaboration, by linking the experiential features of collaborative experience (flow, social presence), with the structural features of collaboration dynamics (network indexes) and the collaboration outcome (the creative product). Finally, we report on our experience with using this methodology in blended collaboration settings (including both face-to-face and virtual meetings), to identify open issues and provide future research directions.

  1. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  2. The Oil Drop Experiment: An Illustration of Scientific Research Methodology and its Implications for Physics Textbooks

    Science.gov (United States)

    Rodriguez, Maria A.; Niaz, Mansoor

    2004-01-01

    The objectives of this study are: (1) evaluation of the methodology used in recent search for particles with fractional electrical charge (quarks) and its implications for understanding the scientific research methodology of Millikan; (2) evaluation of 43 general physics textbooks and 11 laboratory manuals, with respect to the oil drop experiment,…

  3. Exploring light mediators with low-threshold direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahlhoefer, Felix [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); RWTH Aachen Univ. (Germany). Inst. for Theoretical Particle Physics and Cosmology; Kulkarni, Suchita [Oesterreichische Akademie der Wissenschaften, Vienna (Austria). Inst. fuer Hochenergiephysik; Wild, Sebastian [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2017-11-15

    We explore the potential of future cryogenic direct detection experiments to determine the properties of the mediator that communicates the interactions between dark matter and nuclei. Due to their low thresholds and large exposures, experiments like CRESST-III, SuperCDMS SNOLAB and EDELWEISS-III will have excellent capability to reconstruct mediator masses in the MeV range for a large class of models. Combining the information from several experiments further improves the parameter reconstruction, even when taking into account additional nuisance parameters related to background uncertainties and the dark matter velocity distribution. These observations may offer the intriguing possibility of studying dark matter self-interactions with direct detection experiments.

  4. Exploring light mediators with low-threshold direct detection experiments

    International Nuclear Information System (INIS)

    Kahlhoefer, Felix

    2017-11-01

    We explore the potential of future cryogenic direct detection experiments to determine the properties of the mediator that communicates the interactions between dark matter and nuclei. Due to their low thresholds and large exposures, experiments like CRESST-III, SuperCDMS SNOLAB and EDELWEISS-III will have excellent capability to reconstruct mediator masses in the MeV range for a large class of models. Combining the information from several experiments further improves the parameter reconstruction, even when taking into account additional nuisance parameters related to background uncertainties and the dark matter velocity distribution. These observations may offer the intriguing possibility of studying dark matter self-interactions with direct detection experiments.

  5. Omega experiments and preparation for moderate-gain direct-drive experiments on NIF

    International Nuclear Information System (INIS)

    McCrory, R.L.; Bahr, R.E.; Boehly, T.R.

    2000-01-01

    Direct-drive laser-fusion ignition experiments rely on detailed understanding and control of irradiation uniformity, Rayleigh-Taylor instability, and target fabrication. LLE is investigating various theoretical aspects of a direct-drive NIF ignition target based on an 'all-DT' design: a spherical target of ∼3.5 mm diameter with a 1- to 2-μm CH wall and a ∼350-μm DT-ice layer near the triple point of DT (∼19 K). OMEGA experiments are designed to address the critical issues related to direct-drive laser fusion and to provide the necessary data to validate the predictive capability of LLE computer codes. The future cryogenic targets used on OMEGA are hydrodynamically equivalent to those planned for the NIF. The current experimental studies on OMEGA address all of the essential components of direct-drive laser fusion: irradiation uniformity and laser imprinting, Rayleigh-Taylor growth and saturation, compressed core performance and shell-fuel mixing, laser-plasma interactions and their effect on target performance, and cryogenic target fabrication and handling. (authors)

  6. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology.

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid

    2017-03-16

    Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. 
We report a successful implementation of the methodology for the design and development

  7. Dark matter spin determination with directional direct detection experiments

    Science.gov (United States)

    Catena, Riccardo; Conrad, Jan; Döring, Christian; Ferella, Alfredo Davide; Krauss, Martin B.

    2018-01-01

    If dark matter has spin 0, only two WIMP-nucleon interaction operators can arise as leading operators from the nonrelativistic reduction of renormalizable single-mediator models for dark matter-quark interactions. Based on this crucial observation, we show that about 100 signal events at next generation directional detection experiments can be enough to enable a 2σ rejection of the spin 0 dark matter hypothesis in favor of alternative hypotheses where the dark matter particle has spin 1/2 or 1. In this context, directional sensitivity is crucial since anisotropy patterns in the sphere of nuclear recoil directions depend on the spin of the dark matter particle. For comparison, about 100 signal events are expected in a CF4 detector operating at a pressure of 30 torr with an exposure of approximately 26,000 cubic-meter-detector days for WIMPs of 100 GeV mass and a WIMP-fluorine scattering cross section of 0.25 pb. Comparable exposures require an array of cubic meter time projection chamber detectors.

  8. A Dynamic Methodology for Improving the Search Experience

    Directory of Open Access Journals (Sweden)

    Marcia D. Kerchner

    2006-06-01

    Full Text Available In the early years of modern information retrieval, the fundamental way in which we understood and evaluated search performance was by measuring precision and recall. In recent decades, however, models of evaluation have expanded to incorporate the information-seeking task and the quality of its outcome, as well as the value of the information to the user. We have developed a systems engineering-based methodology for improving the whole search experience. The approach focuses on understanding users’ information-seeking problems, understanding who has the problems, and applying solutions that address these problems. This information is gathered through ongoing analysis of site-usage reports, satisfaction surveys, Help Desk reports, and a working relationship with the business owners.

  9. Experience and Meaning in Qualitative Research: A Conceptual Review and a Methodological Device Proposal

    Directory of Open Access Journals (Sweden)

    Marianne Daher

    2017-07-01

    Full Text Available The relevance of experience and meaning in qualitative research is mostly accepted and is common ground for qualitative studies. However, there is an increasing trend towards trivializing the use of these notions. As a consequence, a mechanistic use of these terms has emerged within qualitative analysis, which has resulted in the loss of the original richness derived from the theoretical roots of these concepts. In this article, we aim to recover these origins by reviewing theoretical postulates from phenomenological and hermeneutic traditions and to propose their convergence in a holistic perspective. The challenge is to find the local source of meanings that will enlighten on how to understand people's experiences. This discussion is the basis for the encounter context themes (ECT) methodological device, which emphasizes the importance of studying experience and meaning as part of a larger whole: the participants' life-world. Hence, ECT seeks to complement the available methodological tools for qualitatively-oriented studies, recovering—rather than re-creating—a theoretical discussion useful for current qualitative research practices.

  10. Application of ASSET methodology and operational experience feedback of NPPs in China

    Energy Technology Data Exchange (ETDEWEB)

    Lan, Ziyong [The National Nuclear Safety Administration, Beijing (China)

    1997-10-01

    The introduction of the ASSET methodology to China started in March 1992, when 3 experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback was held in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996. The NNSA and the GNPP hosted the seminar; 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs.

  11. Application of ASSET methodology and operational experience feedback of NPPs in China

    International Nuclear Information System (INIS)

    Ziyong Lan

    1997-01-01

    The introduction of the ASSET methodology to China started in March 1992, when 3 experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback was held in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996. The NNSA and the GNPP hosted the seminar; 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs

  12. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    Science.gov (United States)

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Response surface methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
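    The screening stage of such a DoE study can be sketched with a two-level full factorial design: each factor's main effect is the average response at its high level minus the average at its low level, and large effects flag the additives worth carrying into response surface optimization. The factor names follow the abstract, but the assay function and its numbers are invented stand-ins for measured metabolic activity:

    ```python
    from itertools import product
    from statistics import fmean

    # hypothetical two-level screening design for three medium additives
    # (-1 = low level, +1 = high level)
    factors = ["HGF", "OSM", "FGF4"]
    runs = list(product([-1, 1], repeat=3))   # 2^3 = 8 screening runs

    def assay(hgf, osm, fgf4):
        # stand-in for the measured metabolic response of one run
        return 100 + 12 * hgf + 8 * osm + 1.5 * fgf4 + 2 * hgf * osm

    responses = [assay(*run) for run in runs]

    def main_effect(i):
        """Average response at the high level minus at the low level."""
        hi = fmean(r for run, r in zip(runs, responses) if run[i] == 1)
        lo = fmean(r for run, r in zip(runs, responses) if run[i] == -1)
        return hi - lo

    effects = {f: main_effect(i) for i, f in enumerate(factors)}
    ```

    In this toy setup HGF and OSM show large main effects while FGF4's is small, mirroring the kind of ranking a screening design produces before the response surface step pins down optimal concentrations.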

  13. Does direct experience matter?

    DEFF Research Database (Denmark)

    Miralles, Francesc; Giones, Ferran; Gozun, Brian

    2017-01-01

    Entrepreneurial behavior research has used intention models to explain how an individual’s beliefs shape the attitudes and motivations that influence entrepreneurial intention. Nevertheless, as entrepreneurship promotion initiatives become global, it becomes relevant to explore the consequences of being engaged in entrepreneurial behavior on entrepreneurial intention. We aim to shed light on whether the direct experience reinforces an individual’s entrepreneurial intention or reduces it. Building on an extended version of the planned behavior theory, we use the behavioral reasoning theory ... and an individual’s intention by introducing behavioral reasoning theory. These results provide support to initiatives to adapt entrepreneurship promotion efforts to the specific characteristics of the participants.

  14. Direct experience and the strength of the personal norm - behaviour relationship

    DEFF Research Database (Denmark)

    Thøgersen, John

    2002-01-01

    This study investigates whether the behavioral influence of personal norms with regard to repeated pro-social behavior depends on direct experience of this behavior. Based on previous norm and attitude research, it is hypothesized that (i) direct experience strengthens the influence of personal norms on behavior, and (ii) direct experience is a stronger moderator in this case than in the attitude-behavior case. The case in question is the purchase of organic red wine. It is found that the outcome of consumers' choice between organic and non-organic wine depends on their personal (moral) norms, after controlling for attitudes and subjective social norms. However, the influence of personal norms, though not of attitude, depends on whether the consumer has direct experience of buying organic red wine. Hence, both hypotheses are confirmed.

  15. A simple experiment with Microsoft Office 2010 and Windows 7 utilizing digital forensic methodology

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2013-03-01

    Full Text Available Digital forensic examiners are tasked with retrieving data from digital storage devices, and frequently these examiners are expected to explain the circumstances that led to the data being in its current state. Through written reports or verbal expert testimony delivered in court, digital forensic examiners are expected to describe whether data have been altered, and if so, to what extent. These expectations are addressed through the opinions digital forensic examiners reach concerning their understanding of electronic storage and retrieval methods. The credibility of these opinions derives from the scientific basis from which they are drawn using forensic methodology. Digital forensic methodology, being a scientific process, is derived from observations and repeatable findings in controlled environments. Furthermore, scientific research methods have established that causal conclusions can be drawn only when observed in controlled experiments. With this in mind, it seems beneficial for digital forensic examiners to have a library of experiments that they can perform, observing the results and deriving conclusions. After having conducted an experiment on a specific topic, a digital forensic examiner will be in a better position to express with confidence the state of the current data and perhaps the conditions that led to that state. This study provides a simple experiment using contemporary versions of the most widely used software applications running on the most commonly installed operating system. Here, using the Microsoft Office 2010 applications, a simple Word document, an Excel spreadsheet, a PowerPoint presentation, and an Access database are created and then modified. A forensic analysis is performed to determine the extent to which the changes to the data are identified. The value in this study is not that it yields new forensic analysis techniques, but rather that it illustrates a
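
A core primitive behind the kind of before/after comparison described above is cryptographic hashing of the evidence files. The sketch below is generic forensic practice, not the study's specific procedure: it hashes a stand-in file (a temporary file, not a real Office document), modifies it, and shows that the digest changes.

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    # Stream the file in chunks so large evidence files fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical stand-in for a document under examination:
# create it, hash it, modify it, and hash it again.
fd, path = tempfile.mkstemp(suffix=".docx")
os.close(fd)
with open(path, "wb") as f:
    f.write(b"original document contents")
before = sha256_of(path)

with open(path, "ab") as f:
    f.write(b" -- edited")
after = sha256_of(path)

print(before != after)  # True: any byte-level change alters the digest
os.remove(path)
```

Hash comparison only shows *that* data changed; attributing *how* it changed requires the kind of controlled, repeatable experiment the abstract advocates.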

  16. Experimental design and methodology for a new Moessbauer scan experiment: absorption line tracking

    International Nuclear Information System (INIS)

    Veiga, A.; Pasquevich, G. A.; Zelis, P. Mendoza; Sanchez, F. H.; Fernandez van Raap, M. B.; Martinez, N.

    2009-01-01

    A new experimental setup and methodology that allows the automatic tracking of a Moessbauer absorption line as its energy position varies during the experiment is introduced. As a test, the sixth spectral line of FeSn2 was tracked while the temperature was varied between room temperature and a value slightly above its Néel temperature.

  17. Preview of BPM6 Methodology and Analysis of Foreign Direct Investment in 2015 in Croatia

    Directory of Open Access Journals (Sweden)

    Šlogar Helena

    2017-06-01

    Full Text Available Foreign direct investments include equity capital, reinvested earnings and debt relations between ownership-related residents and non-residents. Since 31 October 2014, the Croatian National Bank has published its external statistics (balance of payments, foreign debt and the IIP) in accordance with the methodology prescribed by the sixth edition of the Balance of Payments and International Investment Position Manual (BPM6), thus changing the presentational form of direct investment. Direct investments are no longer classified according to the so-called directional principle (direct investment in Croatia and direct investment abroad), but, following BPM6, according to the so-called assets/liabilities principle. The aim is to point out the differences between the BPM5 and BPM6 standards and to determine which activities and which countries are most represented in the structure of direct investments in Croatia. By identifying the relevant activities and countries in the structure of foreign direct investment, relevant information is obtained about the macroeconomic state of the Republic of Croatia and about the opportunities and potential dangers that certain activities and countries present.

  18. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Directory of Open Access Journals (Sweden)

    Faycal Mimouni

    2016-04-01

    Full Text Available Purpose: Propose a modeling and analysis methodology, based on the combination of Bayesian networks and Petri networks, for the reverse logistics chain integrated into the direct supply chain. Design/methodology/approach: Network modeling by combining Petri and Bayesian networks. Findings: Modeling with a Bayesian network complemented with a Petri network to break the cycle problem in the Bayesian network. Research limitations/implications: Demands are independent from returns. Practical implications: The model can only be used on non-perishable products. Social implications: Legislative aspects: recycling laws; protection of the environment; client satisfaction via after-sale service. Originality/value: A Bayesian network with a cycle, combined with the Petri network.

  19. Direct and Indirect Harassment Experiences and Burnout among Academic Faculty in Japan.

    Science.gov (United States)

    Takeuchi, Masumi; Nomura, Kyoko; Horie, Saki; Okinaga, Hiroko; Perumalswami, Chithra R; Jagsi, Reshma

    2018-05-01

    The purpose of this study is three-fold: (1) to compare harassment (sexual, gender, and academic harassment both directly and indirectly experienced - i.e. "directly harassed" and "have seen or heard of someone who experienced harassment", respectively) experienced by males and females, (2) to investigate whether such experiences correlate with burnout, and (3) to explore whether social support might mitigate any such relationship between harassment and burnout. This cross-sectional study was conducted at a private university in Japan in February 2014 and is based on a work-life balance survey obtained from 330 academic faculty members. We investigated the association between each of the six subcategories of harassment (direct and indirect forms of each of the three types) and burnout using general linear regression models; we then evaluated interactions between harassment and social support in these models. The prevalence of direct and indirect experiences of harassment was higher in females than in males for all three types of harassment. Males showed higher burnout scores if they had direct experiences of harassment. There were significant interactions between social support and the direct experience of harassment; high social support mitigated the effect size of direct harassment on burnout among males. Females showed higher burnout scores if they had indirect experiences of harassment. However, the same buffering effect of social support on burnout as observed in males was not observed in females. Direct harassment experiences increased the risk of burnout in males, and indirect harassment experiences increased burnout in females.

  20. The use of the methodology of gender in the Constitutional Law. An experience of the first courses of Law Degree.

    OpenAIRE

    Pérez-Villalobos, María

    2014-01-01

    The fulfillment of the quality objectives demanded by the EHEA requires a methodological renovation of university education and an advance in the implementation of new methodologies, in our case in the legal sphere. This work presents a teaching experience in which efforts have been made to incorporate a new methodology into legal studies: the methodology of gender. This methodology is meant to produce and interpret legal norms by incorporating genuinely feminine values into the set of values of the...

  1. Strategic environmental noise mapping: methodological issues concerning the implementation of the EU Environmental Noise Directive and their policy implications.

    LENUS (Irish Health Repository)

    Murphy, E

    2010-04-01

    This paper explores methodological issues and policy implications concerning the implementation of the EU Environmental Noise Directive (END) across Member States. Methodologically, the paper focuses on two key thematic issues relevant to the Directive: (1) calculation methods and (2) mapping methods. For (1), the paper focuses, in particular, on how differing calculation methods influence noise prediction results as well as the value of the EU noise indicator L(den) and its associated implications for comparability of noise data across EU states. With regard to (2), emphasis is placed on identifying the issues affecting strategic noise mapping, estimating population exposure, noise action planning and dissemination of noise mapping results to the general public. The implication of these issues for future environmental noise policy is also examined.
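
The EU noise indicator L(den) referenced in the abstract is a day-evening-night level that weights the evening and night periods with penalties. A direct transcription of the formula from Annex I of Directive 2002/49/EC (period lengths 12 h / 4 h / 8 h with +5 dB and +10 dB penalties; the input levels below are illustrative):

```python
import math

def l_den(l_day, l_evening, l_night):
    """Day-evening-night level per Annex I of the Environmental Noise
    Directive: 12 h day, 4 h evening with a +5 dB penalty, and 8 h night
    with a +10 dB penalty, energy-averaged over 24 h."""
    return 10.0 * math.log10(
        (12.0 * 10.0 ** (l_day / 10.0)
         + 4.0 * 10.0 ** ((l_evening + 5.0) / 10.0)
         + 8.0 * 10.0 ** ((l_night + 10.0) / 10.0)) / 24.0
    )

# Illustrative period levels in dB(A); with this particular spacing the
# three penalized terms are equal, so L(den) equals the day level.
print(round(l_den(65.0, 60.0, 55.0), 1))  # 65.0
```

The cross-state comparability issue raised in the paper concerns how the input levels themselves are calculated, not this aggregation formula, which is fixed by the Directive.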

  2. Experience-Sampling Methodology with a Mobile Device in Fibromyalgia

    Directory of Open Access Journals (Sweden)

    Castilla Diana

    2012-01-01

    Full Text Available This work describes the usability studies conducted in the development of an experience-sampling methodology (ESM) system running on a mobile device. The goal of the system is to improve the accuracy and ecological validity of gathering daily self-report data in individuals suffering from a chronic pain condition, fibromyalgia. The usability studies showed that the software developed to conduct ESM with mobile devices (smartphones, cell phones) can be successfully used by individuals with fibromyalgia of different ages and with a low level of expertise in the use of information and communication technologies. 100% of users completed the tasks successfully, although some were completely computer illiterate. There also seems to be a clear difference in the mode of interaction observed in the two studies carried out.

  3. Abstract knowledge versus direct experience in processing of binomial expressions.

    Science.gov (United States)

    Morgan, Emily; Levy, Roger

    2016-12-01

    We ask whether word order preferences for binomial expressions of the form A and B (e.g. bread and butter) are driven by abstract linguistic knowledge of ordering constraints referencing the semantic, phonological, and lexical properties of the constituent words, or by prior direct experience with the specific items in question. Using forced-choice and self-paced reading tasks, we demonstrate that online processing of never-before-seen binomials is influenced by abstract knowledge of ordering constraints, which we estimate with a probabilistic model. In contrast, online processing of highly frequent binomials is primarily driven by direct experience, which we estimate from corpus frequency counts. We propose a trade-off wherein processing of novel expressions relies upon abstract knowledge, while reliance upon direct experience increases with increased exposure to an expression. Our findings support theories of language processing in which both compositional generation and direct, holistic reuse of multi-word expressions play crucial roles. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
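
The direct-experience component described above can be illustrated with a frequency-based order preference. This is a sketch with made-up counts, not the authors' model: it estimates P(A and B) from corpus counts of both orders, with add-alpha smoothing standing in for the abstract-knowledge prior that would dominate for never-before-seen binomials.

```python
# Hypothetical corpus frequency counts for the two orders of one binomial.
corpus_counts = {
    ("bread", "butter"): 980,   # illustrative, not real corpus data
    ("butter", "bread"): 20,
}

def preference(a, b, counts, alpha=1.0):
    """Smoothed probability of observing the order 'a and b' rather than
    'b and a'. Add-alpha smoothing keeps unseen orders at nonzero
    probability; in the paper's trade-off, this prior mass would come from
    the abstract ordering-constraint model for novel binomials."""
    f_ab = counts.get((a, b), 0) + alpha
    f_ba = counts.get((b, a), 0) + alpha
    return f_ab / (f_ab + f_ba)

print(round(preference("bread", "butter", corpus_counts), 3))  # 0.979
print(preference("tea", "coffee", corpus_counts))  # 0.5: unseen, no preference
```

As exposure (the counts) grows, the smoothing prior is swamped and the estimate converges to the empirical order preference, mirroring the proposed shift from abstract knowledge to direct experience.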

  4. Strategic environmental noise mapping: methodological issues concerning the implementation of the EU Environmental Noise Directive and their policy implications.

    Science.gov (United States)

    Murphy, E; King, E A

    2010-04-01

    This paper explores methodological issues and policy implications concerning the implementation of the EU Environmental Noise Directive (END) across Member States. Methodologically, the paper focuses on two key thematic issues relevant to the Directive: (1) calculation methods and (2) mapping methods. For (1), the paper focuses, in particular, on how differing calculation methods influence noise prediction results as well as the value of the EU noise indicator L(den) and its associated implications for comparability of noise data across EU states. With regard to (2), emphasis is placed on identifying the issues affecting strategic noise mapping, estimating population exposure, noise action planning and dissemination of noise mapping results to the general public. The implication of these issues for future environmental noise policy is also examined. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  5. Direct experience while eating: Laboratory outcomes among individuals with eating disorders versus healthy controls.

    Science.gov (United States)

    Elices, Matilde; Carmona, Cristina; Narváez, Vanessa; Seto, Victoria; Martin-Blanco, Ana; Pascual, Juan C; Soriano, José; Soler, Joaquim

    2017-12-01

    To compare individuals with eating disorders (EDs) to healthy controls (HCs) to assess for differences in direct engagement in the eating process. Participants (n=58) were asked to eat an orange slice. To assess the degree of direct engagement with the eating process, participants were asked to write down 10 thoughts about the experience of eating the orange slice. Next, the participants were instructed to classify the main focus of each thought as either experiential ("direct experience") or analytical ("thinking about"). A direct experience index (DEI) was computed by dividing the number of times that participants classified an experience as a "direct experience" (the numerator) by the total number of all observations (i.e., direct experience+thinking about). Participants also completed the Five Facet Mindfulness Questionnaire (FFMQ) and the Experiences Questionnaire (EQ) to assess mindfulness facets and decentering, respectively. Compared to controls, participants in the EDs group presented significantly lower levels of direct experience during the eating task (EDs group: mean=43.54, SD=29.64; HCs group: mean=66.17, SD=22.23, p=0.03). Participants in the EDs group also scored significantly lower on other mindfulness-related variables. These findings suggest that engagement with the direct experience of eating is lower in individuals with EDs. Future research should investigate the role of mindfulness-based interventions to address direct experience while eating in individuals with EDs. Copyright © 2017 Elsevier Ltd. All rights reserved.
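
The direct experience index described in the abstract is a simple proportion. A minimal transcription, with illustrative data (the thought classifications are made up; the abstract only gives group means):

```python
def direct_experience_index(classifications):
    """DEI as described in the abstract: the number of thoughts classified
    as 'direct experience', divided by the total number of classified
    thoughts, expressed on a 0-100 scale consistent with the reported
    group means (43.54 vs 66.17)."""
    direct = classifications.count("direct")
    return 100.0 * direct / len(classifications)

# Ten thoughts about eating the orange slice, each classified as
# experiential ("direct") or analytical ("thinking"); illustrative only.
thoughts = ["direct"] * 4 + ["thinking"] * 6
print(direct_experience_index(thoughts))  # 40.0
```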

  6. On the directional selectivity of tunneling experiments

    International Nuclear Information System (INIS)

    Beuermann, G.; Goettingen Univ.

    1981-01-01

    Using realistic parameters in a simplified model, the directional selectivity of tunneling experiments is discussed. Although perfect surfaces and barriers are assumed, quasiparticles coming from a wide solid angle may contribute substantially to the tunnel current. This must be taken into account in the case of gap anisotropy. (orig.)

  7. Answering the question, "what is a clinical nurse leader?": transition experience of four direct-entry master's students.

    Science.gov (United States)

    Bombard, Emily; Chapman, Kimberly; Doyle, Marcy; Wright, Danielle K; Shippee-Rice, Raelene V; Kasik, Dot Radius

    2010-01-01

    Understanding the experience of students learning the clinical nurse leader (CNL) role can be useful for faculty, preceptors, staff nurses, and interdisciplinary team members who guide them. This article analyzes the experience of four direct-entry master's students in the first cohort to complete the CNL curriculum and to sit for the pilot CNL certification examination. Using action research methodology, the students worked with the clinical immersion practicum faculty and a writing consultant to develop the study purpose, collect and analyze data, and prepare a manuscript. The main theme that emerged was, answering the question, "what is a CNL?" Subthemes supporting the main theme involved coming to the edge, trusting the process, rounding the corner, and valuing becoming. The analysis confirmed the value the CNL offers as a new vision to nursing education and practice. The students offered suggestions for the CNL curriculum and practicum. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. A Methodological Problem Associated with Researching Women Entrepreneurs

    OpenAIRE

    Beatrice E. Avolio

    2011-01-01

    This article highlights one of the most significant methodological problems of researching women entrepreneurs and understanding the female entrepreneurial experience, which is related to the definition of what is a women entrepreneur. The article outlines the state of research on women entrepreneurs, presents the diverse definitions used in research, conceptualizes the different aspects related to the definition of a woman entrepreneur, and proposes future directions for developing research ...

  9. Methodology for full comparative assessment of direct gross glycerin combustion in a flame tube furnace

    Energy Technology Data Exchange (ETDEWEB)

    Maturana, Aymer Yeferson; Pagliuso, Josmar D. [Dept. of Mechanical Engineering. Sao Carlos School of Engineering. University of Sao Paulo, Sao Carlos, SP (Brazil)], e-mails: aymermat@sc.usp.br, josmar@sc.usp.br

    2010-07-01

    The aim of this study is to develop a methodology to identify and evaluate the emissions and heat transfer associated with the combustion of gross glycerin, a by-product of the Brazilian biodiesel manufacturing process, as an alternative energy source. It aims to increase the present knowledge on the matter and to contribute to improving the economic and environmental perspective of the biodiesel industry. The methodology was designed for the assessment of gross glycerin combustion from three different types of biodiesel (bovine tallow, palm and soy). The procedures for evaluation and quantification of emissions of sulphur and nitrogen oxides, total hydrocarbons, carbon monoxide, carbon dioxide, and acrolein were analyzed, described and standardized. Experimental techniques for assessing the mutagenic and toxic effects of the gases were likewise analyzed and standardized, as were the calorific power, the associated heat transfer and fundamental operational parameters. The methodology was developed using a fully instrumented flame tube furnace, continuous gas analyzers, a chromatograph, automatic data acquisition systems and other auxiliary equipment. The mutagenic and toxic effects assessment was based on Tradescantia clone KU-20, using intoxication chambers together with biological analytical techniques that were previously developed or specially adapted. The benchmark for the initial setup was based on the performance evaluation of the equipment tested with diesel, considering its behavior during direct combustion. Finally, the following factors were defined for the combustion of crude glycerin: equipment configurations and operational parameters such as the air-fuel ratio, adiabatic temperature and other aspects necessary for the successful application of the methodology. The developed and integrated methodology was made available to the concerned industry, environmental authorities and researchers as a procedure to assess the viability of gross glycerin or similar fuels as

  10. Quality Assurance and Its Impact from Higher Education Institutions' Perspectives: Methodological Approaches, Experiences and Expectations

    Science.gov (United States)

    Bejan, Stelian Andrei; Janatuinen, Tero; Jurvelin, Jouni; Klöpping, Susanne; Malinen, Heikki; Minke, Bernhard; Vacareanu, Radu

    2015-01-01

    This paper reports on methodological approaches, experiences and expectations referring to impact analysis of quality assurance from the perspective of three higher education institutions (students, teaching staff, quality managers) from Germany, Finland and Romania. The presentations of the three sample institutions focus on discussing the core…

  11. Polar-direct-drive experiments on the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Hohenberger, M.; Radha, P. B.; Myatt, J. F.; Marozas, J. A.; Marshall, F. J.; Michel, D. T.; Regan, S. P.; Seka, W.; Shvydky, A.; Sangster, T. C.; Betti, R.; Boehly, T. R.; Bonino, M. J.; Collins, T. J. B.; Craxton, R. S.; Delettrez, J. A.; Edgell, D. H.; Epstein, R.; Fiksel, G.; Froula, D. H. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623-1299 (United States); and others

    2015-05-15

    To support direct-drive inertial confinement fusion experiments at the National Ignition Facility (NIF) [G. H. Miller, E. I. Moses, and C. R. Wuest, Opt. Eng. 43, 2841 (2004)] in its indirect-drive beam configuration, the polar-direct-drive (PDD) concept [S. Skupsky et al., Phys. Plasmas 11, 2763 (2004)] has been proposed. Ignition in PDD geometry requires direct-drive–specific beam smoothing, phase plates, and repointing the NIF beams toward the equator to ensure symmetric target irradiation. First experiments to study the energetics and preheat in PDD implosions at the NIF have been performed. These experiments utilize the NIF in its current configuration, including beam geometry, phase plates, and beam smoothing. Room-temperature, 2.2-mm-diam plastic shells filled with D₂ gas were imploded with total drive energies ranging from ∼500 to 750 kJ with peak powers of 120 to 180 TW and peak on-target irradiances at the initial target radius from 8 × 10¹⁴ to 1.2 × 10¹⁵ W/cm². Results from these initial experiments are presented, including measurements of shell trajectory, implosion symmetry, and the level of hot-electron preheat in plastic and Si ablators. Experiments are simulated with the 2-D hydrodynamics code DRACO including a full 3-D ray trace to model oblique beams, and models for nonlocal electron transport and cross-beam energy transfer (CBET). These simulations indicate that CBET affects the shell symmetry and leads to a loss of energy imparted onto the shell, consistent with the experimental data.
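
The quoted peak irradiances follow from spreading the peak beam power over the initial surface area of the 2.2-mm-diameter shell. A quick check (geometry only, no absorption or beam-overlap physics):

```python
import math

def irradiance_w_per_cm2(power_w, radius_cm):
    # Peak on-target irradiance: total power over the sphere's surface area.
    return power_w / (4.0 * math.pi * radius_cm ** 2)

r = 0.11  # cm, initial target radius (2.2 mm diameter shell)
print(f"{irradiance_w_per_cm2(120e12, r):.2e}")  # 7.89e+14, i.e. ~8e14 W/cm^2
print(f"{irradiance_w_per_cm2(180e12, r):.2e}")  # 1.18e+15, i.e. ~1.2e15 W/cm^2
```

Both values reproduce the abstract's quoted range of 8 × 10¹⁴ to 1.2 × 10¹⁵ W/cm² for 120 to 180 TW.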

  12. Dark matter effective field theory scattering in direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Schneck, K.; Cabrera, B.; Cerdeño, D. G.; Mandic, V.; Rogers, H. E.; Agnese, R.; Anderson, A. J.; Asai, M.; Balakishiyeva, D.; Barker, D.; Basu Thakur, R.; Bauer, D. A.; Billard, J.; Borgland, A.; Brandt, D.; Brink, P. L.; Bunker, R.; Caldwell, D. O.; Calkins, R.; Chagani, H.; Chen, Y.; Cooley, J.; Cornell, B.; Crewdson, C. H.; Cushman, P.; Daal, M.; Di Stefano, P. C. F.; Doughty, T.; Esteban, L.; Fallows, S.; Figueroa-Feliciano, E.; Godfrey, G. L.; Golwala, S. R.; Hall, J.; Harris, H. R.; Hofer, T.; Holmgren, D.; Hsu, L.; Huber, M. E.; Jardin, D. M.; Jastram, A.; Kamaev, O.; Kara, B.; Kelsey, M. H.; Kennedy, A.; Leder, A.; Loer, B.; Lopez Asamar, E.; Lukens, P.; Mahapatra, R.; McCarthy, K. A.; Mirabolfathi, N.; Moffatt, R. A.; Morales Mendoza, J. D.; Oser, S. M.; Page, K.; Page, W. A.; Partridge, R.; Pepin, M.; Phipps, A.; Prasad, K.; Pyle, M.; Qiu, H.; Rau, W.; Redl, P.; Reisetter, A.; Ricci, Y.; Roberts, A.; Saab, T.; Sadoulet, B.; Sander, J.; Schnee, R. W.; Scorza, S.; Serfass, B.; Shank, B.; Speller, D.; Toback, D.; Upadhyayula, S.; Villano, A. N.; Welliver, B.; Wilson, J. S.; Wright, D. H.; Yang, X.; Yellin, S.; Yen, J. J.; Young, B. A.; Zhang, J.

    2015-05-18

    We examine the consequences of the effective field theory (EFT) of dark matter–nucleon scattering for current and proposed direct detection experiments. Exclusion limits on EFT coupling constants computed using the optimum interval method are presented for SuperCDMS Soudan, CDMS II, and LUX, and the necessity of combining results from multiple experiments in order to determine dark matter parameters is discussed. We demonstrate that spectral differences between the standard dark matter model and a general EFT interaction can produce a bias when calculating exclusion limits and when developing signal models for likelihood and machine learning techniques. We also discuss the implications of the EFT for the next-generation (G2) direct detection experiments and point out regions of complementarity in the EFT parameter space.

  13. Current status and future directions of development of PR/PP evaluation methodology

    International Nuclear Information System (INIS)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D.

    2012-01-01

    A mandatory design requirement for the introduction of generation IV nuclear energy systems (NESs), proliferation resistance (PR), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a State in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear/radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop a methodology to evaluate the PR and PP components of NESs, and this work has now been reduced to two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of the PR and PP evaluation methodology. It also suggests some directions for future research.

  14. Current status and future directions of development of PR/PP evaluation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D. [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    A mandatory design requirement for the introduction of generation IV nuclear energy systems (NESs), proliferation resistance (PR), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a State in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear/radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop a methodology to evaluate the PR and PP components of NESs, and this work has now been reduced to two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of the PR and PP evaluation methodology. It also suggests some directions for future research.

  15. The Diurnal Variation of the Wimp Detection Event Rates in Directional Experiments

    CERN Document Server

    Vergados, J D

    2009-01-01

    The recent WMAP data have confirmed that exotic dark matter, together with the vacuum energy (cosmological constant), dominates in the flat Universe. Modern particle theories naturally provide viable cold dark matter candidates with masses in the GeV-TeV region. Supersymmetry provides the lightest supersymmetric particle (LSP), theories in extra dimensions supply the lightest Kaluza-Klein particle (LKP), etc. The nature of dark matter can only be unraveled by its direct detection in the laboratory. All such candidates will be called WIMPs (Weakly Interacting Massive Particles). In any case, the direct dark matter search, which amounts to detecting the recoiling nucleus following its collision with a WIMP, is central to particle physics and cosmology. In this work we briefly review the theoretical elements relevant to the direct dark matter detection experiments, paying particular attention to directional experiments, i.e., experiments in which not only the energy but also the direction of the recoiling nucleus is ob...

  16. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to a DLP of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions, similar to previous publications. Over-ranging is quantified with both absolute length and DLP, which contributes about 60 mGy-cm or about 10% of the DLP for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods--which have been shown to produce over-ranging lengths
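
Expressing over-ranging as DLP is just the extra irradiated length multiplied by the volume CT dose index. A one-line check with an assumed CTDIvol of 30 mGy, chosen because it is consistent with the figures quoted in the abstract (4.38 cm then corresponds to about 131 mGy-cm):

```python
def overranging_dlp(ctdi_vol_mgy, overrange_cm):
    # DLP contribution of over-ranging: CTDIvol times the extra scan length.
    return ctdi_vol_mgy * overrange_cm

# Illustrative CTDIvol of 30 mGy (assumed, not stated in the abstract)
# applied to the two measured over-ranging lengths.
print(round(overranging_dlp(30.0, 4.38), 1))  # 131.4 mGy-cm (pitch 0.5)
print(round(overranging_dlp(30.0, 6.72), 1))  # 201.6 mGy-cm (pitch 1.5)
```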

  17. Particle Discrimination Experiment for Direct Energy Conversion

    International Nuclear Information System (INIS)

    Yasaka, Y.; Kiriyama, Y.; Yamamoto, S.; Takeno, H.; Ishikawa, M.

    2005-01-01

    A direct energy conversion system designed for a D-3He fusion reactor based on a field-reversed configuration employs a venetian-blind-type converter for thermal ions to produce DC power and a traveling-wave-type converter for fusion protons to produce RF power. It is therefore necessary to separate, discriminate, and guide the particle species. For this purpose, a cusp magnetic field is proposed, in which the electrons are deflected and guided along the field lines to the line cusp, while the ions pass through the point cusp. A small-scale experimental device was used to study the basic characteristics of discrimination of electrons and ions in the cusp magnetic field. Ions separated from electrons are guided to an ion collector, which is operated as a one-stage direct energy converter. The conversion efficiency was measured for cases with different values of mean ion energy and energy spread. These experiments successfully demonstrate direct energy conversion from plasma beams using particle discrimination by a cusp magnetic field.

  18. A proposed experiment for studying the direct neutron-neutron interaction

    International Nuclear Information System (INIS)

    Hassan Fikry, A.R.; Maayouf, R.M.A.

    1979-01-01

    An experiment for studying the direct neutron-neutron interaction is suggested. The experiment is based on the combined use of an accelerator, e.g. an electron linear accelerator, together with a mobile pulsed reactor, or on the use of a pulsed beam reactor together with a mobile neutron generator.

  19. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    Science.gov (United States)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

    Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic hourly DNI data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville show significant improvements in terms of frequency distribution over the classical models. Applied to other locations with different climatological characteristics, the proposed methodology also yields better results than the classical models in terms of frequency distribution, reaching a 50% reduction in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
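The deterministic/stochastic split described above can be sketched as follows. The abstract does not specify the "simple classical model", so the well-known Erbs diffuse-fraction correlation stands in for it here, and a Gaussian residual is a placeholder for the distribution the authors fit to measured data:

```python
import math
import random

def erbs_diffuse_fraction(kt):
    """Diffuse fraction of GHI as a function of the hourly clearness
    index kt, from the Erbs (1982) correlation."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165

def synthetic_dni(ghi, extraterrestrial, zenith_deg, sigma=25.0, rng=random):
    """Deterministic DNI from the stand-in classical model, plus a
    placeholder Gaussian residual (the paper fits this to measurements)."""
    kt = ghi / extraterrestrial            # hourly clearness index
    dhi = erbs_diffuse_fraction(kt) * ghi  # diffuse horizontal irradiance
    cosz = math.cos(math.radians(zenith_deg))
    dni_det = (ghi - dhi) / cosz           # deterministic component
    return max(0.0, dni_det + rng.gauss(0.0, sigma))  # W/m^2, clipped at 0
```

In the paper's method the residual distribution is conditioned on the state of the sky rather than a fixed Gaussian; this sketch only shows where the two components enter.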

  20. An experiment for determining the Euler load by direct computation

    Science.gov (United States)

    Thurston, Gaylen A.; Stein, Peter A.

    1986-01-01

    A direct algorithm is presented for computing the Euler load of a column from experimental data. The method is based on exact inextensional theory for imperfect columns, which predicts two distinct deflected shapes at loads near the Euler load. The bending stiffness of the column appears in the expression for the Euler load along with the column length; therefore, the experimental data allow a direct computation of the bending stiffness. Experiments on graphite-epoxy columns of rectangular cross section are reported in the paper. The bending stiffness of each composite column computed from experiment is compared with predictions from laminated plate theory.
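For context, the classical relation behind the experiment is the Euler load of a pinned-pinned column, P_E = pi^2 EI / L^2. Since the bending stiffness EI appears in that expression with the column length L, a measured Euler load can be inverted for EI. This is the textbook relation only, not the paper's full inextensional-theory algorithm:

```python
import math

def euler_load(EI, L):
    """Euler buckling load of a pinned-pinned column: P_E = pi^2 * EI / L^2."""
    return math.pi ** 2 * EI / L ** 2

def bending_stiffness(P_E, L):
    """Invert the same relation to recover EI from a measured Euler load."""
    return P_E * L ** 2 / math.pi ** 2

# Round trip with illustrative numbers (EI in N*m^2, L in m, P in N).
P = euler_load(120.0, 0.5)
print(P, bending_stiffness(P, 0.5))
```

Other end conditions replace pi^2/L^2 by (pi/KL)^2 with an effective-length factor K, so the inversion would change accordingly.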

  1. Polar-Direct-Drive Experiments on OMEGA

    International Nuclear Information System (INIS)

    Marshall, F.J.; Craxton, R.S.; Bonino, M.J.; Epstein, R.; Glebov, V.Yu.; Jacobs-Perkins, D.; Knauer, J.P.; Marozas, J.A.; McKenty, P.W.; Noyes, S.G.; Radha, P.B.; Seka, W.; Skupsky, S.; Smalyuk

    2006-01-01

    Polar direct drive (PDD), a promising ignition path for the NIF while the beams are in the indirect-drive configuration, is currently being investigated on the OMEGA laser system by using 40 beams in six rings repointed to illuminate the target more uniformly. The OMEGA experiments are being performed with standard, "warm" targets with and without an equatorial, "Saturn-like" toroidally shaped CH ring. Target implosion symmetry is diagnosed by framed x-ray backlighting using additional OMEGA beams and by time-integrated x-ray imaging of the stagnating core.

  2. Methodological Issues and Practical Strategies in Research on Child Maltreatment Victims' Abilities and Experiences as Witnesses

    Science.gov (United States)

    Chae, Yoojin; Goodman, Gail S.; Bederian-Gardner, Daniel; Lindsay, Adam

    2011-01-01

    Scientific studies of child maltreatment victims' memory abilities and court experiences have important legal, psychological, and clinical implications. However, state-of-the-art research on child witnesses is often hindered by methodological challenges. In this paper, we address specific problems investigators may encounter when attempting such…

  3. Brand Experience in Banking Industry: Direct and Indirect Relationship to Loyalty

    Directory of Open Access Journals (Sweden)

    Nuri WULANDARI

    2016-02-01

    Full Text Available In marketing, the meaning of value is rapidly shifting from service and relationships to experiences. It is believed that the traditional value proposition is no longer effective for competing in the market and gaining customer loyalty. By adapting the brand experience model, this study validates the model in the banking industry, which currently faces intense competition to retain customers. The brand experience construct is tested for its direct and indirect relationships to loyalty. It is postulated that satisfaction and brand authenticity mediate the relationship between brand experience and loyalty. Research was conducted via in-depth interviews and a quantitative survey targeting bank customers in Jakarta. The results confirm that brand experience contributes to loyalty both directly and indirectly, in a significant and positive manner. The research contributes by validating previous studies, with a rare emphasis on the banking sector. The results imply that brand experience is an important driver of customer loyalty in this area and a subject of growing research on experience marketing.

  4. Both Direct and Vicarious Experiences of Nature Affect Children's Willingness to Conserve Biodiversity.

    Science.gov (United States)

    Soga, Masashi; Gaston, Kevin J; Yamaura, Yuichi; Kurisu, Kiyo; Hanaki, Keisuke

    2016-05-25

    Children are becoming less likely to have direct contact with nature. This ongoing loss of human interactions with nature, the extinction of experience, is viewed as one of the most fundamental obstacles to addressing global environmental challenges. However, the consequences for biodiversity conservation have been examined very little. Here, we conducted a questionnaire survey of elementary schoolchildren and investigated effects of the frequency of direct (participating in nature-based activities) and vicarious experiences of nature (reading books or watching TV programs about nature and talking about nature with parents or friends) on their affective attitudes (individuals' emotional feelings) toward and willingness to conserve biodiversity. A total of 397 children participated in the surveys in Tokyo. Children's affective attitudes and willingness to conserve biodiversity were positively associated with the frequency of both direct and vicarious experiences of nature. Path analysis showed that effects of direct and vicarious experiences on children's willingness to conserve biodiversity were mediated by their affective attitudes. This study demonstrates that children who frequently experience nature are likely to develop greater emotional affinity to and support for protecting biodiversity. We suggest that children should be encouraged to experience nature and be provided with various types of these experiences.

  5. OMEGA ICF experiments and preparations for direct drive on NIF

    International Nuclear Information System (INIS)

    McCrory, R.L.; Bahr, R.E.; Betti, R.

    2001-01-01

    Direct-drive laser-fusion ignition experiments rely on detailed understanding and control of irradiation uniformity, the Rayleigh-Taylor instability, and target fabrication. LLE is investigating various theoretical aspects of a direct-drive NIF ignition target based on an 'all-DT' design: a spherical target of ∼3.4-mm diameter, 1 to 2 μm of CH wall thickness, and an ∼340-μm DT-ice layer near the triple point of DT (∼19 K). OMEGA experiments are designed to address the critical issues related to direct-drive laser fusion and to provide the necessary data to validate the predictive capability of LLE computer codes. The cryogenic targets to be used on OMEGA are hydrodynamically equivalent to those planned for the NIF. The current experimental studies on OMEGA address the essential components of direct-drive laser fusion: irradiation uniformity and laser imprinting, Rayleigh-Taylor growth and saturation, compressed core performance and shell fuel mixing, laser plasma interactions and their effect on target performance, and cryogenic target fabrication and handling. (author)

  6. Sociocultural Meanings of Nanotechnology: Research Methodologies

    Science.gov (United States)

    Bainbridge, William Sims

    2004-06-01

    This article identifies six social-science research methodologies that will be useful for charting the sociocultural meaning of nanotechnology: web-based questionnaires, vignette experiments, analysis of web linkages, recommender systems, quantitative content analysis, and qualitative textual analysis. Data from a range of sources are used to illustrate how the methods can delineate the intellectual content and institutional structure of the emerging nanotechnology culture. Such methods will make it possible in future to test hypotheses such as that there are two competing definitions of nanotechnology - the technical-scientific and the science-fiction - that are influencing public perceptions by different routes and in different directions.

  7. Some historical tendencies of the methodological work direction in the education municipal level at the cuban context

    Directory of Open Access Journals (Sweden)

    Orlando Ramos-Álvarez

    2017-11-01

    Full Text Available This work proceeds from a theoretical and empirical systematization that shows the need to investigate how the direction of methodological work at the municipal level of education has evolved and which factors have conditioned its behavior. The basic assumptions that serve as referents for analyzing the relationships among the organizational-functional levels of methodological work are addressed. The study covers the period 1959-2017, taking 1959 as an important milestone in the conception of the Cuban educational system. When the Revolution triumphed, two conceptions of education coexisted in Cuban educational practice: on the one hand, the abandoned national education system, essentially pragmatic and totally incongruous; on the other, the aspirations of the Revolutionary Government, forged in the revolutionary struggle between 1953 and 1959.

  8. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how creative approaches may support a respectful renewal of phenomenological research traditions in nursing research.

  9. Execution of a self-directed risk assessment methodology to address HIPAA data security requirements

    Science.gov (United States)

    Coleman, Johnathan

    2003-05-01

    This paper analyzes the method and training of a self-directed risk assessment methodology entitled OCTAVE (Operationally Critical Threat, Asset, and Vulnerability Evaluation) at over 170 DOD medical treatment facilities. It focuses specifically on how OCTAVE built interdisciplinary, inter-hierarchical consensus and enhanced local capabilities to perform health information assurance. The risk assessment methodology was developed by the Software Engineering Institute at Carnegie Mellon University as part of the Defense Health Information Assurance Program (DHIAP). The basis for its success is the combination of analysis of organizational practices and technological vulnerabilities. Together, these areas address the core implications behind the HIPAA Security Rule and can be used to develop organizational protection strategies and technological mitigation plans. A key component of OCTAVE is the interdisciplinary composition of the analysis team (patient administration, IT staff, and clinicians). It is this unique composition of analysis team members, along with organizational and technical analysis of business practices, assets, and threats, that enables facilities to create sound and effective security policies. The risk assessment is conducted in-house, so the process, results, and knowledge remain within the organization, helping to build consensus in an environment of differing organizational and disciplinary perspectives on health information assurance.

  10. Hunting electroweakinos at future hadron colliders and direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cortona, Giovanni Grilli di [SISSA - International School for Advanced Studies,Via Bonomea 265, I-34136 Trieste (Italy); INFN - Sezione di Trieste,via Valerio 2, I-34127 Trieste (Italy)

    2015-05-07

    We analyse the mass reach for electroweakinos at future hadron colliders and their interplay with direct detection experiments. Motivated by the LHC data, we focus on split supersymmetry models with different electroweakino spectra. We find for example that a 100 TeV collider may explore Winos up to ∼7 TeV in low scale gauge mediation models or thermal Wino dark matter around 3 TeV in models of anomaly mediation with long-lived Winos. We show moreover how collider searches and direct detection experiments have the potential to cover large part of the parameter space even in scenarios where the lightest neutralino does not contribute to the whole dark matter relic density.

  11. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection and in waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology and the experience gained from methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties.

  12. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    Science.gov (United States)

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perceptions into color elements. Six indices (pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive) were used to assess patients' impressions. A total of 75 patients participated: 42 in Experiment 1 and 33 in Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize patient satisfaction. The experimental results show that the proposed method can find the optimal solution for the color design of a counseling room.
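A minimal sketch of the response-surface step: fit a second-order polynomial in the CIELAB coordinates (L, a, b) to satisfaction ratings and locate its maximum. The ratings below are synthetic, constructed to peak at the reported preferred color (L = 75, a = 0, b = -60); nothing here reproduces the study's actual data:

```python
import numpy as np

def features(X):
    """Full second-order (quadratic) basis in (L, a, b) for RSM fitting."""
    L, a, b = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones_like(L), L, a, b,
                            L * L, a * a, b * b, L * a, L * b, a * b])

rng = np.random.default_rng(0)
# Design points spread around the reported preferred color.
X = np.column_stack([rng.uniform(50, 100, 200),
                     rng.uniform(-40, 40, 200),
                     rng.uniform(-90, -30, 200)])
# Synthetic "satisfaction" surface peaking at (75, 0, -60).
y = 9 - (X[:, 0] - 75)**2 / 100 - X[:, 1]**2 / 50 - (X[:, 2] + 60)**2 / 80

# Least-squares fit of the response surface.
beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Grid search for the fitted optimum.
Ls, As, Bs = np.meshgrid(np.arange(50, 101, 5),
                         np.arange(-40, 41, 5),
                         np.arange(-90, -29, 5), indexing="ij")
G = np.column_stack([Ls.ravel(), As.ravel(), Bs.ravel()])
best = G[np.argmax(features(G) @ beta)]
print(best)  # recovers the synthetic optimum (75, 0, -60)
```

In a real study the ratings come from participants and the surface is fit per index; a grid search stands in here for the analytical stationary-point solution often used in RSM.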

  13. Directed Graph Methodology for Acquisition Path Analysis: a possible tool to support the state-level approach

    International Nuclear Information System (INIS)

    Vincze, Arpad; Nemeth, Andras

    2013-01-01

    According to a recent statement, the IAEA seeks to develop a more effective safeguards system to achieve greater deterrence, because deterrence of proliferation is much more effective than detection. To achieve this goal, a less predictable safeguards system is being developed based on the advanced state-level approach, which is driven by all available safeguards-relevant information. 'Directed graph analysis' is recommended as a possible methodology with which the IAEA could implement acquisition path analysis in support of the State evaluation process. The basic methodology is simple, well established, and powerful, and its adaptation to modelling the nuclear profile of a State requires minimal software development. Based on this methodology, the material flow network model has been developed under the Hungarian Support Programme to the IAEA and is described in detail. In the proposed model, materials in different chemical and physical forms can flow through pipes representing declared processes, material transports, diversions, or undeclared processes. The nodes of the network are the material types, while the edges of the network are the pipes. A state parameter (p) is assigned to each node and edge, representing the probability of its existence in the State. The possible application of this model in the State-level analytical approach is discussed, and an outlook for further work is given. The paper is followed by the slides of the presentation.
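A toy sketch of how path analysis on such a network might look: nodes are material types, edges are pipes with an existence probability p, and, assuming the edge probabilities are independent, a path's probability is the product of its edge probabilities. All node names and p values below are invented for illustration and are not from the Hungarian model:

```python
# Toy directed material-flow graph: node -> list of (next node, p).
edges = {
    "natural U": [("enriched U", 0.3), ("irradiated fuel", 0.6)],
    "enriched U": [("target material", 0.1)],
    "irradiated fuel": [("separated Pu", 0.2)],
    "separated Pu": [("target material", 0.4)],
}

def paths(node, target, prob=1.0, trail=()):
    """Depth-first enumeration of all paths to the target, carrying the
    product of edge probabilities along each path."""
    trail = trail + (node,)
    if node == target:
        yield trail, prob
        return
    for nxt, p in edges.get(node, []):
        yield from paths(nxt, target, prob * p, trail)

# Rank acquisition paths by probability, most likely first.
ranked = sorted(paths("natural U", "target material"), key=lambda tp: -tp[1])
for trail, prob in ranked:
    print(" -> ".join(trail), round(prob, 3))
```

Real acquisition path analysis also attaches probabilities to nodes and handles far larger networks; the sketch only shows the product-of-edges ranking idea.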

  14. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

    Full Text Available The study is aimed at investigating a justification of the new approach to the problem of vocational education development through the prism of the interdependence of research methodology and practice. This conceptual setup allows determining the main directions for teacher training modernization of vocational schools. The authors note that the current socio-economic situation in our country has actualized the problem of personnel training. Politicians, economists and scientists' speeches are all about the shortage of skilled personnel. They see the main reason for this catastrophic situation in the present system of primary and secondary vocational education. At the least, they are concerned over the current practice of training pedagogical personnel for vocational education, who are to restore the system of vocational education. Our country, Russia, has great positive experience in solving this problem. The scientific-methodological centre for vocational teacher education is the Russian State Vocational Pedagogical University, under the scientific direction of Academician of the Russian Academy of Education G. M. Romantsev. The reflection of the scientific-theoretical bases of this education led the authors to the analysis and designing (formation) of existent and new professional and pedagogical methodology. Methods. The fundamental position of A. M. Novikov on the generality of the research (scientific) and practical activity methodology has become the theoretical platform of the present study. Conceptual field, conceptual statements and professional model are presented as a whole system (or integrating factor). The theoretical framework has determined the logic of the study and its results. Scientific and educational methodology differentiation in terms of the subject of cognitive activity has allowed identifying the main scientific and practical disciplines of vocational teacher education. The creative concept as the subject ground is instrumental analysis of

  15. 42 CFR 441.472 - Budget methodology.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...

  16. From experience: applying the risk diagnosing methodology

    NARCIS (Netherlands)

    Keizer, Jimme A.; Halman, Johannes I.M.; Song, Michael

    2002-01-01

    No risk, no reward. Companies must take risks to launch new products speedily and successfully. The ability to diagnose and manage risks is increasingly considered of vital importance in high-risk innovation. This article presents the Risk Diagnosing Methodology (RDM), which aims to identify and

  17. From experience : applying the risk diagnosing methodology

    NARCIS (Netherlands)

    Keizer, J.A.; Halman, J.I.M.; Song, X.M.

    2002-01-01

    No risk, no reward. Companies must take risks to launch new products speedily and successfully. The ability to diagnose and manage risks is increasingly considered of vital importance in high-risk innovation. This article presents the Risk Diagnosing Methodology (RDM), which aims to identify and

  18. Both Direct and Vicarious Experiences of Nature Affect Children’s Willingness to Conserve Biodiversity

    Directory of Open Access Journals (Sweden)

    Masashi Soga

    2016-05-01

    Full Text Available Children are becoming less likely to have direct contact with nature. This ongoing loss of human interactions with nature, the extinction of experience, is viewed as one of the most fundamental obstacles to addressing global environmental challenges. However, the consequences for biodiversity conservation have been examined very little. Here, we conducted a questionnaire survey of elementary schoolchildren and investigated effects of the frequency of direct (participating in nature-based activities) and vicarious experiences of nature (reading books or watching TV programs about nature and talking about nature with parents or friends) on their affective attitudes (individuals’ emotional feelings) toward and willingness to conserve biodiversity. A total of 397 children participated in the surveys in Tokyo. Children’s affective attitudes and willingness to conserve biodiversity were positively associated with the frequency of both direct and vicarious experiences of nature. Path analysis showed that effects of direct and vicarious experiences on children’s willingness to conserve biodiversity were mediated by their affective attitudes. This study demonstrates that children who frequently experience nature are likely to develop greater emotional affinity to and support for protecting biodiversity. We suggest that children should be encouraged to experience nature and be provided with various types of these experiences.

  19. Methodologies and applications for critical infrastructure protection: State-of-the-art

    International Nuclear Information System (INIS)

    Yusta, Jose M.; Correa, Gabriel J.; Lacal-Arantegui, Roberto

    2011-01-01

    This work provides an update of the state-of-the-art on energy security relating to critical infrastructure protection. For this purpose, this survey is based upon the conceptual view of OECD countries, and specifically in accordance with EU Directive 114/08/EC on the identification and designation of European critical infrastructures, and on the 2009 US National Infrastructure Protection Plan. The review discusses the different definitions of energy security, critical infrastructure and key resources, and shows some of the experiences in countries considered as international references on the subject, including some information-sharing issues. In addition, the paper carries out a complete review of current methodologies, software applications and modelling techniques around critical infrastructure protection in accordance with their functionality in a risk management framework. The study of threats and vulnerabilities in critical infrastructure systems shows two important trends in methodologies and modelling. A first trend relates to the identification of methods, techniques, tools and diagrams to describe the current state of infrastructure. The other trend accomplishes a dynamic behaviour of the infrastructure systems by means of simulation techniques including systems dynamics, Monte Carlo simulation, multi-agent systems, etc. - Highlights: → We examine critical infrastructure protection experiences, systems and applications. → Some international experiences are reviewed, including the EU EPCIP Plan and the US NIPP programme. → We discuss current methodologies and applications on critical infrastructure protection, with emphasis on electric networks.

  20. A global view on ARAMIS, a risk assessment methodology for industries in the framework of the SEVESO II directive

    International Nuclear Information System (INIS)

    Salvi, Olivier; Debray, Bruno

    2006-01-01

    The ARAMIS methodology was developed in a European project co-funded under the Fifth Framework Programme of the European Commission, with the objective of answering the specific requirements of the SEVESO II directive. It offers an alternative to purely deterministic and probabilistic approaches to the risk assessment of process plants. It also answers the needs of the various stakeholders interested in the results of the risk assessment for land use or emergency planning, enforcement or, more generally, public decision-making. The methodology is divided into the following major steps: identification of major accident hazards (MIMAH); identification of the safety barriers and assessment of their performances; evaluation of the influence of safety management efficiency on barrier reliability; identification of reference accident scenarios (MIRAS); and assessment and mapping of the risk severity of reference scenarios and of the vulnerability of the plant surroundings. The methodology was tested in five case studies, which provided useful information about the applicability of the method and, by identifying its most sensitive parts, opened the way to new research activity for improved industrial safety.

  1. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Energy Technology Data Exchange (ETDEWEB)

    Mimouni, F.; Abouabdellah, A.

    2016-07-01

    We propose a modeling and analysis methodology for a reverse logistics chain integrated into the direct supply chain, based on the combination of Bayesian networks and Petri nets. The Bayesian network is complemented with a Petri net in order to break the cycle problem that arises in the Bayesian network. The model assumes that demands are independent of returns, and it can only be used for non-perishable products. Legislative and service aspects taken into account include recycling laws, protection of the environment, and client satisfaction via after-sales service. (Author)

  2. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular we have studied the design of methods and supporting software for this process. Detailed and systematic descriptions of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together with the supporting KNOWLEDGE-LINKER architecture. (au)

  3. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  4. Fuel coolant interaction experiment by direct electrical heating method

    International Nuclear Information System (INIS)

    Takeda, Tsuneo; Hirano, Kenmei

    1979-01-01

    In the PCM (Power Cooling Mismatch) experiments, the FCI (Fuel Coolant Interaction) test is one of the tests necessary to predict the various phenomena that occur during a PCM event in the core. A direct electrical heating method is used for the FCI tests at fuel pellet temperatures above 1000 °C; preheating is therefore required before initiating the direct electrical heating. The fuel pin used in the FCI tests is a typical LWR fuel element surrounded by coolant water. It is undesirable to heat up the coolant water while preheating the fuel pin. Therefore, a zirconia (ZrO2) pellet, which is similar to a UO2 pellet in physical and chemical properties, is used. The electrical conductivity of ZrO2 is particularly suitable for direct electrical heating, as in the case of UO2. In this experiment, melting of the ZrO2 pellet (melting point 2500 °C) was achieved by use of both preheating and direct electrical heating. Temperature changes of the coolant and fuel surface, as well as the pressure change of the coolant water, were measured. The molten fuel interacted with the coolant and generated shock waves. A portion of the molten fuel fragmented into small particles during this interaction. The peak pressure of the observed shock wave was about 35 bars. The damaged fuel pin was photographed after disassembly. This report presents the measured coolant pressure and temperature changes, as well as photographs of the damaged fuel pin and fuel fragments. (author)

  5. The relative weights of direct and indirect experiences in the formation of environmental risk beliefs.

    Science.gov (United States)

    Viscusi, W Kip; Zeckhauser, Richard J

    2015-02-01

    Direct experiences, we find, influence environmental risk beliefs more than the indirect experiences derived from outcomes to others. This disparity could have a rational basis. Or it could be based on behavioral proclivities in accord with the well-established availability heuristic or the vested-interest heuristic, which we introduce in this article. Using original data from a large, nationally representative sample, this article examines the perception of, and responses to, morbidity risks from tap water. Direct experiences have a stronger and more consistent effect on different measures of risk belief. Direct experiences also boost the precautionary response of drinking bottled water and drinking filtered water, while indirect experiences do not. These results are consistent with the hypothesized neglect of indirect experiences in other risk contexts, such as climate change. © 2014 Society for Risk Analysis.

  6. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF VOCATIONAL TEACHER EDUCATION

    Directory of Open Access Journals (Sweden)

    E. M. Dorozhkin

    2014-01-01

    Full Text Available The study aims to justify a new approach to the development of vocational education through the prism of the interdependence of research methodology and practice. This conceptual setup makes it possible to determine the main directions for modernizing the training of vocational-school teachers. The authors note that the current socio-economic situation in Russia has made the problem of personnel training acute: politicians, economists and scientists alike speak of a shortage of skilled personnel, and they see the main reason for this situation in the present system of primary and secondary vocational education. They are equally concerned about the current practice of training the pedagogical personnel of vocational education, who are expected to restore that system. Russia has considerable positive experience in solving this problem; the scientific-methodological centre for vocational teacher education is the Russian State Vocational Pedagogical University, under the scientific direction of G. M. Romantsev, Academician of the Russian Academy of Education. Reflection on the scientific-theoretical bases of this education led the authors to analyze and design both existing and new professional-pedagogical methodology. Methods. The fundamental position of A. M. Novikov on the common methodology of research (scientific) and practical activity serves as the theoretical platform of the present study. The conceptual field, conceptual statements and professional model are presented as a whole system (an integrating factor), and this theoretical framework determined the logic of the study and its results. Differentiating scientific and educational methodology in terms of the subject of cognitive activity made it possible to identify the main scientific and practical disciplines of vocational teacher education. The creative concept as the subject ground is instrumental

  7. Results from the DCH-1 [Direct Containment Heating] experiment

    International Nuclear Information System (INIS)

    Tarbell, W.W.; Brockmann, J.E.; Pilch, M.; Ross, J.E.; Oliver, M.S.; Lucero, D.A.; Kerley, T.E.; Arellano, F.E.; Gomez, R.D.

    1987-05-01

    The DCH-1 (Direct Containment Heating) test was the first experiment performed in the Surtsey Direct Heating Test Facility. The test involved 20 kg of molten core debris simulant ejected into a 1:10 scale model of the Zion reactor cavity. The melt was produced by a metallothermic reaction of iron oxide and aluminum powders to yield molten iron and alumina. The cavity model was placed so that the emerging debris propagated directly upwards along the vertical centerline of the chamber. Results from the experiment showed that the molten material was ejected from the cavity as a cloud of particles and aerosol. The dispersed debris caused a rapid pressurization of the 103-m3 chamber atmosphere. Peak pressures from the six transducers ranged from 0.09 to 0.13 MPa (13.4 to 19.4 psig) above the initial value in the chamber. Posttest debris collection yielded 11.6 kg of material outside the cavity, of which approximately 1.6 kg was attributed to the uptake of oxygen by the iron particles. Mechanical sieving of the recovered debris showed a lognormal size distribution with a mass mean size of 0.55 mm. Aerosol measurements indicated a substantial portion (2 to 16%) of the ejected mass was in the size range less than 10 μm aerodynamic equivalent diameter.

  8. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for the evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology.

  9. Direct potable reuse microbial risk assessment methodology: Sensitivity analysis and application to State log credit allocations.

    Science.gov (United States)

    Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P

    2018-01-01

    Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and to design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
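
    The log-credit arithmetic behind such comparisons can be sketched in a few lines. The sketch below is purely illustrative: the raw-water density, dose-response parameter k, ingestion volume, and the use of a simple exponential dose-response model are all assumptions for the example, not values or models from the study.

```python
import math

def annual_infection_risk(raw_density, log_reduction, volume_per_day, k, days=365):
    """Annual infection probability for one pathogen, assuming an
    exponential dose-response model P(inf) = 1 - exp(-k * dose)."""
    treated = raw_density * 10 ** (-log_reduction)  # organisms/L after treatment
    dose = treated * volume_per_day                 # organisms ingested per day
    p_daily = 1 - math.exp(-k * dose)
    return 1 - (1 - p_daily) ** days

# Hypothetical numbers: 1e5 viruses/L in raw wastewater, 14-log reduction,
# 2 L/day consumption, dose-response parameter k = 0.1
risk = annual_infection_risk(1e5, 14, 2.0, 0.1)
```

    With these made-up inputs, a 14-log reduction keeps the annual risk below a 1-in-10,000 benchmark while an 8-log reduction does not, which is the kind of consistency check the methodology formalizes.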

  10. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  11. First Human Experience with Directly Image-able Iodinated Embolization Microbeads

    Energy Technology Data Exchange (ETDEWEB)

    Levy, Elliot B., E-mail: levyeb@cc.nih.gov; Krishnasamy, Venkatesh P. [National Institutes of Health, Center for Interventional Oncology (United States); Lewis, Andrew L.; Willis, Sean; Macfarlane, Chelsea [Biocompatibles, UK Ltd, A BTG International Group Company (United Kingdom); Anderson, Victoria [National Institutes of Health, Center for Interventional Oncology (United States); Bom, Imramsjah MJ van der [Clinical Science IGT Systems North & Latin America, Philips, Philips, Image Guided Interventions (United States); Radaelli, Alessandro [Image-Guided Therapy Systems, Philips, Philips, Image Guided Interventions (Netherlands); Dreher, Matthew R. [Biocompatibles, UK Ltd, A BTG International Group Company (United Kingdom); Sharma, Karun V. [Children’s National Medical Center (United States); Negussie, Ayele; Mikhail, Andrew S. [National Institutes of Health, Center for Interventional Oncology (United States); Geschwind, Jean-Francois H. [Department of Radiology and Biomedical Imaging (United States); Wood, Bradford J. [National Institutes of Health, Center for Interventional Oncology (United States)

    2016-08-15

    Purpose: To describe the first clinical experience with a directly image-able, inherently radio-opaque microspherical embolic agent for transarterial embolization of liver tumors. Methodology: LC Bead LUMI™ is a new product based upon sulfonate-modified polyvinyl alcohol hydrogel microbeads with covalently bound iodine (~260 mg I/ml). 70–150 μm LC Bead LUMI™ iodinated microbeads were injected selectively via a 2.8 Fr microcatheter to near complete flow stasis into hepatic arteries in three patients with hepatocellular carcinoma, carcinoid, or neuroendocrine tumor. A custom imaging platform tuned for LC LUMI™ microbead conspicuity using a cone beam CT (CBCT)/angiographic C-arm system (Allura Clarity FD20, Philips) was used along with CBCT embolization treatment planning software (EmboGuide, Philips). Results: LC Bead LUMI™ image-able microbeads were easily delivered and monitored during the procedure using fluoroscopy, single-shot radiography (SSD), digital subtraction angiography (DSA), dual-phase enhanced and unenhanced CBCT, and unenhanced conventional CT obtained 48 h after the procedure. Intra-procedural imaging demonstrated tumor at risk for potential under-treatment, defined as a paucity of image-able microbeads within a portion of the tumor, which was confirmed at 48 h CT imaging. Fusion of pre- and post-embolization CBCT identified vessels without beads that corresponded to enhancing tumor tissue in the same location on follow-up imaging (48 h post). Conclusion: LC Bead LUMI™ image-able microbeads provide real-time feedback and geographic localization of treatment during the procedure. The distribution and density of image-able beads within a tumor need further evaluation as an additional endpoint for embolization.

  12. Optimizing the conditions for the microwave-assisted direct liquefaction of Ulva prolifera for bio-oil production using response surface methodology

    International Nuclear Information System (INIS)

    Liu, Junhai; Zhuang, Yingbin; Li, Yan; Chen, Limei; Guo, Jingxue; Li, Demao; Ye, Naihao

    2013-01-01

    Microwave-assisted direct liquefaction (MADL) of Ulva prolifera was performed in ethylene glycol (EG) using sulfuric acid (H2SO4) as a catalyst. Response Surface Methodology (RSM) based on a central composite rotatable design (CCRD) was employed to optimize three independent variables (catalyst content, solvent-to-feedstock ratio and temperature) for liquefaction yield. The bio-oil was analyzed by elementary analysis, Fourier transform infrared spectroscopy (FT-IR) and gas chromatography–mass spectrometry (GC–MS). The maximum liquefaction yield was 93.17%, obtained under a microwave power of 600 W for 30 min at 165 °C with a solvent-to-feedstock ratio of 18.87:1 and 4.93% sulfuric acid. The bio-oil was mainly composed of phthalic acid esters, alkenes and fatty acid methyl esters with long chains from C16 to C20. - Highlights: • Ulva prolifera was converted to bio-oil through microwave-assisted direct liquefaction. • Response surface methodology was used to optimize the liquefaction technology. • A maximum liquefaction rate of 93.17 wt% bio-oil was obtained. • The bio-oil was composed of carboxylic acids and esters
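
    The core computation in an RSM/CCRD study of this kind, fitting a second-order polynomial to the design points and locating its stationary point, can be sketched with ordinary least squares. All design points and yield values below are invented for illustration; they are not the study's data.

```python
import numpy as np

# Hypothetical CCRD in two coded factors (e.g., catalyst content, temperature):
# 4 factorial points, 4 axial points at +/-1.414, 2 center replicates.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],
              [0, 0], [0, 0]])
y = np.array([70.0, 78.0, 75.0, 85.0, 72.0, 83.0, 74.0, 82.0, 92.0, 91.5])

def quad_features(X):
    """Second-order model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Stationary point of the fitted surface: solve grad(yhat) = 0.
b = coef[1:3]
B = np.array([[2 * coef[4], coef[3]], [coef[3], 2 * coef[5]]])
x_opt = np.linalg.solve(B, -b)
y_opt = float(quad_features(x_opt.reshape(1, 2)) @ coef)
```

    A negative-definite B confirms the stationary point is a maximum; decoding x_opt from coded back to natural units would give the optimal settings, analogous to the 165 °C / 18.87:1 / 4.93% optimum reported above.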

  13. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
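
    The abstract's central recommendation, comparing regression models by cross-validated error and checking that the differences are statistically significant, can be illustrated without the RRegrs package itself. Below is a minimal numpy-only sketch with two toy models and a paired t statistic over folds; the data and models are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=100)

def cv_rmse(fit_predict, X, y, k=5):
    """Per-fold RMSE for a model given as a fit-and-predict function."""
    idx = np.arange(len(y))
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = fit_predict(X[train], y[train], X[fold])
        scores.append(np.sqrt(np.mean((y[fold] - pred) ** 2)))
    return np.array(scores)

def linreg(Xtr, ytr, Xte):
    """Ordinary least squares with an intercept."""
    A = np.column_stack([np.ones(len(Xtr)), Xtr])
    w, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return np.column_stack([np.ones(len(Xte)), Xte]) @ w

def mean_model(Xtr, ytr, Xte):
    """Baseline: always predict the training mean."""
    return np.full(len(Xte), ytr.mean())

rmse_lin = cv_rmse(linreg, X, y)
rmse_base = cv_rmse(mean_model, X, y)

# Paired t statistic over folds: is the improvement consistent, not just lucky?
d = rmse_base - rmse_lin
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
```

    The same skeleton extends to any number of models; RRegrs adds the model zoo, repeated splits, and formal significance testing on top of this idea.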

  14. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  15. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. A proposed methodology for the calculation of direct consumption of fossil fuels and electricity for livestock breeding, and its application to Cyprus

    International Nuclear Information System (INIS)

    Kythreotou, Nicoletta; Florides, Georgios; Tassou, Savvas A.

    2012-01-01

    On-farm energy consumption is becoming increasingly important in the context of rising energy costs and concerns over greenhouse gas emissions. For farmers throughout the world, energy inputs represent a major and rapidly increasing cost. In many countries such as Cyprus, however, there is a lack of systematic research on energy use in agriculture, which hinders benchmarking and evaluation of approaches and investment decisions for energy improvement. This study established a methodology for the estimation of the direct consumption of fossil fuels and electricity for livestock breeding, excluding transport, for locations where full data sets are not available. This methodology was then used to estimate fossil fuel and electricity consumption for livestock breeding in Cyprus. For 2008, this energy was found to be equivalent to 40.3 GWh, which corresponds to 8% of the energy used in agriculture. Differences between the energy consumption per animal in Cyprus and other countries were found to be mainly due to differences in climatic conditions and technologies used in the farms. -- Highlights: ► A methodology to calculate energy consumption in farming applied to Cyprus. ► Annual consumption per animal was estimated to be 565 kWh/cow, 537 kWh/sow and 0.677 kWh/chicken. ► Direct energy consumption in livestock breeding is estimated at 40.3 GWh in 2008.
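
    The bottom-up arithmetic implied by the highlights (a per-animal annual coefficient multiplied by the animal population, summed over categories) is simple enough to sketch. The coefficients come from the abstract; the herd sizes below are illustrative placeholders, not the Cyprus census figures.

```python
# Per-animal direct energy coefficients from the abstract (kWh per head per year)
COEFF_KWH = {"cow": 565.0, "sow": 537.0, "chicken": 0.677}

def livestock_energy_gwh(populations):
    """Total direct fossil-fuel and electricity use in GWh/yr."""
    total_kwh = sum(COEFF_KWH[animal] * n for animal, n in populations.items())
    return total_kwh / 1e6  # kWh -> GWh

# Illustrative populations only
demo = {"cow": 25000, "sow": 40000, "chicken": 3000000}
total = livestock_energy_gwh(demo)
```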

  17. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object-oriented techniques, software development is becoming more complex than ever. Based on that, this article presents a methodology for software documentation and analyzes our experience and how this methodology can aid software maintenance.

  18. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The development of the methodology and practical aspects of applying administrative neuromarketing is the subject of this work; the object of the article is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing, together with its philosophy, culture, functions, tasks and principles, and examines a technique for the logical analysis of the possibility of applying methods of administrative neuromarketing for incre...

  19. First, Get Your Feet Wet: The Effects of Learning from Direct and Indirect Experience on Team Creativity

    Science.gov (United States)

    Gino, Francesca; Argote, Linda; Miron-Spektor, Ella; Todorova, Gergana

    2010-01-01

    How does prior experience influence team creativity? We address this question by examining the effects of task experience acquired directly and task experience acquired vicariously from others on team creativity in a product-development task. Across three laboratory studies, we find that direct task experience leads to higher levels of team…

  20. A Methodological Report: Adapting the 505 Change-of-Direction Speed Test Specific to American Football.

    Science.gov (United States)

    Lockie, Robert G; Jalilvand, Farzad; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A

    2017-02-01

    Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.
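
    The COD deficit described above is a simple difference: the A505 time minus the 0-10 yd sprint split, isolating the time cost of the turn from linear speed. A minimal sketch (all times hypothetical):

```python
def cod_deficit(a505_time, ten_yd_split):
    """Change-of-direction deficit in seconds; larger values mean
    more time lost to the 180-degree turn itself."""
    return a505_time - ten_yd_split

# Hypothetical athletes: similar linear speed, different turning ability
athletes = [
    {"name": "A", "a505": 2.45, "ten_yd": 1.72},
    {"name": "B", "a505": 2.61, "ten_yd": 1.70},
]
deficits = {a["name"]: round(cod_deficit(a["a505"], a["ten_yd"]), 2) for a in athletes}
```

    Here athlete B is marginally faster over 10 yd yet loses more time in the turn, which is exactly the distinction the deficit is designed to expose.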

  1. Methodological Characteristics and Future Directions for Plyometric Jump Training Research: A Scoping Review.

    Science.gov (United States)

    Ramirez-Campillo, Rodrigo; Álvarez, Cristian; García-Hermoso, Antonio; Ramírez-Vélez, Robinson; Gentil, Paulo; Asadi, Abbas; Chaabene, Helmi; Moran, Jason; Meylan, Cesar; García-de-Alcaraz, Antonio; Sanchez-Sanchez, Javier; Nakamura, Fabio Y; Granacher, Urs; Kraemer, William; Izquierdo, Mikel

    2018-05-01

    Recently, there has been a proliferation of published articles on the effect of plyometric jump training, including several review articles and meta-analyses. However, these types of research articles are generally of narrow scope. Furthermore, methodological limitations among studies (e.g., a lack of active/passive control groups) prevent the generalization of results, and these factors need to be addressed by researchers. On that basis, the aims of this scoping review were to (1) characterize the main elements of plyometric jump training studies (e.g., training protocols) and (2) provide future directions for research. From 648 potentially relevant articles, 242 were eligible for inclusion in this review. The main issues identified related to an insufficient number of studies conducted in females, youths, and individual sports (~ 24.0, ~ 37.0, and ~ 12.0% of overall studies, respectively); insufficient reporting of effect size values and training prescription (~ 34.0 and ~ 55.0% of overall studies, respectively); and studies missing an active/passive control group and randomization (~ 40.0 and ~ 20.0% of overall studies, respectively). Furthermore, plyometric jump training was often combined with other training methods and added to participants' daily training routines (~ 47.0 and ~ 39.0% of overall studies, respectively), thus distorting conclusions on its independent effects. Additionally, most studies lasted no longer than 7 weeks. In future, researchers are advised to conduct plyometric training studies of high methodological quality (e.g., randomized controlled trials). More research is needed in females, youth, and individual sports. Finally, the identification of specific dose-response relationships following plyometric training is needed to specifically tailor intervention programs, particularly in the long term.

  2. Characterising dark matter searches at colliders and direct detection experiments: Vector mediators

    International Nuclear Information System (INIS)

    Buchmueller, Oliver; Dolan, Matthew J.; Malik, Sarah A.; McCabe, Christopher

    2015-01-01

    We introduce a Minimal Simplified Dark Matter (MSDM) framework to quantitatively characterise dark matter (DM) searches at the LHC. We study two MSDM models where the DM is a Dirac fermion which interacts with a vector or axial-vector mediator. The models are characterised by four parameters: mDM, Mmed, gDM and gq, the DM and mediator masses, and the mediator couplings to DM and quarks respectively. The MSDM models accurately capture the full event kinematics, and the dependence on all masses and couplings can be systematically studied. The interpretation of mono-jet searches in this framework can be used to establish an equal-footing comparison with direct detection experiments. For theories with a vector mediator, LHC mono-jet searches possess better sensitivity than direct detection searches for light DM masses (≲5 GeV). For axial-vector mediators, LHC and direct detection searches generally probe orthogonal directions in the parameter space. We explore the projected limits of these searches from the ultimate reach of the LHC and multi-ton xenon direct detection experiments, and find that the complementarity of the searches remains. In conclusion, we provide a comparison of limits in the MSDM and effective field theory (EFT) frameworks to highlight the deficiencies of the EFT framework, particularly when exploring the complementarity of mono-jet and direct detection searches.

  3. Application of Direct Assessment Approaches and Methodologies to Cathodically Protected Nuclear Waste Transfer Lines

    International Nuclear Information System (INIS)

    Dahl, Megan M.; Pikas, Joseph; Edgemon, Glenn L.; Philo, Sarah

    2013-01-01

    The U.S. Department of Energy's (DOE) Hanford Site is responsible for the safe storage, retrieval, treatment, and disposal of approximately 54 million gallons (204 million liters) of radioactive waste generated since the site's inception in 1943. Today, the major structures involved in waste management at Hanford include 149 carbon steel single-shell tanks, 28 carbon-steel double-shell tanks, plus a network of buried metallic transfer lines and ancillary systems (pits, vaults, catch tanks, etc.) required to store, retrieve, and transfer waste within the tank farm system. Many of the waste management systems at Hanford are still in use today. In response to uncertainties regarding the structural integrity of these systems, an independent, comprehensive integrity assessment of the Hanford Site piping system was performed. It was found that regulators do not require the cathodically protected pipelines located within the Hanford Site to be assessed by External Corrosion Direct Assessment (ECDA) or any other method used to ensure integrity. However, a case study is presented discussing the application of the direct assessment process on pipelines in such a nuclear environment. Assessment methodology and assessment results are contained herein. An approach is described for the monitoring, integration of outside data, and analysis of this information in order to identify whether coating deterioration accompanied by external corrosion is a threat to these waste transfer lines.

  4. Transcranial direct current stimulation (tDCS) in behavioral and food addiction: A systematic review of efficacy, technical and methodological issues

    Directory of Open Access Journals (Sweden)

    Anne eSauvaget

    2015-10-01

    Full Text Available Objectives. Behavioral addictions (BA) are complex disorders for which pharmacological and psychotherapeutic treatments have shown their limits. Non-invasive brain stimulation, among which transcranial direct current stimulation (tDCS), has opened up new perspectives in addiction treatment. The purpose of this work is to conduct a critical and systematic review of tDCS efficacy, and of technical and methodological considerations, in the field of BA. Methods. A bibliographic search was conducted on the Medline and ScienceDirect databases until December 2014, based on the following selection criteria: clinical studies on tDCS and BA (namely eating disorders, compulsive buying, Internet addiction, pathological gambling, sexual addiction, sports addiction, video game addiction). Study selection, data analysis and reporting were conducted according to the PRISMA guidelines. Results. Out of 402 potential articles, seven studies were selected. Focusing so far essentially on abnormal eating, these studies suggest that tDCS (right prefrontal anode / left prefrontal cathode) reduces food craving induced by visual stimuli. Conclusions. Despite methodological and technical differences between studies, the results are promising. So far, only a few studies of tDCS in BA have been conducted. New research is recommended on the use of tDCS in BA other than eating disorders.

  5. Czechoslovakia's participation in IAEA's INIS/AGRIS direct access experiment

    International Nuclear Information System (INIS)

    Stanik, Z.

    1980-01-01

    The task of establishing direct access to the INIS data base is being implemented in Czechoslovakia by the Nuclear Information Centre in Prague-Zbraslav. The aim of the experiment is to build a Czechoslovak network of terminals linked to the IAEA, with the possibility of future connections to other data bases. The first stage is characterized by the use of a dial-up line. (M.S.)

  6. Stages of Learning during a Self-Directed Stress Management Experience

    Science.gov (United States)

    Larson, Karl L.

    2015-01-01

    Purpose: The purpose of the study was to document the stages of learning reflected through student journaling during a self-directed experience in stress management, and the relationship of those stages to a historical model. Methods: College students participating in a full-semester course in stress management theory were required to select a…

  7. 24 CFR 904.205 - Training methodology.

    Science.gov (United States)

    2010-04-01

    ... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and learning experience. Methods to be utilized may include group presentations, small discussion groups, special classes...

  8. The NASA Carbon Airborne Flux Experiment (CARAFE): instrumentation and methodology

    Science.gov (United States)

    Wolfe, Glenn M.; Kawa, S. Randy; Hanisco, Thomas F.; Hannun, Reem A.; Newman, Paul A.; Swanson, Andrew; Bailey, Steve; Barrick, John; Thornhill, K. Lee; Diskin, Glenn; DiGangi, Josh; Nowak, John B.; Sorenson, Carl; Bland, Geoffrey; Yungel, James K.; Swenson, Craig A.

    2018-03-01

    The exchange of trace gases between the Earth's surface and atmosphere strongly influences atmospheric composition. Airborne eddy covariance can quantify surface fluxes at local to regional scales (1-1000 km), potentially helping to bridge gaps between top-down and bottom-up flux estimates and offering novel insights into biophysical and biogeochemical processes. The NASA Carbon Airborne Flux Experiment (CARAFE) utilizes the NASA C-23 Sherpa aircraft with a suite of commercial and custom instrumentation to acquire fluxes of carbon dioxide, methane, sensible heat, and latent heat at high spatial resolution. Key components of the CARAFE payload are described, including the meteorological, greenhouse gas, water vapor, and surface imaging systems. Continuous wavelet transforms deliver spatially resolved fluxes along aircraft flight tracks. Flux analysis methodology is discussed in depth, with special emphasis on quantification of uncertainties. Typical uncertainties in derived surface fluxes are 40-90 % for a nominal resolution of 2 km or 16-35 % when averaged over a full leg (typically 30-40 km). CARAFE has successfully flown two missions in the eastern US in 2016 and 2017, quantifying fluxes over forest, cropland, wetlands, and water. Preliminary results from these campaigns are presented to highlight the performance of this system.
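    The eddy covariance principle underlying flux retrievals like CARAFE's can be illustrated with a minimal sketch: the kinematic flux is the mean product of the fluctuations of vertical wind and the scalar of interest (Reynolds decomposition). All numbers and variable names below are invented for illustration; this is not the CARAFE wavelet processing chain, which resolves fluxes along the flight track.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000                                      # samples along one flight leg
w = rng.normal(0.0, 0.5, n)                     # vertical wind (m/s), synthetic
c = 400.0 + 0.2 * w + rng.normal(0.0, 1.0, n)   # CO2 (ppm), partly correlated with w

# Eddy covariance flux: covariance of the fluctuations about the leg means
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)               # kinematic flux, ppm m s^-1
```

With the synthetic coupling above, the expected flux is 0.2 times the wind variance; a wavelet-based estimator would instead localize this covariance in space rather than averaging over the whole leg.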

  9. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  10. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  11. A Virtual Reconstruction Methodology for Archaeological Heritage in East Asia – Practical Experience from the Re-relic Program in China

    Directory of Open Access Journals (Sweden)

    Yan He

    2013-11-01

    Full Text Available There is as much abundance of archaeological heritage in East Asia as there is diversity in the methodology for its reconstruction and representation. The Re-relic program in China recognizes the uniqueness of archaeological heritage in East Asia and has developed a tailored virtual reconstruction methodology that is both scientifically robust and accessible for public interpretation. The theoretical considerations and field experience accumulated over the years should contribute to the global understanding of the value and techniques of virtual reconstruction, while testifying to the principles of the Seville Charter.

  12. PIA and REWIND: Two New Methodologies for Cross Section Adjustment

    Energy Technology Data Exchange (ETDEWEB)

    Palmiotti, G.; Salvatores, M.

    2017-02-01

    This paper presents two new cross-section adjustment methodologies intended to cope with the problem of compensation. The first, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy for which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications to different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.

  13. Experience sampling methodology in mental health research: new insights and technical developments.

    Science.gov (United States)

    Myin-Germeys, Inez; Kasanova, Zuzana; Vaessen, Thomas; Vachon, Hugo; Kirtley, Olivia; Viechtbauer, Wolfgang; Reininghaus, Ulrich

    2018-06-01

    In the mental health field, there is a growing awareness that the study of psychiatric symptoms in the context of everyday life, using experience sampling methodology (ESM), may provide a powerful and necessary addition to more conventional research approaches. ESM, a structured self-report diary technique, allows the investigation of experiences within, and in interaction with, the real-world context. This paper provides an overview of how zooming in on the micro-level of experience and behaviour using ESM adds new insights and additional perspectives to standard approaches. More specifically, it discusses how ESM: a) contributes to a deeper understanding of psychopathological phenomena, b) makes it possible to capture variability over time, c) aids in identifying internal and situational determinants of variability in symptomatology, and d) enables a thorough investigation of the interaction between the person and his/her environment and of real-life social interactions. In addition to improving assessment of psychopathology and its underlying mechanisms, ESM contributes to advancing and changing clinical practice by allowing a more fine-grained evaluation of treatment effects as well as by providing the opportunity for extending treatment beyond the clinical setting into real life with the development of ecological momentary interventions. Furthermore, this paper provides an overview of the technical details of setting up an ESM study in terms of design, questionnaire development and statistical approaches. Overall, although a number of considerations and challenges remain, ESM offers one of the best opportunities for personalized medicine in psychiatry, from both a research and a clinical perspective. © 2018 World Psychiatric Association.

  14. Symbol Grounding Without Direct Experience: Do Words Inherit Sensorimotor Activation From Purely Linguistic Context?

    Science.gov (United States)

    Günther, Fritz; Dudschig, Carolin; Kaup, Barbara

    2018-05-01

    Theories of embodied cognition assume that concepts are grounded in non-linguistic, sensorimotor experience. In support of this assumption, previous studies have shown that upwards response movements are faster than downwards movements after participants have been presented with words whose referents are typically located in the upper vertical space (and vice versa for downwards responses). This is taken as evidence that processing these words reactivates sensorimotor experiential traces. This congruency effect was also found for novel words, after participants learned these words as labels for novel objects that they encountered either in their upper or lower visual field. While this indicates that direct experience with a word's referent is sufficient to evoke said congruency effects, the present study investigates whether this direct experience is also a necessary condition. To this end, we conducted five experiments in which participants learned novel words from purely linguistic input: Novel words were presented in pairs with real up- or down-words (Experiment 1); they were presented in natural sentences where they replaced these real words (Experiment 2); they were presented as new labels for these real words (Experiment 3); and they were presented as labels for novel combined concepts based on these real words (Experiments 4 and 5). In all five experiments, we did not find any congruency effects elicited by the novel words; however, participants were always able to make correct explicit judgements about the vertical dimension associated with the novel words. These results suggest that direct experience is necessary for reactivating experiential traces, but this reactivation is not a necessary condition for understanding (in the sense of storing and accessing) the corresponding aspects of word meaning. Copyright © 2017 Cognitive Science Society, Inc.

  15. Changing public perceptions of genetically modified foods: Effects of consumer information and direct product experience

    DEFF Research Database (Denmark)

    Scholderer, Joachim; Bech-Larsen, Tino; Grunert, Klaus G.

    and values. Two policies can be adopted in such a situation: (a) consumers can be actively informed regarding the risks and benefits and (b) consumers can be given the opportunity to evaluate products on the basis of direct experience. The effectiveness of both policies was tested in two experiments....... In experiment 1, attitude change experiments were conducted with consumers from Denmark, Germany, Italy and the UK (N=1650). Different information strategies were tested against a control group for their ability to change consumers' attitudes and their influence on product choice. Results indicate...... that no attitude change occured. Instead, all stategies seemed to bolster pre-existing attitudes, thereby significantly decreasing consumers' preferences for GM products. The effect did not occur when consumers only saw a labeled product example. In experiment 2, we tested the effects of direct experience...

  16. Methodological Issues and Practices in Qualitative Research.

    Science.gov (United States)

    Bradley, Jana

    1993-01-01

    Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…

  17. GENESIS OF METHODOLOGY OF MANAGEMENT BY DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    Full Text Available In this article, the genesis of the methodology of managing organizational development, understood as the set of methodological approaches and methods in use, is investigated. The results of a comparative analysis of methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experimentation and case-study methodology. Approaches to the formation of a new methodology and technique for investigating the sources of an organization's competitive advantages are considered.

  18. Factors that affect implementation of a nurse staffing directive: results from a qualitative multi-case evaluation.

    Science.gov (United States)

    Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E

    2016-08-01

    To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March to June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four-domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high-implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low-implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels, especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  19. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models behind predictive simulations. The cost-saving goals of these programs require the number of validation experiments to be minimized. The utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow results to be extended to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost.

  20. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Williams, Brian [Los Alamos National Laboratory; Mc Clure, Patrick [Los Alamos National Laboratory; Nelson, Ralph A [IDAHO NATIONAL LAB

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models behind predictive simulations. The cost-saving goals of these programs require the number of validation experiments to be minimized. The utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow results to be extended to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost.

  1. The NASA Carbon Airborne Flux Experiment (CARAFE): instrumentation and methodology

    Directory of Open Access Journals (Sweden)

    G. M. Wolfe

    2018-03-01

    Full Text Available The exchange of trace gases between the Earth's surface and atmosphere strongly influences atmospheric composition. Airborne eddy covariance can quantify surface fluxes at local to regional scales (1–1000 km), potentially helping to bridge gaps between top-down and bottom-up flux estimates and offering novel insights into biophysical and biogeochemical processes. The NASA Carbon Airborne Flux Experiment (CARAFE) utilizes the NASA C-23 Sherpa aircraft with a suite of commercial and custom instrumentation to acquire fluxes of carbon dioxide, methane, sensible heat, and latent heat at high spatial resolution. Key components of the CARAFE payload are described, including the meteorological, greenhouse gas, water vapor, and surface imaging systems. Continuous wavelet transforms deliver spatially resolved fluxes along aircraft flight tracks. Flux analysis methodology is discussed in depth, with special emphasis on quantification of uncertainties. Typical uncertainties in derived surface fluxes are 40–90 % for a nominal resolution of 2 km or 16–35 % when averaged over a full leg (typically 30–40 km). CARAFE has successfully flown two missions in the eastern US in 2016 and 2017, quantifying fluxes over forest, cropland, wetlands, and water. Preliminary results from these campaigns are presented to highlight the performance of this system.

  2. Direct experience and the course of eating disorders in patients on partial hospitalization: a pilot study.

    Science.gov (United States)

    Soler, Joaquim; Soriano, José; Ferraz, Liliana; Grasa, Eva; Carmona, Cristina; Portella, Maria J; Seto, Victoria; Alvarez, Enric; Pérez, Víctor

    2013-09-01

    Awareness of sensory experience in the present moment is central to mindfulness practice. This type of information processing, in contrast to an analytical, evaluative style of processing, could be more beneficial for the course of those psychiatric disorders characterized by ruminative and content-centred processing, such as eating disorders (EDs). We performed a pilot study to assess the relation between patients' approach to information processing and the duration and severity of EDs. Fifty-seven patients with a diagnosed ED were included in the study and participated in a self-guided eating activity to assess the primary information-processing mode based on the mindfulness concepts of 'Direct Experience' and 'Thinking About'. Additionally, dispositional mindfulness was assessed by the Five Factors Mindfulness Questionnaire, and anxiety during the experiment was determined by means of a 10-point visual analogue scale. We found that a higher level of self-reported Direct Experience was inversely associated with several severity variables and with anxiety levels. Direct Experience was predicted by a low anxiety level, less severe illness, and higher scores on one mindfulness facet (Observing). Our results suggest that a Direct Experience processing approach is associated with better ED outcomes. Future studies should be carried out to clarify the repercussion of mindfulness training on EDs. Copyright © 2013 John Wiley & Sons, Ltd and Eating Disorders Association.

  3. Direct and Indirect Experiences with Heterosexism: How Slurs Impact All Students

    Science.gov (United States)

    Norris, Alyssa L.; McGuire, Jenifer K.; Stolz, Cheryl

    2018-01-01

    Students targeted by homophobic discrimination are at risk for poor academic outcomes, yet few studies have examined how witnessing discrimination affects students. This study examined the impact of direct and indirect experiences of heterosexism on feelings of safety, belongingness, and connectedness among a sample of 1,702 students at a public…

  4. Privacy and CHI : methodologies for studying privacy issues

    NARCIS (Netherlands)

    Patil, S.; Romero, N.A.; Karat, J.

    2006-01-01

    This workshop aims to reflect on methodologies to empirically study privacy issues related to advanced technology. The goal is to address methodological concerns by drawing upon both theoretical perspectives as well as practical experiences.

  5. The GPT methodology. New fields of application

    International Nuclear Information System (INIS)

    Gandini, A.; Gomit, J.M.; Abramytchev, V.

    1996-01-01

    The GPT (Generalized Perturbation Theory) methodology is described, and a new application is discussed. The results obtained for a simple model (zero dimension, six parameters of interest) show that the expressions obtained using the GPT methodology lead to results close to those obtained through direct calculations. The GPT methodology is useful for radioactive waste disposal problems. The potential of the method linked to the zero-dimension model can be extended to radionuclide migration problems with spatial description. (K.A.)

  6. The Plateau Experience: Maslow's Unfinished Theory

    OpenAIRE

    Buckler, Scott

    2011-01-01

    Abraham Maslow (1908-1970) was a leading psychologist whose hierarchy of needs has resonated throughout various disciplines. The pinnacle of Maslow's hierarchy was self-actualisation, characterised by the peak experience. However, there is a series of definitional, theoretical and methodological issues related to the hierarchy and self-actualisation. Maslow specifically refuted his own theory, instead suggesting that research should be directed towards self-transcendence as characterised by t...

  7. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  8. Unifying Pore Network Modeling, Continuous Time Random Walk Theory and Experiment - Accomplishments and Future Directions

    Science.gov (United States)

    Bijeljic, B.

    2008-05-01

    This talk will describe and highlight the advantages offered by a methodology that unifies pore network modeling, CTRW theory and experiment in the description of solute dispersion in porous media. Solute transport in a porous medium is characterized by the interplay of advection and diffusion (described by the Peclet number, Pe) that causes spreading of solute particles. This spreading is traditionally described by dispersion coefficients, D, defined by σ² = 2Dt, where σ² is the variance of the solute position and t is the time. Using a pore-scale network model based on particle tracking, the rich Peclet-number dependence of the dispersion coefficient is predicted from first principles and is shown to compare well with experimental data for the restricted diffusion, transition, power-law and mechanical dispersion regimes in the asymptotic limit. In the asymptotic limit D is constant and can be used in an averaged advection-dispersion equation. However, it is highly important to recognize that, until the velocity field is fully sampled, the particle transport is non-Gaussian and D possesses temporal or spatial variation. Furthermore, temporal probability density functions (PDFs) of tracer particles are studied in pore networks, and an excellent agreement for the spectrum of transition times for particles from pore to pore is obtained between network model results and CTRW theory. Based on the truncated power-law interpretation of the PDFs, the physical origin of the power-law scaling of dispersion coefficient vs. Peclet number has been explained for unconsolidated porous media, sands and a number of sandstones, arriving at the same conclusion from numerical network modelling, analytic CTRW theory and experiment. Future directions for further applications of the methodology presented are discussed in relation to scale-dependent solute dispersion and reactive transport. Significance of pre-asymptotic dispersion in porous media is addressed from the pore scale upwards and the impact
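    The defining relation σ² = 2Dt can be demonstrated with a toy particle-tracking random walk. This is a generic sketch, far simpler than the pore-network model in the abstract above: uniform advection plus independent Gaussian diffusive jumps, with all parameter values invented for illustration. It recovers an assumed dispersion coefficient from the growth of the particle-position variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles = 50_000
D_true = 1e-3            # assumed dispersion coefficient (arbitrary units)
dt, n_steps = 0.01, 500
v = 1.0                  # uniform advection velocity

# Each step: deterministic advection plus a Gaussian diffusive jump whose
# variance per step is 2 * D * dt (1-D Brownian motion)
x = np.zeros(n_particles)
for _ in range(n_steps):
    x += v * dt + rng.normal(0.0, np.sqrt(2 * D_true * dt), n_particles)

t = n_steps * dt
D_est = x.var() / (2 * t)   # recover D from sigma^2 = 2 D t
```

Because the jumps here are independent and Gaussian, D_est converges to D_true; in a real pore network the velocity field is only gradually sampled, which is exactly why D appears time-dependent in the pre-asymptotic regime discussed above.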

  9. Preliminary experiment of fast neutron imaging with direct-film method

    International Nuclear Information System (INIS)

    Pei Yuyang; Tang Guoyou; Guo Zhiyu; Zhang Guohui

    2005-01-01

    A preliminary experiment was conducted with the direct-film method, using fast neutrons generated by the ⁹Be(d, n) reaction at the Beijing University 4.5 MV Van de Graaff, with neutron energies below 7 MeV. Basic characteristics of the direct-film neutron radiography system were investigated with the help of samples of different materials and thicknesses, containing holes of different diameters. The fast neutron converter, which is vital for fast neutron imaging, was produced with materials made in China. The results indicate that the fast neutron converter can meet the requirements of fast neutron imaging; further research on fast neutron imaging can be conducted on accelerators and neutron generators in China. (authors)

  10. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation-oriented IT projects. The paper starts by depicting the specific characteristics of software development project management: managing software development projects involves techniques and skills that are proprietary to the IT industry, and the software development project manager handles challenges and risks predominantly encountered in business and research areas that involve state-of-the-art technology. Conventional software development stages are defined and briefly described; development stages are the building blocks of any software development methodology, so it is important to research this aspect properly. Current software development methodologies are then presented, with development stages defined for each, and a graphic representation is illustrated for every showcased methodology in order to better individualize its structure. The methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated, and a research direction aimed at formalizing a software development methodology dedicated to innovation-oriented IT projects is enunciated.

  11. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters; numerical models or expert judgement are often needed to obtain them. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn allows the accident probability to be appropriately transferred to a different category scheme.
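    The kind of transfer described can be sketched as follows. All category boundaries, probabilities, and the occurrence relationship below are invented for illustration: the probability assigned to a category in one scheme is distributed over a common defining parameter (here, impact speed) according to an assumed frequency relationship, then re-binned under the other scheme's category boundaries.

```python
import numpy as np

# Scheme A: one severity category covering impact speeds 0-60 mph, probability 0.10
p_category = 0.10
speeds = np.linspace(0.0, 60.0, 601)   # common defining parameter (mph)

# Assumed relationship: accident frequency falls off exponentially with speed
weights = np.exp(-speeds / 20.0)
pdf = weights / weights.sum()          # distribute probability within the category
p_speed = p_category * pdf

# Scheme B splits the same range at 30 mph: re-bin the distributed probability
p_low = p_speed[speeds < 30.0].sum()   # Scheme B category for 0-30 mph
p_high = p_speed[speeds >= 30.0].sum() # Scheme B category for 30-60 mph
```

The total probability is conserved under the transfer, while the split between the new categories reflects the assumed frequency-vs-speed relationship rather than an even division.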

  12. Inelastic Boosted Dark Matter at direct detection experiments

    Science.gov (United States)

    Giudice, Gian F.; Kim, Doojin; Park, Jong-Chul; Shin, Seodong

    2018-05-01

    We explore a novel class of multi-particle dark sectors, called Inelastic Boosted Dark Matter (iBDM). These models are constructed by combining properties of particles that scatter off matter by making transitions to heavier states (Inelastic Dark Matter) with properties of particles that are produced with a large Lorentz boost in annihilation processes in the galactic halo (Boosted Dark Matter). This combination leads to new signals that can be observed at ordinary direct detection experiments, but require unconventional searches for energetic recoil electrons in coincidence with displaced multi-track events. Related experimental strategies can also be used to probe MeV-range boosted dark matter via their interactions with electrons inside the target material.

  13. Self-generated magnetic fields in direct-drive implosion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Igumenshchev, I. V.; Nilson, P. M.; Goncharov, V. N. [Laboratory for Laser Energetics, University of Rochester, 250 East River Road, Rochester, New York 14623 (United States); Zylstra, A. B.; Li, C. K.; Petrasso, R. D. [Plasma Science and Fusion Center, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2014-06-15

    Electric and self-generated magnetic fields in direct-drive implosion experiments on the OMEGA Laser Facility were investigated employing radiography with ∼10- to 60-MeV protons. The experiment used plastic-shell targets with imposed surface defects (glue spots, wires, and mount stalks), which enhance self-generated fields. The fields were measured during the 1-ns laser drive with an on-target intensity ∼10¹⁵ W/cm². Proton radiographs show multiple ring-like structures produced by electric fields ∼10⁷ V/cm and fine structures from surface defects, indicating self-generated fields up to ∼3 MG. These electric and magnetic fields show good agreement with two-dimensional magnetohydrodynamic simulations when the latter include the ∇Tₑ × ∇nₑ source, Nernst convection, and anisotropic resistivity. The simulations predict that self-generated fields affect heat fluxes in the conduction zone and, through this, affect the growth of local perturbations.

  14. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post)positivistic quantitative paradigm. The increasing rec… in qualitative research offers a promising avenue to advance the field in this direction.

  15. Setting priorities for space research: An experiment in methodology

    Science.gov (United States)

    1995-01-01

    In 1989, the Space Studies Board created the Task Group on Priorities in Space Research to determine whether scientists should take a role in recommending priorities for long-term space research initiatives and, if so, to analyze the priority-setting problem in this context and develop a method by which such priorities could be established. After answering the first question in the affirmative in a previous report, the task group set out to accomplish the second task. The basic assumption in developing a priority-setting process is that a reasoned and structured approach for ordering competing initiatives will yield better results than other ways of proceeding. The task group proceeded from the principle that the central criterion for evaluating a research initiative must be its scientific merit -- the value of the initiative to the proposing discipline and to science generally. The group developed a two-stage methodology for priority setting and constructed a procedure and format to support the methodology. The first of two instruments developed was a standard format for structuring proposals for space research initiatives. The second instrument was a formal, semiquantitative appraisal procedure for evaluating competing proposals. This report makes available complete templates for the methodology, including the advocacy statement and evaluation forms, as well as an 11-step schema for a priority-setting process. From the beginning of its work, the task group was mindful that the issue of priority setting increasingly pervades all of federally supported science and that its work would have implications extending beyond space research. Thus, although the present report makes no recommendations for action by NASA or other government agencies, it provides the results of the task group's work for the use of others who may study priority-setting procedures or take up the challenge of implementing them in the future.
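    The report's appraisal forms are not reproduced here, but a formal semiquantitative appraisal of competing proposals typically reduces to a weighted sum of criterion scores. The criteria, weights, and proposals below are hypothetical placeholders, not the task group's actual instrument; only the dominance of scientific merit mirrors the report's central criterion:

```python
# Hypothetical semiquantitative appraisal: each proposal is rated 1-5 on
# each criterion and ranked by the weighted total. Scientific merit carries
# the largest weight, reflecting the task group's central criterion.
WEIGHTS = {"scientific_merit": 0.5, "feasibility": 0.2,
           "cost_realism": 0.15, "breadth_of_impact": 0.15}

def appraise(scores):
    """scores: dict criterion -> 1..5 rating; returns the weighted total."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

# Two made-up competing initiatives.
proposals = {
    "orbiter_A": {"scientific_merit": 5, "feasibility": 3,
                  "cost_realism": 2, "breadth_of_impact": 4},
    "telescope_B": {"scientific_merit": 4, "feasibility": 4,
                    "cost_realism": 4, "breadth_of_impact": 3},
}
ranking = sorted(proposals, key=lambda p: appraise(proposals[p]), reverse=True)
```

In this toy ranking the scientifically stronger but riskier proposal edges out the safer one, illustrating how the weighting encodes the priority-setting policy.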

  16. Can tonne-scale direct detection experiments discover nuclear dark matter?

    Energy Technology Data Exchange (ETDEWEB)

    Butcher, Alistair; Kirk, Russell; Monroe, Jocelyn; West, Stephen M., E-mail: Alistair.Butcher.2010@live.rhul.ac.uk, E-mail: Russell.Kirk.2008@live.rhul.ac.uk, E-mail: Jocelyn.Monroe@rhul.ac.uk, E-mail: Stephen.West@rhul.ac.uk [Department of Physics, Royal Holloway University of London, Egham, Surrey, TW20 0EX (United Kingdom)

    2017-10-01

    Models of nuclear dark matter propose that the dark sector contains large composite states consisting of dark nucleons, in analogy to Standard Model nuclei. We examine the direct detection phenomenology of a particular class of nuclear dark matter model at the current generation of tonne-scale liquid noble experiments, in particular DEAP-3600 and XENON1T. In our chosen nuclear dark matter scenario, distinctive features arise in the recoil energy spectra due to the non-point-like nature of the composite dark matter state. We calculate the number of events required to distinguish these spectra from those of a standard point-like WIMP state with a decaying exponential recoil spectrum. In the most favourable regions of nuclear dark matter parameter space, we find that a few tens of events are needed to distinguish nuclear dark matter from WIMPs at the 3σ level in a single experiment. Given the total exposure time of DEAP-3600 and XENON1T, we find that at best a 2σ distinction is possible by these experiments individually, while 3σ sensitivity is reached for a range of parameters by the combination of the two experiments. We show that future upgrades of these experiments have the potential to distinguish a large range of nuclear dark matter models from that of a WIMP at greater than 3σ.
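    The paper's event counts come from detailed spectral fits, but the scaling can be illustrated with the standard asymptotic relation that a median z-sigma separation between two spectral hypotheses requires roughly N ≈ z²/(2·KL), where KL is the Kullback-Leibler divergence between the normalized spectra. The spectra below are toy stand-ins (a plain exponential versus one with extra high-energy suppression), not the paper's actual form factors:

```python
import math

def kl_divergence(p, q):
    """KL(p||q) for two discrete, normalized spectra on the same binning."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def events_for_significance(p, q, z=3.0):
    """Approximate events for a median z-sigma separation of hypothesis p
    from q via the log-likelihood ratio (Wald asymptotics): N ~ z^2/(2 KL)."""
    return z * z / (2.0 * kl_divergence(p, q))

def normalize(spec):
    s = sum(spec)
    return [x / s for x in spec]

# Toy recoil spectra over 20 energy bins: a plain exponential (point-like
# WIMP) versus an exponential with extra high-energy damping, standing in
# for a composite state's form-factor suppression.
wimp = normalize([math.exp(-0.3 * e) for e in range(20)])
ndm = normalize([math.exp(-0.3 * e - 0.01 * e * e) for e in range(20)])
n_events = events_for_significance(ndm, wimp, z=3.0)
```

The closer the two spectral shapes, the smaller the KL divergence and the larger the exposure needed, which is why combining the two experiments helps at fixed parameters.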

  17. Can tonne-scale direct detection experiments discover nuclear dark matter?

    International Nuclear Information System (INIS)

    Butcher, Alistair; Kirk, Russell; Monroe, Jocelyn; West, Stephen M.

    2017-01-01

    Models of nuclear dark matter propose that the dark sector contains large composite states consisting of dark nucleons, in analogy to Standard Model nuclei. We examine the direct detection phenomenology of a particular class of nuclear dark matter model at the current generation of tonne-scale liquid noble experiments, in particular DEAP-3600 and XENON1T. In our chosen nuclear dark matter scenario, distinctive features arise in the recoil energy spectra due to the non-point-like nature of the composite dark matter state. We calculate the number of events required to distinguish these spectra from those of a standard point-like WIMP state with a decaying exponential recoil spectrum. In the most favourable regions of nuclear dark matter parameter space, we find that a few tens of events are needed to distinguish nuclear dark matter from WIMPs at the 3σ level in a single experiment. Given the total exposure time of DEAP-3600 and XENON1T, we find that at best a 2σ distinction is possible by these experiments individually, while 3σ sensitivity is reached for a range of parameters by the combination of the two experiments. We show that future upgrades of these experiments have the potential to distinguish a large range of nuclear dark matter models from that of a WIMP at greater than 3σ.

  18. Methodology and analysis for effects of energy and angular distributions of secondary neutrons in fusion blankets and application to integral beryllium experiments

    International Nuclear Information System (INIS)

    Song, P.M.

    1990-01-01

    The main objective of the US/JAERI (Japan Atomic Energy Research Institute) collaborative experiment program on Fusion Breeder Neutronics is to estimate the uncertainties involved in predicting the tritium breeding ratio (TBR) in Li₂O. Beryllium has been used as a neutron multiplier in several experiments performed in that program. The shape of the C/E values (calculation/experiment) for the tritium production rate (TPR) from ⁶Li, T₆, observed in these experiments indicates an underestimation in T₆ just behind the Be layer. This feature could be related to the Be cross-sections, especially the secondary energy distribution (SED) and secondary angular distribution (SAD) of neutrons emitted from the ⁹Be(n,2n) reaction. The SED and SAD of the ⁹Be(n,2n) cross-sections are subject to large uncertainties because of the inadequate representation of the energy/angle distribution of the emitted neutrons. To assess the uncertainty in predicting TPR that results from the current uncertainties in the cross-section data and in the SED/SAD of emitted neutrons, an extensive two-dimensional sensitivity/uncertainty analysis was performed with the current FORSS module and with new codes, JULIX, VARIX, and UNGSS, which were specifically developed to incorporate the new methodology of the present work in treating SED/SAD sensitivity analyses. The analyses found that the local standard deviation in T₆ is 2.1 to 9.3% from the integrated cross-section, 5.2 to 11.2% from the SED (direct variation with the ENDF/BLANL), and 0.14 to 1.37% from the SAD, with the largest uncertainties occurring inside the beryllium layer. The uncertainty in T₆ is mainly attributed to the current uncertainties in the ⁹Be(n,elastic), ¹⁶O and ⁷Li data.
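    The quoted standard deviations come from first-order sensitivity/uncertainty propagation: the relative variance of a response such as T₆ is S^T·C·S, the "sandwich rule" combining a sensitivity profile S with a relative covariance matrix C. A minimal sketch with made-up numbers (the sensitivities and covariances below are illustrative, not the evaluated ⁹Be data):

```python
def sandwich_variance(S, C):
    """Relative variance of a response via the sandwich rule S^T * C * S,
    where S holds the relative sensitivities (dR/R per dSigma/Sigma) in
    each energy group and C is the relative covariance matrix of the
    cross-section groups."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

# Illustrative 3-group example: a 10% fully correlated cross-section
# uncertainty propagated through moderate response sensitivities.
S = [0.4, 0.3, 0.1]           # relative sensitivities per group
C = [[0.01, 0.01, 0.01],      # (10%)^2 relative covariance, full correlation
     [0.01, 0.01, 0.01],
     [0.01, 0.01, 0.01]]
rel_std = sandwich_variance(S, C) ** 0.5
```

With full correlation the group sensitivities add coherently (here 0.8 × 10% = 8%), which is why correlated nuclear-data uncertainties dominate this kind of analysis.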

  19. A Methodology for the Assessment of Experiential Learning Lean: The Lean Experience Factory Study

    Science.gov (United States)

    De Zan, Giovanni; De Toni, Alberto Felice; Fornasier, Andrea; Battistella, Cinzia

    2015-01-01

    Purpose: The purpose of this paper is to present a methodology to assess the experiential learning processes of learning lean in an innovative learning environment: the lean model factories. Design/methodology/approach: A literature review on learning and lean management literatures was carried out to design the methodology. Then, a case study…

  20. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    International Nuclear Information System (INIS)

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket tritium processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  1. Recovery of Vanadium from Magnetite Ore Using Direct Acid Leaching: Optimization of Parameters by Plackett-Burman and Response Surface Methodologies

    Science.gov (United States)

    Nejad, Davood Ghoddocy; Khanchi, Ali Reza; Taghizadeh, Majid

    2018-06-01

    Recovery of vanadium from magnetite ore by direct acid leaching is discussed. The proposed process, which employs a mixture of nitric and sulfuric acids, avoids pyrometallurgical treatments, since such treatments consume a high amount of energy. To determine the optimum conditions for vanadium recovery, the leaching process is optimized through a Plackett-Burman (P-B) design and response surface methodology (RSM). In this respect, temperature (80-95°C), liquid-to-solid ratio (L/S) (3-10 mL g⁻¹), sulfuric acid concentration (3-6 M), nitric acid concentration (5-10 vol.%) and time (4-8 h) are considered as the independent variables. According to the P-B approach, temperature and the acid concentrations are the most effective parameters in the leaching process. These parameters are optimized using RSM to maximize the recovery of vanadium by direct acid leaching. In this way, 86.7% of the vanadium can be extracted from the magnetite ore.
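    A Plackett-Burman screening design assigns each factor a high/low level per run from cyclic shifts of a generator row plus a final all-minus run. A minimal sketch of the standard 12-run design (which accommodates up to 11 factors; the five leaching variables of this study would occupy five of its columns):

```python
# Standard generator row for the 12-run Plackett-Burman design.
GEN12 = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    """12-run Plackett-Burman design matrix: 11 cyclic shifts of the
    generator row followed by an all-minus run. Columns are balanced
    (six +1, six -1) and mutually orthogonal."""
    rows = [GEN12[-i:] + GEN12[:-i] for i in range(11)]  # cyclic shifts
    rows.append([-1] * 11)                               # all-minus run
    return rows

design = plackett_burman_12()
```

Orthogonal, balanced columns are what let a handful of runs rank the main effects (here, temperature and acid concentrations) before the RSM stage refines the optimum.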

  2. Direct 13C-detected NMR experiments for mapping and characterization of hydrogen bonds in RNA

    International Nuclear Information System (INIS)

    Fürtig, Boris; Schnieders, Robbin; Richter, Christian; Zetzsche, Heidi; Keyhani, Sara; Helmling, Christina; Kovacs, Helena; Schwalbe, Harald

    2016-01-01

    In RNA secondary structure determination, it is essential to determine whether or not a nucleotide is base-paired. Base-pairing of nucleotides is mediated by hydrogen bonds. The NMR characterization of hydrogen bonds relies on experiments correlating the NMR resonances of exchangeable protons and can be best performed for structured parts of the RNA, where labile hydrogen atoms are protected from solvent exchange. Functionally important regions in RNA, however, frequently reveal increased dynamic disorder, which often leads to NMR signals of exchangeable protons that are broadened beyond ¹H detection. Here, we develop ¹³C direct-detected experiments to observe all nucleotides in RNA irrespective of whether they are involved in hydrogen bonds or not. Exploiting the self-decoupling of scalar couplings due to the exchange process, the hydrogen bonding behavior of the hydrogen bond donor of each individual nucleotide can be determined. Furthermore, the adaptation of HNN-COSY experiments for ¹³C direct detection allows correlations of donor-acceptor pairs and the localization of hydrogen-bond acceptor nucleotides. The proposed ¹³C direct-detected experiments therefore provide information about molecular sites not amenable to conventional proton-detected methods. Such information makes RNA secondary structure determination by NMR more accurate and helps to validate secondary structure predictions based on bioinformatics.

  3. Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.

    Science.gov (United States)

    Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C

    2013-04-01

    Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Systematic Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and the American Dental Association's Evidence-Based Dentistry website) were searched. In total, 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing was performed. Most SRs in this sample originated from the European continent, followed by North America. Most SRs were conducted by two to five authors, the majority of whom were affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics the primary topics of interest. According to AMSTAR, most quality aspects were adequately fulfilled by less than half of the reviews; publication bias assessment and grey literature searches were the most poorly adhered-to components. Overall, the methodological quality of the prosthodontic-related SRs was deemed limited. Future recommendations include that authors have prior training in conducting SRs and that journals adopt a universal checklist addressing all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.

  4. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    Science.gov (United States)

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, between 10 AM and 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during the 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress were inversely correlated with concurrent parasympathetic activity (ρ = −.27, P …), supporting the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study in high time resolution the concurrent, "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
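    The temporal-linking step amounts to pairing each experience-sampling prompt with the Holter epochs falling in the 5-minute windows before and after it. A minimal sketch of that pairing; the data structures and field layout are hypothetical, only the window length follows the abstract:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

def link_samples(esm_times, holter_epochs):
    """Pair each experience-sampling prompt with the Holter epochs in the
    5-minute windows before and after it (clocks assumed synchronized).
    esm_times: list of datetimes; holter_epochs: list of (datetime, value).
    Returns {prompt_time: (epochs_before, epochs_after)}."""
    linked = {}
    for t in esm_times:
        before = [e for e in holter_epochs if t - WINDOW <= e[0] < t]
        after = [e for e in holter_epochs if t <= e[0] < t + WINDOW]
        linked[t] = (before, after)
    return linked

# Toy data: one prompt at 10:30, Holter epochs once per minute around it.
prompt = datetime(2010, 1, 1, 10, 30)
epochs = [(datetime(2010, 1, 1, 10, 25) + timedelta(minutes=m), 0.5)
          for m in range(10)]
linked = link_samples([prompt], epochs)
```

The windowed epochs on either side of each prompt are what the power spectral analysis would then summarize into the concurrent arousal measure.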

  5. Hanford Site baseline risk assessment methodology

    International Nuclear Information System (INIS)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order, referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

  6. Least squares methodology applied to LWR-PV damage dosimetry, experience and expectations

    International Nuclear Information System (INIS)

    Wagschal, J.J.; Broadhead, B.L.; Maerker, R.E.

    1979-01-01

    The development of an advanced methodology for light water reactor (LWR) pressure vessel (PV) damage dosimetry applications is the subject of an ongoing EPRI-sponsored research project at ORNL. This methodology includes a generalized least squares approach to the combination of data. The data include measured foil activations, evaluated cross sections and calculated fluxes. The uncertainties associated with the data, as well as with the calculational methods, are an essential component of this methodology. Activation measurements in two NBS benchmark neutron fields (²⁵²Cf and ISNF) and in a prototypic reactor field (Oak Ridge Pool Critical Assembly - PCA) are being analyzed using a generalized least squares method. The sensitivity of the results to the representation of the uncertainties (covariances) was carefully checked. Cross-element covariances were found to be of utmost importance.
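    A generalized least squares combination adjusts prior (calculated) quantities toward a measurement, weighting each by its covariance; the posterior covariance shrinks accordingly. A minimal sketch for one measurement responding linearly to two parameters; all numbers are illustrative, not the project's dosimetry data:

```python
def gls_update(x, C, a, m, v):
    """Generalized least squares adjustment of prior values x (list) with
    prior covariance C (matrix) against one measurement m of variance v,
    where the measurement responds as m ~ sum(a_i * x_i).
    Returns the adjusted parameters and the reduced covariance."""
    n = len(x)
    Ca = [sum(C[i][j] * a[j] for j in range(n)) for i in range(n)]  # C a
    s = sum(a[i] * Ca[i] for i in range(n)) + v                     # a C a + v
    resid = m - sum(a[i] * x[i] for i in range(n))
    x_new = [x[i] + Ca[i] * resid / s for i in range(n)]
    C_new = [[C[i][j] - Ca[i] * Ca[j] / s for j in range(n)] for i in range(n)]
    return x_new, C_new

# Illustrative: two calculated quantities with independent prior
# uncertainties, pulled toward a foil-activation-style measurement
# of their sum.
x_adj, C_adj = gls_update(x=[1.0, 2.0],
                          C=[[0.01, 0.0], [0.0, 0.04]],
                          a=[1.0, 1.0], m=3.3, v=0.01)
```

The less certain parameter absorbs more of the residual, which is exactly why faithful covariances (including cross-element terms) matter so much in the adjustment.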

  7. Optimization of poorly compactable drug tablets manufactured by direct compression using the mixture experimental design.

    Science.gov (United States)

    Martinello, Tiago; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Taqueda, Maria Elena Santos; Consiglieri, Vladi O

    2006-09-28

    The poor flowability and bad compressibility characteristics of paracetamol are well known. As a result, paracetamol tablets are produced almost exclusively by wet granulation, a disadvantageous method when compared to direct compression. The development of a new tablet formulation is still based on a large number of experiments and often relies merely on the experience of the analyst. The purpose of this study was to apply design of experiments (DOE) methodology to the development and optimization of tablet formulations containing high amounts of paracetamol (more than 70%) and manufactured by direct compression. Nineteen formulations, screened by DOE methodology, were produced with different proportions of Microcel 102, Kollidon VA 64, Flowlac, Kollidon CL 30, PEG 4000, Aerosil, and magnesium stearate. Tablet properties, except friability, were in accordance with the USP 28th ed. requirements. These results were used to generate plots for optimization, mainly for friability. The physical-chemical data for the optimized formulation were very close to those predicted by the regression analysis, demonstrating that mixture design is a valuable tool for the research and development of new formulations.
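    In a mixture design the excipient proportions are the factors and must sum to one, so candidate formulations are conventionally drawn from a simplex lattice. A minimal sketch generating a {q, m} simplex-lattice for three hypothetical components (the study itself used seven excipients with constrained ranges, which this sketch does not reproduce):

```python
from itertools import product

def simplex_lattice(q, m):
    """All q-component mixtures whose proportions are multiples of 1/m and
    sum to 1 -- the {q, m} simplex-lattice design points."""
    points = []
    for combo in product(range(m + 1), repeat=q):
        if sum(combo) == m:
            points.append(tuple(c / m for c in combo))
    return points

# {3, 2} lattice: the three pure components plus all 50:50 binary blends.
design = simplex_lattice(3, 2)
```

Each lattice point is one trial formulation; fitting tablet responses such as friability over these points is what produces the optimization plots described above.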

  8. The Application Strategy of Iterative Solution Methodology to Matrix Equations in Hydraulic Solver Package, SPACE

    International Nuclear Information System (INIS)

    Na, Y. W.; Park, C. E.; Lee, S. Y.

    2009-01-01

    As a part of the Ministry of Knowledge Economy (MKE) project, 'Development of safety analysis codes for nuclear power plants', KOPEC has been developing the hydraulic solver code package applicable to the safety analyses of nuclear power plants (NPPs). The matrices of the hydraulic solver are usually sparse and may be asymmetric. In the earlier stage of this project, the typical direct matrix solver packages MA48 and MA28 were tested as the matrix solver for the hydraulic solver code, SPACE. The selection was based on the reasonably reliable performance experience with their former version, MA18, in the RELAP computer code. In the later stage of this project, iterative methodologies have been tested in the SPACE code. Among the few candidate iterative solution methodologies tested so far, the biconjugate gradient stabilized method (BiCGSTAB) has shown the best performance in the applicability test and in the application to the SPACE code. Regardless of all the merits of using the direct solver packages, there are some other aspects to tackling the iterative solution methodologies. The algorithm is much simpler and easier to handle. The potential problems related to the robustness of the iterative solution methodologies have been resolved by applying pre-conditioning methods adjusted and modified as appropriate to the application in the SPACE code. The application strategy of the conjugate gradient method was introduced in detail by Shewchuk, Golub and Saad in the mid-1990s. The application of this methodology to nuclear engineering in Korea started about the same time, is still going on, and there are quite a few examples of application to neutronics. Besides, Yang introduced a conjugate gradient method programmed in the C++ language. The purpose of this study is to assess the performance and behavior of the iterative solution methodology compared to those of the direct solution methodology, which is still preferred due to its robustness and reliability.
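    BiCGSTAB handles exactly the asymmetric systems a direct package like MA48 would otherwise factorize. A minimal unpreconditioned sketch on a small dense system for illustration (SPACE itself works on large sparse matrices with the preconditioning described above):

```python
def bicgstab(A, b, tol=1e-10, max_iter=100):
    """Unpreconditioned BiCGSTAB for Ax = b, with a possibly asymmetric
    dense matrix A given as a list of rows. Returns the solution vector."""
    n = len(b)
    mv = lambda M, u: [sum(M[i][j] * u[j] for j in range(n)) for i in range(n)]
    dot = lambda u, w: sum(ui * wi for ui, wi in zip(u, w))
    x = [0.0] * n
    r = [bi - axi for bi, axi in zip(b, mv(A, x))]
    r_hat = r[:]                      # fixed shadow residual
    rho = alpha = omega = 1.0
    v = p = [0.0] * n
    for _ in range(max_iter):
        rho_new = dot(r_hat, r)
        beta = (rho_new / rho) * (alpha / omega)
        p = [ri + beta * (pi - omega * vi) for ri, pi, vi in zip(r, p, v)]
        v = mv(A, p)
        alpha = rho_new / dot(r_hat, v)
        s = [ri - alpha * vi for ri, vi in zip(r, v)]
        if dot(s, s) ** 0.5 < tol:    # early exit on the half-step residual
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            break
        t = mv(A, s)
        omega = dot(t, s) / dot(t, t)
        x = [xi + alpha * pi + omega * si for xi, pi, si in zip(x, p, s)]
        r = [si - omega * ti for si, ti in zip(s, t)]
        if dot(r, r) ** 0.5 < tol:
            break
        rho = rho_new
    return x

# Small asymmetric, diagonally dominant test system with solution (1, 1, 1).
A = [[4.0, 1.0, 0.0], [2.0, 5.0, 1.0], [0.0, 1.0, 3.0]]
x = bicgstab(A, b=[5.0, 8.0, 4.0])
```

Unlike plain conjugate gradients, BiCGSTAB needs no symmetry, and the stabilization step (the ω update) smooths the erratic convergence of plain BiCG, which is why it suits the hydraulic solver's matrices.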

  9. Steps towards the international regulatory acceptance of non-animal methodology in safety assessment.

    Science.gov (United States)

    Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie

    2017-10-01

    The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible, but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed, rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat-dose toxicology, although setting standards there will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal-based methodology by promoting "safe-haven" trials, where traditional and new methodology data can be submitted in parallel to build up experience with the new methods. Industry can play its part in the acceptance of new methodology by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Assessing the Impact of Direct Experience on Individual Preferences and Attitudes for Electric Vehicles

    DEFF Research Database (Denmark)

    Jensen, Anders Fjendbo

    Over the last decades, several studies have focused on understanding what drives the demand for electric vehicles (EVs). However, EVs still face large difficulties in developing into a mass market product. It is now recognised that individuals make choices based on a mixture of strategies...... elasticity and the diffusion of the EV into the car market. In particular the thesis (1) proposes a methodology to collect adequate data on choices before and after respondents obtain real-life experience with EVs; (2) uses advanced hybrid choice models estimated jointly on the before and the after data......, and (iv) a number of statements to measure the attitudes of environmental concern, appreciation of car features, interest in technology, general opinions towards EVs and scepticism. The same survey was then repeated in wave 2. First, a SC experiment was built with orthogonal design and tested...

  11. Assessing the Impact of Direct Experience on Individual Preferences and Attitudes for Electric Vehicles

    DEFF Research Database (Denmark)

    Jensen, Anders Fjendbo

    Over the last decades, several studies have focused on understanding what drives the demand for electric vehicles (EVs). However, EVs still face large difficulties in developing into a mass market product. It is now recognised that individuals make choices based on a mixture of strategies...... elasticity and the diffusion of the EV into the car market. In particular the thesis (1) proposes a methodology to collect adequate data on choices before and after respondents obtain experience with EVs; (2) uses advanced hybrid choice models estimated jointly on the before and the after data to model...... of statements to measure the attitudes of environmental concern, appreciation of car features, interest in technology, general opinions towards EVs and scepticism. The same survey was then repeated in wave 2. First, a SC experiment was built with orthogonal design and tested with a sample of 369 individuals...

  12. Measurement of direct CP violation in the NA48 experiment; Mise en evidence de la violation directe de CP par l'experience NA48

    Energy Technology Data Exchange (ETDEWEB)

    Formica, A

    2001-10-01

    The first two chapters of this thesis are dedicated to the theoretical and experimental aspects of CP violation. NA48 is a third-generation experiment like KTeV; it was designed to collect data on the simultaneous detection of the four decay modes K_L,S → ππ and to provide a measurement of the parameter of direct CP violation with an uncertainty nearing 2×10⁻⁴. The third chapter describes the experimental equipment of NA48 at CERN: the production of the K_L and K_S beams, the tagging system, the detection system for K → π⁺π⁻, the detection system for K → π⁰π⁰, the data acquisition system, and the trigger system. Chapter 4 is dedicated to the selection and identification of events. Chapter 5 deals with specific problems concerning the detection of π⁺π⁻, namely: the dead time in the triggering system, the overflow of the chamber readout system and the inefficiency of the drift chambers. Chapter 6 lists the different corrections and systematic errors concerning the double ratio R, and gives the following result: Re(ε′/ε) = (14.4 ± 2.6)×10⁻⁴, which by itself constitutes, for the first time, evidence of direct CP violation. (A.C.)

  13. Laser induced plasma methodology for ignition control in direct injection sprays

    International Nuclear Information System (INIS)

    Pastor, José V.; García-Oliver, José M.; García, Antonio; Pinotti, Mattia

    2016-01-01

    Highlights: • A laser-induced plasma ignition system is designed and applied to a Diesel spray. • A method for quantification of the system effectiveness and reliability is proposed. • The ignition system is optimized in atmospheric and engine-like conditions. • Higher system effectiveness is reached at higher ambient density. • The system is able to stabilize Diesel combustion compared to auto-ignition cases. - Abstract: New combustion modes for internal combustion engines represent one of the main fields of investigation for emissions control in the transportation industry. However, the implementation of lean fuel-mixture conditions and low-temperature combustion in real engines is limited by different unsolved practical issues. To achieve appropriate combustion phasing and cycle-to-cycle control of the process, the laser plasma ignition system arises as a valid alternative to the traditional electrical spark ignition system. This paper proposes a methodology to set up and optimize a laser-induced plasma ignition system that ensures reliability through the quantification of the system effectiveness in plasma generation and positional stability, in order to reach optimal ignition performance. For this purpose, experimental tests have been carried out in an optical test rig. First, the system was optimized in an atmospheric environment, based on statistical analysis of the plasma records taken with a high-speed camera to evaluate the induction effectiveness and consequently regulate and control the system settings. The same optimization method was then applied under engine-like conditions, analyzing the effect of thermodynamic ambient conditions on the plasma induction success and repeatability, which have been shown to depend mainly on ambient density. Once optimized for selected engine conditions, the laser plasma induction system has been used to ignite a direct injection Diesel spray, and to compare the evolution of combustion
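    The effectiveness and positional-stability figures described above reduce to simple shot statistics over the high-speed plasma records. A sketch with hypothetical shot data (a success flag plus a plasma centroid per pulse; the field layout is illustrative, not the paper's format):

```python
import statistics

def quantify(shots):
    """shots: list of (success, x_mm, y_mm) per laser pulse, where the
    centroid is taken from the high-speed plasma image (None if no plasma
    was induced). Returns the induction effectiveness (success fraction)
    and the centroid jitter (population std dev per axis)."""
    hits = [(sx, sy) for ok, sx, sy in shots if ok]
    effectiveness = len(hits) / len(shots)
    jitter_x = statistics.pstdev(sx for sx, _ in hits)
    jitter_y = statistics.pstdev(sy for _, sy in hits)
    return effectiveness, (jitter_x, jitter_y)

# Hypothetical records: 4 of 5 pulses induce a plasma near (10.0, 0.0) mm.
shots = [(True, 10.0, 0.1), (True, 10.2, -0.1), (False, None, None),
         (True, 9.8, 0.0), (True, 10.0, 0.0)]
eff, (jx, jy) = quantify(shots)
```

Tracking these two numbers while sweeping the system settings (and later the ambient density) is the optimization loop the methodology describes.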

  14. Methodological developments and materials in salt-rock preparation for irradiation experiments

    International Nuclear Information System (INIS)

    Garcia Celma, A.; Van Wees, H.; Miralles, L.

    1991-01-01

    For the first time, synthetic salt-rock samples have been produced. Production and preparation of these samples, and of other types of rock-salt for experiments and observation, require much specialized handling. We applied technical knowledge already developed by the HPT Laboratory of the Geology Department of the Rijksuniversiteit Utrecht (high pressure techniques, salt-rock preparation), and by the workshops of ECN, Petten, and FDO, Amsterdam (mechanical precision). Procedures have been applied and/or modified to solve specific problems, many of which were never reported before. Moreover, new techniques have been developed. Rock-salt samples have been machined, sawn, ground, glued, etc., with a maximum of precision, a minimum of damage, and in dry conditions (without water). Etching, peeling and thin-section production have been carried out on irradiated and unirradiated samples. Valves, end pieces, jackets, etc. have been tested and/or produced. These operations were directed at producing samples for the HAW experiment. Their development required not only knowledge, but also considerable trial, failure and time. To avoid repetition of this effort, the procedures, materials, instruments and their characteristics are described in detail in this report

  15. New build methodology approach by Iberdrola Ingenieria y Construccion

    International Nuclear Information System (INIS)

    Zornoza Garcia-Andrade, Javier; Ignacio Diaz Prade, Jose; Martinez Gozalo, Ignacio; Merino Teillet, Alejandro

    2014-01-01

    Iberdrola Ingenieria y Construccion (hereinafter IEC) has developed a significant number of projects in several nuclear plants worldwide. Some of these plants are in operation, while others are under construction. Nowadays, IEC is actively participating in the design and construction of some safety and non-safety class systems of Flamanville 3 for EDF, and has actively participated in the development of one of the Nuclear Island (NI) proposals for OL4 (Finland), covering, for example, the complete NI supply chain and the Licence Feasibility Study stages. Also, in the UK, IEC has developed site characterization, licensing, heat sink analysis and supply chain development activities for NUGEN, and is performing support engineering activities for HGNE for Wylfa. In addition, IEC performed the project for the refurbishment of the Laguna Verde 1 and 2 turbine islands in Mexico, with a budget above 600 M USD. Nowadays, the successful development of a new nuclear plant project in Europe, with stringent regulatory requirements for licensing and permitting that directly impact the development of design, supply, construction and commissioning, requires successful integration of these processes by the Regulatory Authority, Customer, Main Supplier and its entire supply chain. The smooth integration of design, supply chain, construction and commissioning across the entire project development will permit the application of advanced construction and engineering methodologies, such as 'Open Top' construction or modular design, while minimizing regulatory delays. This integration represents one of the main challenges in minimizing nuclear project risk in order to successfully develop a new plant with a construction period of less than 5 years. The present paper provides a short description of the proposed methodology and IEC's experiences, giving several examples taken from actual experience in the design, supply and installation of systems in Flamanville 3 and from Olkiluoto 4

  16. Displaying results of direct detection dark matter experiments free of astrophysical uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Ludwig [Max Planck Institut fuer Kernphysik, Heidelberg (Germany); Collaboration: Collaboration XENON 100

    2015-07-01

    A number of experiments try to measure WIMP interactions using different detector technologies and target elements; hence, energy thresholds and sensitivities to light or heavy WIMP masses differ. Moreover, due to large systematic uncertainties in the parameters defining the dark matter halo, comparing detectors is demanding. By mapping experimental results from the traditional cross section vs. dark matter mass parameter space into a dark matter halo independent phase space, direct comparisons between experiments can be made. This is possible due to the monotonicity of the velocity integral, which makes it possible to combine all astrophysical assumptions into one parameter common to all experiments. In this talk, the motivation as well as the mapping method are explained based on the XENON100 data.
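The kinematics behind this mapping can be sketched briefly (an illustration of the standard halo-independent argument, not code from the talk): for elastic WIMP-nucleus scattering, a recoil energy E_R on a target of mass m_N probes only WIMP speeds above v_min = sqrt(m_N E_R / 2) / μ, with μ the WIMP-nucleus reduced mass, so results from different targets and thresholds can be placed on a common v_min axis where all astrophysics enters through one monotonically decreasing velocity integral.

```python
# Hedged sketch: compute v_min for a given recoil energy, target mass
# number and WIMP mass. Constants and example values are illustrative.
import math

AMU_GEV = 0.9315     # atomic mass unit in GeV/c^2
C_KMS = 2.998e5      # speed of light in km/s

def v_min(e_r_kev, a_target, m_chi_gev):
    """Minimum WIMP speed (km/s) needed to produce a recoil of e_r_kev keV."""
    m_n = a_target * AMU_GEV                     # target nucleus mass, GeV
    mu = m_chi_gev * m_n / (m_chi_gev + m_n)     # reduced mass, GeV
    e_r_gev = e_r_kev * 1e-6
    return math.sqrt(m_n * e_r_gev / 2.0) / mu * C_KMS

# For a 10 GeV WIMP, the same 3 keV recoil threshold probes different
# parts of v_min space on xenon (A=131) and on tungsten (A=184).
v_xe = v_min(3.0, 131, 10.0)
v_w = v_min(3.0, 184, 10.0)
```

Because the velocity integral is monotonically decreasing in v_min, an upper limit at some v_min constrains the integral at all larger v_min, which is what allows detector-independent comparisons.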

  17. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate the potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When implemented as a spreadsheet program, the methodology readily supports analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs.
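A screening calculation of this spreadsheet-friendly kind can be sketched as follows. The structure (concentration × exposure duration × occupancy × risk coefficient) is generic; the coefficient values below are rough placeholder figures for demonstration only, not the paper's calibrated parameters.

```python
# Hedged sketch of a radon screening-risk estimate. All default
# coefficients are illustrative placeholders, not the paper's values.
def excess_lifetime_risk(radon_pci_l, occupancy=0.7, years=70,
                         wlm_per_pci_l_year=0.25, risk_per_wlm=3.5e-4):
    """Return an estimated excess lifetime lung cancer risk (probability).

    radon_pci_l        -- indoor radon concentration in pCi/L
    occupancy          -- fraction of time spent indoors
    years              -- exposure duration
    wlm_per_pci_l_year -- working level months per pCi/L-year (assumed)
    risk_per_wlm       -- risk coefficient per WLM (assumed)
    """
    # Cumulative exposure in working level months (WLM)
    wlm = radon_pci_l * wlm_per_pci_l_year * occupancy * years
    return wlm * risk_per_wlm

# Sensitivity study, as the spreadsheet form makes easy: vary occupancy
risk_base = excess_lifetime_risk(4.0)            # at the 4 pCi/L action level
risk_low_occ = excess_lifetime_risk(4.0, occupancy=0.5)
```

Varying one input at a time, as in the last two lines, reproduces the kind of sensitivity study options the paper describes.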

  18. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    the design and licensing of evolving nuclear technology. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification - steps similar to the components of the traditional US Nuclear Regulatory Commission (NRC) licensing approach, with the exception of the calibration step. An enhanced calibration concept is introduced here, accomplished through data assimilation. The goal of this methodology is to enable best-estimate prediction of system behavior in both normal and safety-related environments. This goal requires the additional steps of estimating the domain of validation and quantifying uncertainties, allowing results to be extended to areas of the validation domain that are not directly tested with experiments. These might include extending the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. The proposed methodology is at a conceptual level. Upon maturity, and if considered favorably by the stakeholders, it could serve as a new framework for the next generation of the best estimate plus uncertainty (BEPU) licensing methodology that the NRC has developed. In order to achieve maturity, the methodology must be communicated to scientific, design, and regulatory stakeholders for discussion and debate. This paper is the first step in establishing that communication.

  19. CRITICISM AND SUPPORT TO CORPORATE SOCIAL RESPONSIBILITY: AN ETHNOGRAPHIC APPROACH BASED ON THE WORKERS’ EXPERIENCE AND A QUALITATIVE METHODOLOGY PROPOSAL

    Directory of Open Access Journals (Sweden)

    JUAN ANTONIO NAVARRO PRADOS

    2007-01-01

    This paper presents the partial results and process of an investigation analyzing the experience of a group of employees, namely their experience of the implementation of the corporate social responsibility policy of a medium-sized service company. For the case study, participant observation, analysis of corporate documents and in-depth interviews with 64 employees across all organisational levels were employed. AtlasTi software was used to analyse and feed back the information received. This analysis produced a matrix of 161 content codes, further analysed by means of network analysis methodology. Eventually, the content network data were compared to the corporate sociogram. The investigation has been carried out during the last three years.

  20. Influence of communal and private folklore on bringing meaning to the experience of persistent pain.

    Science.gov (United States)

    Hendricks, Joyce Marie

    2015-11-01

    To provide an overview of the relevance and strengths of using the literary folkloristic methodology to explore the ways in which people with persistent pain relate to and make sense of their experiences through narrative accounts. Storytelling is a conversation with a purpose. The reciprocal bond between researcher and storyteller enables the examination of the meaning of experiences. Life narratives, in the context of wider traditional and communal folklore, can be analysed to discover how people make sense of their circumstances. This paper draws from the experience of the author, who has previously used this narrative approach. It is a reflection of how the approach may be used to understand those experiencing persistent pain without a consensus diagnosis. Using an integrative method, peer-reviewed research and discussion papers published between January 1990 and December 2014 and listed in the CINAHL, Science Direct, PsycINFO and Google Scholar databases were reviewed. In addition, texts that addressed research methodologies such as literary folkloristic methodology and Marxist literary theory were used. The unique role that nurses play in managing pain is couched in the historical and cultural context of nursing. Literary folkloristic methodology offers an opportunity to gain a better understanding and appreciation of how the experience of pain is constructed and to connect with sufferers. Literary folkloristic methodology reveals that those with persistent pain are often rendered powerless to live their lives. Increasing awareness of how this experience is constructed and maintained also allows an understanding of societal influences on nursing practice. Nurse researchers try to understand experiences in light of specific situations. Literary folkloristic methodology can enable them to understand the inter-relationship between people in persistent pain and how they construct their experiences.

  1. Training simulators in nuclear power plants: Experience, programme design and assessment methodology. Proceedings of a specialists' meeting

    International Nuclear Information System (INIS)

    1997-11-01

    Simulators have become an indispensable part of training worldwide; therefore, international exchange of information is important to share the experience gained in different countries in order to assure high international standards. A second aspect is the tremendous evolution in the computing capacity of simulator hardware and the increasing functionality of simulator software. This background led the IAEA to invite simulator experts for an exchange of experience. The German Simulator Centre in Essen, which is operated by the companies KSG and GfS, was asked to host this Specialists' Meeting. The Specialists' Meeting on ''Training Simulators in Nuclear Power Plants: Experience, Programme Design and Assessment Methodology'' was organized by the IAEA in cooperation with the German Simulator Centre operated by KSG Kraftwerks-Simulator-Gesellschaft mbH and GfS Gesellschaft fuer Simulatorschulung mbH, and was held from 17-19 November 1997 in Essen, Germany. The meeting focused on developments in simulation technology, experiences with simulator upgrades, utilization of computerized tools to support and complement simulator training, and use of simulators for other purposes. The meeting was attended by 50 participants from 16 countries. In the course of four sessions, 21 technical presentations were made. The present volume contains the papers presented by national delegates at the Specialists' Meeting

  2. Training simulators in nuclear power plants: Experience, programme design and assessment methodology. Proceedings of a specialists' meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-11-01

    Simulators have become an indispensable part of training worldwide; therefore, international exchange of information is important to share the experience gained in different countries in order to assure high international standards. A second aspect is the tremendous evolution in the computing capacity of simulator hardware and the increasing functionality of simulator software. This background led the IAEA to invite simulator experts for an exchange of experience. The German Simulator Centre in Essen, which is operated by the companies KSG and GfS, was asked to host this Specialists' Meeting. The Specialists' Meeting on ''Training Simulators in Nuclear Power Plants: Experience, Programme Design and Assessment Methodology'' was organized by the IAEA in cooperation with the German Simulator Centre operated by KSG Kraftwerks-Simulator-Gesellschaft mbH and GfS Gesellschaft fuer Simulatorschulung mbH, and was held from 17-19 November 1997 in Essen, Germany. The meeting focused on developments in simulation technology, experiences with simulator upgrades, utilization of computerized tools to support and complement simulator training, and use of simulators for other purposes. The meeting was attended by 50 participants from 16 countries. In the course of four sessions, 21 technical presentations were made. The present volume contains the papers presented by national delegates at the Specialists' Meeting Refs, figs, tabs

  3. Topflow-experiments on direct condensation and bubble entrainment. Technical report

    International Nuclear Information System (INIS)

    Seidel, Tobias; Lucas, Dirk; Beyer, Matthias

    2016-01-01

    Direct contact condensation between steam and water, as well as bubble entrainment below the water surface, plays an important role in different accident scenarios for light water reactors. One example is emergency core cooling water injection into a two-phase mixture, which has to be considered, for example, to evaluate potential pressurized thermal shock phenomena. This report documents experiments conducted in a flat basin inside the TOPFLOW pressure chamber, aimed at generating a database useful for CFD model development and validation. It comprises three different setups: condensation at a stratified flow of sub-cooled water, condensation at a sub-cooled water jet, and a combination of both phenomena with steam bubble entrainment. The documentation includes all details on the experimental setup, the experimental conditions (experimental matrices), the conduct of the experiments, the measuring techniques used and the data evaluation procedures. In addition, selected results are presented.

  4. Direct dark matter search with the CRESST-III experiment - status and perspectives

    Science.gov (United States)

    Willers, M.; Angloher, G.; Bento, A.; Bucci, C.; Canonica, L.; Defay, X.; Erb, A.; Feilitzsch, F. v.; Ferreiro Iachellini, N.; Gütlein, A.; Gorla, P.; Hauff, D.; Jochum, J.; Kiefer, M.; Kluck, H.; Kraus, H.; Lanfranchi, J.-C.; Loebell, J.; Mancuso, M.; Münster, A.; Pagliarone, C.; Petricca, F.; Potzel, W.; Pröbst, F.; Puig, R.; Reindl, F.; Schäffner, K.; Schieck, J.; Schönert, S.; Seidel, W.; Stahlberg, M.; Stodolsky, L.; Strandhagen, C.; Strauss, R.; Tanzke, A.; Trinh Thi, H. H.; Türkoǧlu, C.; Uffinger, M.; Ulrich, A.; Usherov, I.; Wawoczny, S.; Wüstrich, M.; Zöller, A.

    2017-09-01

    The CRESST-III experiment, located in the Gran Sasso underground laboratory (LNGS, Italy), aims at the direct detection of dark matter (DM) particles. Scintillating CaWO4 crystals operated as cryogenic detectors are used as target material for DM-nucleus scattering. The simultaneous measurement of the phonon signal from the CaWO4 crystal and of the emitted scintillation light in a separate cryogenic light detector is used to discriminate backgrounds from a possible dark matter signal. The experiment aims to significantly improve the sensitivity for low-mass (≲ 5-10 GeV/c2) DM particles by using optimized detector modules with a nuclear recoil-energy threshold ≲ 100 eV. The current status of the experiment as well as projections of the sensitivity for spin-independent DM-nucleon scattering will be presented.

  5. Involving Freight Transport Actors in Production of Knowledge - Experience with Future Workshop Methodology

    DEFF Research Database (Denmark)

    Jespersen, Per Homann; Drewes, Lise

    2005-01-01

    the experience and knowledge of actors in the freight transport sector are included directly in a scientific process in order to develop future and strategic studies. Future research is often produced as desktop research and presented as the results of scientists’ forecasting and scenario building...... in the format of a future workshop included freight transport stakeholders in the research process in order to produce knowledge meeting scientific quality criteria and at the same time in a form suitable for improving the problem solving capabilities of the participants....

  6. DHCVIM - a direct heating containment vessel interactions module: applications to Sandia National Laboratories Surtsey experiments

    International Nuclear Information System (INIS)

    Ginsberg, T.; Tutu, N.K.

    1987-01-01

    Direct containment heating is the mechanism of severe nuclear reactor accident containment loading that results from transfer of thermal and chemical energy from high-temperature, finely divided, molten core material to the containment atmosphere. The direct heating containment vessel interactions module (DHCVIM) has been developed at Brookhaven National Laboratory to model the mechanisms of containment loading resulting from the direct heating accident sequence. The calculational procedure is being used at present to model the Sandia National Laboratories one-tenth-scale Surtsey direct containment heating experiments. The objective of the code is to provide a test bed for detailed modeling of various aspects of the thermal, chemical, and hydrodynamic interactions that are expected to occur in three regions of a containment building: reactor cavity, intermediate subcompartments, and containment dome. Major emphasis is placed on the description of reactor cavity dynamics. This paper summarizes the modeling principles that are incorporated in DHCVIM and presents a prediction of the Surtsey Test DCH-2 that was made prior to execution of the experiment

  7. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions; instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
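The idea of parsing out independent error sources can be sketched in a few lines. This is a generic decomposition loosely in the spirit of validation standards such as ASME V&V 20, not the authors' exact formulation: the simulation-experiment comparison error bounds the model error once numerical and experimental uncertainties are accounted for.

```python
# Hedged sketch of error-source decomposition at a single validation point.
# Variable names and numeric values are illustrative assumptions.
import math

def model_error_interval(sim, exp, u_num, u_exp, u_input=0.0):
    """Return (E, u_val): comparison error and validation uncertainty.

    E      -- difference between simulated and measured values
    u_val  -- combined (root-sum-square) uncertainty from numerical,
              experimental and input sources
    The model error is then expected to lie roughly within E +/- u_val.
    """
    e = sim - exp
    u_val = math.sqrt(u_num**2 + u_exp**2 + u_input**2)
    return e, u_val

# Example: velocity at one PIV point, with assumed uncertainties (m/s)
e, u_val = model_error_interval(sim=0.52, exp=0.50, u_num=0.01, u_exp=0.02)
```

If |E| is small compared with u_val, the comparison cannot distinguish model error from noise; if |E| greatly exceeds u_val, the discrepancy is attributable to the solver's modeling assumptions.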

  8. Long-duration planar direct-drive hydrodynamics experiments on the NIF

    Science.gov (United States)

    Casner, A.; Mailliet, C.; Khan, S. F.; Martinez, D.; Izumi, N.; Kalantar, D.; Di Nicola, P.; Di Nicola, J. M.; Le Bel, E.; Igumenshchev, I.; Tikhonchuk, V. T.; Remington, B. A.; Masse, L.; Smalyuk, V. A.

    2018-01-01

    The advent of high-power laser facilities such as the National Ignition Facility (NIF) and the Laser Megajoule provides unique platforms to study the physics of turbulent mixing flows in high energy density plasmas. We report here on the commissioning of a novel planar direct-drive platform on the NIF, which allows the acceleration of targets for 30 ns. Planar plastic samples were directly irradiated by 300-450 kJ of UV laser light (351 nm), and very good planarity of the laser drive is demonstrated. No detrimental effect of imprint is observed for these thick plastic targets (300 μm), which is beneficial for future academic experiments requesting similar irradiation conditions. The long-duration direct-drive (DD) platform is thereafter harnessed to study the ablative Rayleigh-Taylor instability (RTI) in DD. The growth of two-dimensional pre-imposed perturbations is quantified through time-resolved face-on x-ray radiography and used as a benchmark for radiative hydrocode simulations. The ablative RTI is then quantified in its highly nonlinear stage, starting from intentionally large 3D imprinted broadband modulations. Two generations of bubble mergers are observed for the first time in DD, as a result of the unprecedentedly long laser acceleration.

  9. Methodologies for rapid evaluation of seismic demand levels in nuclear power plant structures

    International Nuclear Information System (INIS)

    Manrique, M.; Asfura, A.; Mukhim, G.

    1990-01-01

    A methodology for rapid assessment of both acceleration spectral peak and 'zero period acceleration' (ZPA) values for virtually any major structure in a nuclear power plant is presented. The methodology is based on spectral peak and ZPA amplification factors developed from regression analyses of an analytical database. The developed amplification factors are applied to the plant's design ground spectrum to obtain amplified response parameters. A practical application of the methodology is presented. This paper also presents a methodology for calculating acceleration response spectrum curves at any number of desired damping ratios directly from a spectrum at a single known damping ratio. The methodology presented is particularly useful and directly applicable to older vintage nuclear power plant facilities (e.g., those affected by USI A-46). It is based on principles of random vibration theory and has been implemented in a computer program (SPECGEN). SPECGEN results are compared with results obtained from time history analyses. (orig.)
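The goal of deriving spectra at new damping ratios from a single known-damping spectrum can be illustrated with a much simpler, commonly used scaling than SPECGEN's random-vibration formulation (which is not reproduced here): the Eurocode 8 damping correction factor η = sqrt(10 / (5 + ξ)), with ξ in percent of critical damping, applied to a 5%-damped spectrum. This is an illustrative stand-in, not the paper's method.

```python
# Hedged sketch: rescale a 5%-damped response spectrum to another damping
# ratio with the Eurocode 8 correction factor. Spectrum values are made up.
import math

def rescale_spectrum(sa_5pct, xi_percent):
    """Scale 5%-damped spectral accelerations to damping xi_percent (%)."""
    eta = math.sqrt(10.0 / (5.0 + xi_percent))
    return [sa * eta for sa in sa_5pct]

sa_5 = [0.8, 1.2, 1.0, 0.6]          # 5%-damped spectrum (g), illustrative
sa_2 = rescale_spectrum(sa_5, 2.0)   # lower damping -> higher response
```

Lower damping amplifies the spectrum (η > 1 for ξ < 5%), higher damping reduces it, which matches the qualitative behavior any damping-conversion method must reproduce.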

  10. Air-water mixing experiments for direct vessel injection of KNGR

    International Nuclear Information System (INIS)

    Hwang, Do Hyun

    2000-02-01

    Two air-water mixing experiments were conducted to understand the flow behavior in the downcomer for Direct Vessel Injection (DVI) of the Korean Next Generation Reactor (KNGR). In the first experiment, an air-water experiment in a rectangular channel with a gap size of 1 cm, the width of the water film is proportional to the water and air velocities, while the inclined angle is proportional to the water velocity only, regardless of the air velocity in the rectangular channel. It is observed that the amount of entrained water is negligible. In the second experiment, a full-scale water jetting experiment without air flow, the width of the water film is proportional to the flow rate injected from the pipe exit; the film thickness varies from 1.0 mm to 5.0 mm, and the maximum thickness does not exceed 5.0 mm. The amount of water separated from the liquid film after the water jet strikes the wall is measured. The amount of separated water is proportional to the flow rate, but the separation ratio in the full-scale water jetting is not over 15%. A simplified physical model, designed to predict the trajectories of the water film width, is validated through comparison with the experimental results. Water droplets injected 13° upward from the pipe constitute the outermost boundary at 1.7 m below pipe level after the water impinges against the wall. In the model, the parameter η, which represents the relationship between the jetting velocity and the initial spreading velocity, is inversely proportional to the water velocity when it impinges against the wall. The model's predictions lie within 14% of the experimental data when an exponential fit of η to the jetting water velocity is used

  11. Adopted Methodology for Cool-Down of SST-1 Superconducting Magnet System: Operational Experience with the Helium Refrigerator

    Science.gov (United States)

    Sahu, A. K.; Sarkar, B.; Panchal, P.; Tank, J.; Bhattacharya, R.; Panchal, R.; Tanna, V. L.; Patel, R.; Shukla, P.; Patel, J. C.; Singh, M.; Sonara, D.; Sharma, R.; Duggar, R.; Saxena, Y. C.

    2008-03-01

    The 1.3 kW at 4.5 K helium refrigerator/liquefier (HRL) was commissioned in 2003. The HRL has been operated in its different modes according to the functional requirements of the experiments. The superconducting magnet system (SCMS) of SST-1 was successfully cooled down to 4.5 K. The actual loads differed from the originally predicted boundary conditions, and an adjustment of the thermodynamic balance of the refrigerator was necessary. This led to enhanced capacity, which was achieved without any additional hardware. The control system of the HRL was tuned to achieve a stable thermodynamic balance while keeping the turbines' operating parameters at optimized conditions. An extra mass flow rate requirement was met by exploiting the margin available in the compressor station. The methodology adopted to modify the capacity of the HRL, the safety precautions, and the experience of the SCMS cool-down to 4.5 K are discussed.

  12. Direct {sup 13}C-detected NMR experiments for mapping and characterization of hydrogen bonds in RNA

    Energy Technology Data Exchange (ETDEWEB)

    Fürtig, Boris, E-mail: fuertig@nmr.uni-frankfurt.de; Schnieders, Robbin; Richter, Christian; Zetzsche, Heidi; Keyhani, Sara; Helmling, Christina [Johann Wolfgang Goethe Universität Frankfurt, Center for Biomolecular Magnetic Resonance (BMRZ), Institute of Organic Chemistry and Chemical Biology (Germany); Kovacs, Helena [Bruker BioSpin (Switzerland); Schwalbe, Harald, E-mail: schwalbe@nmr.uni-frankfurt.de [Johann Wolfgang Goethe Universität Frankfurt, Center for Biomolecular Magnetic Resonance (BMRZ), Institute of Organic Chemistry and Chemical Biology (Germany)

    2016-03-15

    In RNA secondary structure determination, it is essential to determine whether a nucleotide is base-paired or not. Base-pairing of nucleotides is mediated by hydrogen bonds. The NMR characterization of hydrogen bonds relies on experiments correlating the NMR resonances of exchangeable protons and can best be performed for structured parts of the RNA, where labile hydrogen atoms are protected from solvent exchange. Functionally important regions in RNA, however, frequently reveal increased dynamic disorder, which often leads to NMR signals of exchangeable protons that are broadened beyond {sup 1}H detection. Here, we develop {sup 13}C direct detected experiments to observe all nucleotides in RNA irrespective of whether they are involved in hydrogen bonds or not. Exploiting the self-decoupling of scalar couplings due to the exchange process, the hydrogen bonding behavior of the hydrogen bond donor of each individual nucleotide can be determined. Furthermore, the adaptation of HNN-COSY experiments for {sup 13}C direct detection allows correlations of donor-acceptor pairs and the localization of hydrogen-bond acceptor nucleotides. The proposed {sup 13}C direct detected experiments therefore provide information about molecular sites not amenable to conventional proton-detected methods. Such information makes RNA secondary structure determination by NMR more accurate and helps to validate secondary structure predictions based on bioinformatics.

  13. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
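The Monte Carlo estimation step at the heart of a code like TORMIS can be illustrated with a deliberately toy-scale sketch (vastly simplified and entirely illustrative; TORMIS chains models of tornado occurrence, missile injection, transport and impact, none of which are reproduced here): sample missile landing points and count the fraction that hit a target footprint.

```python
# Toy Monte Carlo sketch of a missile-impact probability estimate.
# Geometry, distributions and numbers are illustrative assumptions.
import random

def impact_probability(n_trials, target_x=(40.0, 60.0),
                       target_y=(40.0, 60.0), seed=0):
    """Estimate the probability that a missile lands on a target footprint,
    assuming (for illustration) uniform landing points on a 100x100 m pad."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x, y = rng.uniform(0.0, 100.0), rng.uniform(0.0, 100.0)
        if target_x[0] <= x <= target_x[1] and target_y[0] <= y <= target_y[1]:
            hits += 1
    return hits / n_trials

p = impact_probability(100_000)   # expected near (20*20)/(100*100) = 0.04
```

A production methodology replaces the uniform landing distribution with physics-based transport models and propagates the tornado occurrence frequency, but the risk estimate is still a sample-mean of hit/damage indicators, as above.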

  14. Studying human-automation interactions: methodological lessons learned from the human-centred automation experiments 1997-2001

    International Nuclear Information System (INIS)

    Massaiu, Salvatore; Skjerve, Ann Britt Miberg; Skraaning, Gyrd Jr.; Strand, Stine; Waeroe, Irene

    2004-04-01

    This report documents the methodological lessons learned from the Human Centred Automation (HCA) programme, both in terms of the psychometric evaluation of the measurement techniques developed for studying human-automation interaction and in terms of the application of advanced statistical methods for the analysis of experiments. The psychometric evaluation is based on data from the four experiments performed within the HCA programme. The result is a single-source reference text of measurement instruments for the study of human-automation interaction, part of which were specifically developed by the programme. The application of advanced statistical techniques is exemplified by additional analyses performed on the IPSN-HCA experiment of 1998. Special importance is given to the statistical technique of Structural Equation Modeling, for the possibility it offers to advance, and empirically test, comprehensive explanations of human-automation interactions. The additional analyses of the IPSN-HCA experiment investigated how the operators formed judgments about their own performance. The issue is of substantive interest for human-automation interaction research because the operators' over- or underestimation of their own performance could be seen as a symptom of human-machine mismatch, and a potential latent failure. These analyses concluded that the operators' bias in performance self-estimation is determined by the interplay between (1) the level of automation, (2) the nature of the task, (3) the level of scenario complexity, and (4) the level of trust in the automatic system. A structural model expressing the interplay of all these factors was empirically evaluated and found able to provide a concise and elegant explanation of the intricate pattern of relationships between the identified factors. (Author)

  15. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  16. Materials for pressure equipment under the new approach directives: a one-year home-experience

    International Nuclear Information System (INIS)

    Zdankiewicz, M.

    2005-01-01

    The New Approach Directives concerning pressure equipment set forth essential safety requirements for its design, manufacture and testing. This paper reviews one year of Polish experience implementing these requirements in the field of materials. (author)

  17. CONTAIN code analyses of direct containment heating experiments

    International Nuclear Information System (INIS)

    Williams, D.C.; Griffith, R.O.; Tadios, E.L.; Washington, K.E.

    1995-01-01

    In some nuclear reactor core-melt accidents, a potential exists for molten core-debris to be dispersed into the containment under high pressure. Resulting energy transfer to the containment atmosphere can pressurize the containment. This process, known as direct containment heating (DCH), has been the subject of extensive experimental and analytical programs sponsored by the U.S. Nuclear Regulatory Commission (NRC). The DCH modeling has been an important focus for the development of the CONTAIN code. Results of a detailed independent peer review of the CONTAIN code were published recently. This paper summarizes work performed in support of the peer review in which the CONTAIN code was applied to analyze DCH experiments. Goals of this work were comparison of calculated and experimental results, CONTAIN DCH model assessment, and development of guidance for code users, including development of a standardized input prescription for DCH analysis
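    The first-order physics behind DCH pressurization can be sketched with an ideal-gas energy balance: energy transferred to the containment atmosphere at constant volume raises pressure by roughly ΔP = (γ − 1)ΔE/V. The following is a simplified adiabatic estimate, not a CONTAIN calculation; the energy and volume figures are illustrative only.

```python
def dch_pressure_rise(energy_j, volume_m3, gamma=1.4):
    """Ideal-gas estimate of containment pressure rise (Pa) from
    energy added to the atmosphere at constant volume:
    dP = (gamma - 1) * dE / V."""
    return (gamma - 1.0) * energy_j / volume_m3

# Illustrative numbers (not from the experiments): 30 GJ transferred
# to a 50,000 m^3 containment atmosphere.
dp = dch_pressure_rise(30e9, 50_000.0)
print(f"pressure rise ~ {dp / 1e5:.1f} bar")
```

    Real DCH loads also depend on the dispersed debris fraction, heat-transfer rates, and hydrogen combustion, which is why mechanistic codes such as CONTAIN are used rather than this kind of bounding estimate.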

  18. What is the probability that direct detection experiments have observed dark matter?

    International Nuclear Information System (INIS)

    Bozorgnia, Nassim; Schwetz, Thomas

    2014-01-01

    In Dark Matter direct detection we are facing the situation of some experiments reporting positive signals which are in conflict with limits from other experiments. Such conclusions are subject to large uncertainties introduced by the poorly known local Dark Matter distribution. We present a method to calculate an upper bound on the joint probability of obtaining the outcome of two potentially conflicting experiments under the assumption that the Dark Matter hypothesis is correct, but completely independent of assumptions about the Dark Matter distribution. In this way we can quantify the compatibility of two experiments in an astrophysics-independent way. We illustrate our method by testing the compatibility of the hints reported by DAMA and CDMS-Si with the limits from the LUX and SuperCDMS experiments. The method does not require Monte Carlo simulations but is mostly based on Poisson statistics. In order to deal with signals of few events we introduce the so-called "signal length" to take into account energy information. The signal length method provides a simple way to calculate the probability of obtaining a given experimental outcome under a specified Dark Matter and background hypothesis.
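    The Poisson machinery underlying such compatibility statements is elementary. The sketch below is not the authors' full "signal length" construction; it only computes the building block they rely on, the probability of observing at least k events when mu are expected from signal plus background:

```python
from math import exp

def poisson_sf(k_obs, mu):
    """P(N >= k_obs) for N ~ Poisson(mu), computed from the
    complement of the CDF up to k_obs - 1."""
    if k_obs == 0:
        return 1.0
    term = exp(-mu)   # P(N = 0)
    cdf = term
    for i in range(1, k_obs):
        term *= mu / i  # P(N = i) from P(N = i - 1)
        cdf += term
    return 1.0 - cdf

# Probability of seeing >= 3 events when only 0.5 are expected
print(f"{poisson_sf(3, 0.5):.4f}")
```

    A small value of this probability under every admissible Dark Matter velocity distribution is what signals tension between two experiments.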

  19. Experiment on direct nn scattering - The radiation-induced outgassing complication

    Energy Technology Data Exchange (ETDEWEB)

    Stephenson, S.L., E-mail: sstephen@gettysburg.edu [Gettysburg College, Gettysburg, PA 17325 (United States); Crawford, B.E. [Gettysburg College, Gettysburg, PA 17325 (United States); Furman, W.I.; Lychagin, E.V.; Muzichka, A.Yu.; Nekhaev, G.V.; Sharapov, E.I.; Shvetsov, V.N.; Strelkov, A.V. [Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Levakov, B.G.; Lyzhin, A.E.; Chernukhin, Yu.I. [Russian Federal Nuclear Center - All Russian Research Institute of Technical Physics, P.O. Box 245, 456770 Snezhinsk (Russian Federation); Howell, C.R. [Duke University and Triangle Universities Nuclear Laboratory, Durham, NC 27708-0308 (United States); Mitchell, G.E. [North Carolina State University, Raleigh, NC 27695-8202 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708-0308 (United States); Tornow, W. [Duke University and Triangle Universities Nuclear Laboratory, Durham, NC 27708-0308 (United States); Showalter-Bucher, R.A. [Northeastern University, Boston, MA 02115 (United States)

    2012-12-01

    The first direct neutron-neutron scattering experiment using the YAGUAR pulsed reactor has yielded initial results. They show an unforeseen, significant thermal neutron background caused by radiation-induced desorption within the scattering chamber. Thermal neutrons scatter mostly not from other neutrons but from the desorbed gas molecules. Analysis of the obtained neutron time-of-flight spectra suggests neutron scattering from H{sub 2} molecules. The presented desorption model agrees with our experimental value of the desorption yield {eta}{sub {gamma}}=0.02 molecules/gamma. Possible techniques to reduce the effect of the desorption background are presented.

  20. Disconfirming User Expectations of the Online Service Experience: Inferred versus Direct Disconfirmation Modeling.

    Science.gov (United States)

    O'Neill, Martin; Palmer, Adrian; Wright, Christine

    2003-01-01

    Disconfirmation models of online service measurement seek to define service quality as the difference between user expectations of the service to be received and perceptions of the service actually received. Two such models (inferred and direct disconfirmation) for measuring quality of the online experience are compared (WebQUAL, SERVQUAL). Findings…

  1. Direct photon production in Pb-Pb collisions at the LHC with the ALICE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Bock, Friederike [Physikalisches Institut, Heidelberg University (Germany); Lawrence Berkeley National Laboratory, Berkeley (United States)

    2015-07-01

    Unlike hadrons, direct photons are produced in all stages of a nucleus-nucleus collision and therefore test our understanding of the space-time evolution of the produced medium. Of particular interest are so-called thermal photons expected to be produced in a quark-gluon plasma and the subsequent hadron gas. The transverse momentum spectrum of thermal photons carries information about the temperature of the emitting medium. In this presentation, direct-photon spectra from Pb-Pb collisions at √(s{sub NN}) = 2.76 TeV and p-Pb collisions at √(s{sub NN}) = 5.02 TeV are shown. The results were obtained by measuring e{sup +}e{sup -} pairs from external conversions of photons in the detector material. The measured direct-photon spectra are compared with predictions from state-of-the-art hydrodynamic models. In the standard hydrodynamical modeling of nucleus-nucleus collisions, thermal photons mostly come from the early hot stage of the collision. As collective hydrodynamic flow needs time to build up, the azimuthal anisotropy of thermal photons quantified with the Fourier coefficient v{sub 2} is expected to be smaller than the one for hadrons. However, the PHENIX experiment and ALICE experiment observed v{sub 2} values of direct-photons similar in magnitude to the pion v{sub 2}. We present the inclusive photon v{sub 2} and v{sub 3} in Pb-Pb collisions at √(s{sub NN}) = 2.76 TeV and discuss implications for the v{sub 2} and v{sub 3} of direct-photons.

  2. Development of sodium droplet combustion analysis methodology using direct numerical simulation in 3-dimensional coordinate (COMET)

    International Nuclear Information System (INIS)

    Okano, Yasushi; Ohira, Hiroaki

    1998-08-01

    In the early stage of a sodium leak event in a liquid metal fast breeder reactor (LMFBR), liquid sodium flows out from piping, and ignition and combustion of a liquid sodium droplet might occur under certain environmental conditions. Compressible forced air flow, diffusion of chemical species, liquid sodium droplet behavior, chemical reactions and thermodynamic properties should be evaluated, considering the physical dependences and numerical connections among them, when analyzing the combustion of a liquid sodium droplet. A direct numerical simulation code was developed for numerical analysis of a liquid sodium droplet in forced-convection air flow. The code was named COMET, for 'Sodium Droplet COmbustion Analysis METhodology using Direct Numerical Simulation in 3-Dimensional Coordinate'. The extended MAC method was used to calculate compressible forced air flow. Counter-diffusion among chemical species is also calculated. Transport models for mass and energy exchange between the droplet and the surrounding atmospheric air were developed. Equation-solving methods were used for computing multiphase equilibrium between sodium and air, and thermodynamic properties of chemical species were evaluated using the kinetic theory of gases. Combustion of a single spherical liquid sodium droplet in a forced-convection, constant-velocity, uniform air flow was numerically simulated using COMET. The change of droplet diameter with time agreed closely with the d²-law of droplet combustion theory. Spatial distributions of combustion rate and heat generation, and the formation, decomposition and movement of chemical species, were analyzed. Simulating liquid sodium droplet combustion with COMET enables quantitative calculation of heat generation and chemical species formation in spray combustion for a wide range of environmental conditions. (author)
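    The d²-law referenced above states that the squared droplet diameter shrinks linearly in time, d(t)² = d₀² − Kt, giving a burnout time of d₀²/K. A minimal sketch; the evaporation/combustion constant K used here is assumed for illustration, not taken from the COMET runs:

```python
def d2_law_diameter(d0_m, k_m2_s, t_s):
    """Droplet diameter under the d^2-law: d(t)^2 = d0^2 - K*t."""
    d_squared = d0_m ** 2 - k_m2_s * t_s
    return max(d_squared, 0.0) ** 0.5

def burnout_time(d0_m, k_m2_s):
    """Time at which the droplet is fully consumed: d0^2 / K."""
    return d0_m ** 2 / k_m2_s

# Illustrative values: a 1 mm droplet with an assumed K of 1e-7 m^2/s
tb = burnout_time(1e-3, 1e-7)
print(f"burnout after {tb:.1f} s")  # -> burnout after 10.0 s
```

    Agreement of simulated diameter histories with this straight line in d² versus t is the classical sanity check the abstract refers to.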

  3. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Full Text Available Introduction: the article imparts the authors' thoughts on a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive usage of mathematical electronic courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching students in university and the following methods of empirical research were used: pedagogical experiment, pedagogical measurements and experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using the key examples and exercises. Results: the online course “Mathematics” was designed and developed on the platform of Learning Management System Moodle. The article presents the results of test assignments assessing students’ intellectual abilities and analysis of solutions of various types of mathematical problems by students. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determination of the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggested that the new methodology is able to have a positive effect on the learning process. The learning programme determined the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  4. Direct and generative retrieval of autobiographical memories: The roles of visual imagery and executive processes.

    Science.gov (United States)

    Anderson, Rachel J; Dewhurst, Stephen A; Dean, Graham M

    2017-03-01

    Two experiments used a dual task methodology to investigate the role of visual imagery and executive resources in the retrieval of specific autobiographical memories. In Experiment 1, dynamic visual noise led to a reduction in the number of specific memories retrieved in response to both high and low imageability cues, but did not affect retrieval times. In Experiment 2, irrelevant pictures reduced the number of specific memories but only in response to low imageability cues. Irrelevant pictures also increased response times to both high and low imageability cues. The findings are in line with previous work suggesting that disrupting executive resources may impair generative, but not direct, retrieval of autobiographical memories. In contrast, visual distractor tasks appear to impair access to specific autobiographical memories via both the direct and generative retrieval routes, thereby highlighting the potential role of visual imagery in both pathways. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
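    The low-level, counter-based view described above typically reduces to ratios of raw hardware counters. A minimal sketch of such derived efficiency metrics; the counter names and values are hypothetical, not output of OpenUH or any particular tool:

```python
def derived_metrics(counters):
    """Compute common efficiency ratios from raw counter values."""
    return {
        # Instructions retired per clock cycle
        "ipc": counters["instructions"] / counters["cycles"],
        # Fraction of loads missing the L1 data cache
        "l1_miss_rate": counters["l1_misses"] / counters["loads"],
    }

sample = {"instructions": 8.0e9, "cycles": 4.0e9,
          "l1_misses": 2.0e7, "loads": 1.0e9}
m = derived_metrics(sample)
print(f"IPC={m['ipc']:.2f}, L1 miss rate={m['l1_miss_rate']:.1%}")
```

    In the methodology the compiler's knowledge of applied transformations is what turns such ratios into actionable diagnoses rather than bare numbers.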

  6. Narrative inquiry: Locating Aboriginal epistemology in a relational methodology.

    Science.gov (United States)

    Barton, Sylvia S

    2004-03-01

    This methodology utilizes narrative analysis and the elicitation of life stories as understood through dimensions of interaction, continuity, and situation. It is congruent with Aboriginal epistemology formulated by oral narratives through representation, connection, storytelling and art. Needed for culturally competent scholarship is an experience of research whereby inquiry into epiphanies, ritual, routines, metaphors and everyday experience creates a process of reflexive thinking for multiple ways of knowing. Based on the sharing of perspectives, narrative inquiry allows for experimentation into creating new forms of knowledge by contextualizing diabetes from the experience of a researcher overlapped with experiences of participants--a reflective practice in itself. The aim of this paper is to present narrative inquiry as a relational methodology and to analyse critically its appropriateness as an innovative research approach for exploring Aboriginal people's experience living with diabetes. Narrative inquiry represents an alternative culture of research for nursing science to generate understanding and explanation of Aboriginal people's 'diabetic self' stories, and to coax open a window for co-constructing a narrative about diabetes as a chronic illness. The ability to adapt a methodology for use in a cultural context, preserve the perspectives of Aboriginal peoples, maintain the holistic nature of social problems, and value co-participation in respectful ways are strengths of an inquiry partial to a responsive and embodied scholarship.

  7. Conducting interactive experiments online.

    Science.gov (United States)

    Arechar, Antonio A; Gächter, Simon; Molleman, Lucas

    2018-01-01

    Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.

  8. Methodological Reflections on Working with Young Children

    DEFF Research Database (Denmark)

    Korn, Matthias

    2009-01-01

    This paper provides methodological reflections on an evolutionary and participatory software development process for designing interactive systems with children of very young age. The approach was put into practice for the design of a software environment for self-directed project management...

  9. Experimental facility and methodology for systematic studies of cold startability in direct injection Diesel engines

    Science.gov (United States)

    Pastor, J. V.; García-Oliver, J. M.; Pastor, J. M.; Ramírez-Hernández, J. G.

    2009-09-01

    Cold start at low temperatures in current direct injection (DI) Diesel engines is a problem which has not yet been properly solved and it becomes particularly critical with the current trend to reduce the engine compression ratio. Although it is clear that there are some key factors whose control leads to a proper cold start process, their individual relevance and relationships are not clearly understood. Thus, efforts on optimization of the cold start process are mainly based on a trial-and-error procedure in climatic chambers at low ambient temperature, with serious limitations in terms of measurement reliability during such a transient process, low repeatability and experimental cost. This paper presents a novel approach for an experimental facility capable of simulating real engine cold start, at room temperature and under well-controlled low speed and low temperature conditions. It is based on an optical single cylinder engine adapted to reproduce in-cylinder conditions representative of those of a real engine during start at cold ambient temperatures (of the order of -20 °C). Such conditions must be realistic, controlled and repeatable in order to perform systematic studies in the borderline between ignition success and misfiring. An analysis methodology, combining optical techniques and heat release analysis of individual cycles, has been applied.

  10. Experimental facility and methodology for systematic studies of cold startability in direct injection Diesel engines

    International Nuclear Information System (INIS)

    Pastor, J V; García-Oliver, J M; Pastor, J M; Ramírez-Hernández, J G

    2009-01-01

    Cold start at low temperatures in current direct injection (DI) Diesel engines is a problem which has not yet been properly solved and it becomes particularly critical with the current trend to reduce the engine compression ratio. Although it is clear that there are some key factors whose control leads to a proper cold start process, their individual relevance and relationships are not clearly understood. Thus, efforts on optimization of the cold start process are mainly based on a trial-and-error procedure in climatic chambers at low ambient temperature, with serious limitations in terms of measurement reliability during such a transient process, low repeatability and experimental cost. This paper presents a novel approach for an experimental facility capable of simulating real engine cold start, at room temperature and under well-controlled low speed and low temperature conditions. It is based on an optical single cylinder engine adapted to reproduce in-cylinder conditions representative of those of a real engine during start at cold ambient temperatures (of the order of −20 °C). Such conditions must be realistic, controlled and repeatable in order to perform systematic studies in the borderline between ignition success and misfiring. An analysis methodology, combining optical techniques and heat release analysis of individual cycles, has been applied

  11. Hanford Site Risk Assessment Methodology. Revision 3

    International Nuclear Information System (INIS)

    1995-05-01

    This methodology has been developed to prepare human health and ecological evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) remedial investigations (RI) and the Resource Conservation and Recovery Act of 1976 (RCRA) facility investigations (FI) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order (Ecology et al. 1994), referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies site-specific risk assessment considerations and integrates them with approaches for evaluating human and ecological risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.
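    The human-health leg of such CERCLA-style risk assessments ultimately rests on standard intake and hazard-quotient arithmetic. A sketch of the generic formulas; all parameter values below are illustrative, not Hanford-specific:

```python
def chronic_daily_intake(conc_mg_per_kg, ingestion_kg_per_day,
                         exposure_days, body_weight_kg, averaging_days):
    """Chronic daily intake (mg/kg-day) for an ingestion pathway:
    CDI = C * IR * ED / (BW * AT)."""
    return (conc_mg_per_kg * ingestion_kg_per_day * exposure_days
            / (body_weight_kg * averaging_days))

def hazard_quotient(cdi, reference_dose):
    """HQ = CDI / RfD; HQ > 1 flags potential noncancer concern."""
    return cdi / reference_dose

# Illustrative soil-ingestion scenario: 50 mg/kg contaminant,
# 100 mg soil/day, 350 days/yr over 25 years, 70 kg adult.
cdi = chronic_daily_intake(50.0, 1e-4, 350 * 25, 70.0, 365 * 25)
print(f"CDI={cdi:.2e} mg/kg-day, HQ={hazard_quotient(cdi, 3e-4):.2f}")
```

    A site methodology such as Hanford's standardizes which exposure parameters and toxicity values feed these equations so that individual assessments are comparable.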

  12. Similarity principles for equipment qualification by experience

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1988-07-01

    A methodology is developed for seismic qualification of nuclear plant equipment by applying similarity principles to existing experience data. Experience data are available from previous qualifications by analysis or testing, or from actual earthquake events. Similarity principles are defined in terms of excitation, equipment physical characteristics, and equipment response. Physical similarity is further defined in terms of a critical transfer function for response at a location on a primary structure, whose response can be assumed directly related to ultimate fragility of the item under elevated levels of excitation. Procedures are developed for combining experience data into composite specifications for qualification of equipment that can be shown to be physically similar to the reference equipment. Other procedures are developed for extending qualifications beyond the original specifications under certain conditions. Examples of applying and verifying the procedures are given for cases that can be approximated by a simple two-degree-of-freedom primary/secondary system. Other examples are based on actual test data available from previous qualifications. Relationships of the developments with other previously published methods are discussed. The developments are intended to elaborate on the rather broad revised guidelines developed by the IEEE 344 Standards Committee for equipment qualification in new nuclear plants. However, the results also contribute to filling a gap that exists between the IEEE 344 methodology and that previously developed by the Seismic Qualification Utilities Group. The relationship of the results to safety margin methodology is also discussed. (author)
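    Although the methodology is framed around a two-degree-of-freedom primary/secondary system, the role a transfer function plays in relating excitation to response can be illustrated with the textbook single-degree-of-freedom magnification factor; the sketch below uses that simplification, not the paper's formulation:

```python
from math import sqrt

def sdof_magnification(freq_ratio, damping_ratio):
    """Dynamic magnification factor of a damped SDOF oscillator:
    |H| = 1 / sqrt((1 - r^2)^2 + (2*zeta*r)^2),
    where r = forcing frequency / natural frequency."""
    r, z = freq_ratio, damping_ratio
    return 1.0 / sqrt((1.0 - r * r) ** 2 + (2.0 * z * r) ** 2)

# Amplification at resonance (r = 1) for 5% of critical damping
print(f"{sdof_magnification(1.0, 0.05):.1f}")  # -> 10.0
```

    Comparing such response functions between candidate and reference equipment is, in simplified form, what the "critical transfer function" similarity test formalizes.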

  13. Towards methodological improvement in the Spanish studies

    Directory of Open Access Journals (Sweden)

    Beatriz Amante García

    2012-09-01

    Full Text Available The European Higher Education Area (EHEA) has triggered many changes in the new degrees in Spanish universities, mainly in terms of methodology and assessment. However, making such changes a success requires coordination within the teaching staff as well as the use of active methodologies that enhance and encourage students’ participation in all the activities carried out in the classroom. This is especially true of formative and summative assessment, in which students become responsible for their own learning process (López-Pastor, 2009; Torre, 2008). In this second issue of JOTSE we have included several teaching innovation experiences related to the above-mentioned methodological and assessment changes.

  14. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm

    2017-01-01

    This paper discusses methods for assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology ... are presented. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer ... to properly reveal the clinical value. The paper exemplifies the methodology using recent studies of Synthetic Aperture Sequential Beamforming tissue harmonic imaging.

  15. Evolutionary-Simulative Methodology in the Management of Social and Economic Systems

    Directory of Open Access Journals (Sweden)

    Konyavskiy V.A.

    2017-01-01

    Full Text Available The article outlines the main provisions of the evolutionary-simulative methodology (ESM), a methodology for the mathematical modeling of equilibrium random processes (CPR) that is widely used in the economy. It discusses the main directions for using ESM to solve problems in the management of social and economic systems.

  16. A Methodological Study of Order Effects in Reporting Relational Aggression Experiences.

    Science.gov (United States)

    Serico, Jennifer M; NeMoyer, Amanda; Goldstein, Naomi E S; Houck, Mark; Leff, Stephen S

    2018-03-01

    Unlike the overt nature of physical aggression, which lends itself to simpler and more direct methods of investigation, the often-masked nature of relational aggression has led to difficulties and debate regarding the most effective tools of study. Given concerns with the accuracy of third-party relational aggression reports, especially as individuals age, self-report measures may be particularly useful when assessing experiences with relational aggression. However, it is important to recognize validity concerns (in particular, the potential effects of item order presentation) associated with self-report of relational aggression perpetration and victimization. To investigate this issue, surveys were administered and completed by 179 young adults randomly assigned to one of four survey conditions reflecting manipulation of item order. Survey conditions included presentation of (a) perpetration items only, (b) victimization items only, (c) perpetration items followed by victimization items, and (d) victimization items followed by perpetration items. Results revealed that participants reported perpetrating relational aggression significantly more often when asked only about perpetration or when asked about perpetration before victimization, compared with participants who were asked about victimization before perpetration. Item order manipulation did not result in significant differences in self-reported victimization experiences. Results of this study indicate a need for greater consideration of item order when conducting research using self-report data and the importance of additional investigation into which form of item presentation elicits the most accurate self-report information.

  17. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review of, and guiding principles for, constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information for constructing an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
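    At the top of the indicator hierarchy described above sits aggregate energy intensity (energy use divided by activity), which is then broken down by sector to separate structural from efficiency effects. A minimal sketch with illustrative data, not figures from the booklet:

```python
def energy_intensity(energy_pj, activity):
    """Aggregate intensity = energy use / activity
    (e.g. PJ per unit of GDP)."""
    return energy_pj / activity

def sector_shares(sector_energy_pj):
    """Share of each sector in total energy use."""
    total = sum(sector_energy_pj.values())
    return {sector: e / total for sector, e in sector_energy_pj.items()}

# Illustrative national energy balance (PJ)
use = {"industry": 1200.0, "transport": 900.0,
       "residential": 600.0, "commercial": 300.0}
print(sector_shares(use)["industry"])  # -> 0.4
```

    Tracking how such shares and intensities move over time is what lets policy makers distinguish genuine efficiency gains from shifts in the mix of activity.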

  18. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  19. Hanford Site baseline risk assessment methodology. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

  20. Operational experiences of (in)direct co-combustion in coal and gas fired power plants in Europe

    International Nuclear Information System (INIS)

    Van Ree, R.; Korbee, R.; Meijer, R.; Konings, T.; Van Aart, F.

    2001-02-01

    The operational experiences of direct and indirect co-combustion of biomass/waste in European coal and natural gas fired power plants are addressed. The operational experiences of mainly Dutch direct co-combustion activities in coal fired power plants are discussed; whereas an overview of European indirect co-combustion activities is presented. The technical, environmental, and economic feasibility of different indirect co-combustion concepts (i.e. upstream gasification, pyrolysis, combustion with steam-side integration) is investigated, and the results are compared with the economic preferable concept of direct co-combustion. Main technical constraints that limit the co-combustion capacity of biomass/waste in conventional coal fired power plants are: the grindability of the biomass/coal blend, the capacity of available unit components, and the danger of severe slagging, fouling, corrosion and erosion. The main environmental constraints that have to be taken into account are the quality of produced solid waste streams (fly ash, bottom ash, gypsum) and the applicable air emission regulations. 6 refs

  1. Conjugate gradient based projection - A new explicit methodology for frictional contact

    Science.gov (United States)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention to applicability to parallel computation and vectorization, a new and effective explicit approach to linear complementarity formulations, involving a conjugate gradient based projection methodology, is proposed in this study for contact problems with Coulomb friction. The overall objective is to provide an explicit methodology of computation for the complete contact problem with friction. The primary idea for solving the linear complementarity formulations stems from an established search direction that is projected onto a feasible region determined by the non-negativity constraint condition; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, and fast computational speed, and which is relatively simple to implement for contact problems involving Coulomb friction.
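The projection idea described above can be sketched in a few lines: a projected steepest-descent iteration for a small linear complementarity problem (LCP). This is an illustrative simplification under assumed data (the matrix M, step size, and iteration count are hypothetical), not the paper's Fletcher-Reeves conjugate gradient scheme.

```python
def projected_gradient_lcp(M, q, iters=500, lr=0.05):
    """Solve the LCP  w = M z + q,  z >= 0,  w >= 0,  z . w = 0
    (M symmetric positive definite) by steepest descent on the
    quadratic f(z) = 0.5 z'Mz + q'z, projecting every iterate
    onto the feasible region z >= 0 (the non-negative orthant)."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        # search direction: gradient of f at z is M z + q
        grad = [sum(M[i][j] * z[j] for j in range(n)) + q[i] for i in range(n)]
        # descent step followed by projection onto z >= 0
        z = [max(zi - lr * gi, 0.0) for zi, gi in zip(z, grad)]
    return z
```

For M = [[2, 0], [0, 2]] and q = [-2, 1] the iteration converges to z = (1, 0), which satisfies complementarity since w = Mz + q = (0, 1).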

  2. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
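The two design families named above can be made concrete with a short sketch (using the usual -1/+1 level coding; confounding the last factor with the interaction of the others is one common generator choice, not something stated in the presentation):

```python
from itertools import product
from math import prod

def full_factorial(k):
    """All 2**k runs of a two-level full factorial design,
    with each factor coded -1 (low) / +1 (high)."""
    return [list(run) for run in product((-1, 1), repeat=k)]

def half_fraction(k):
    """A 2**(k-1) half fraction: the k-th factor is set equal to the
    product of the first k-1 factors, i.e. aliased with their interaction."""
    return [run + [prod(run)] for run in full_factorial(k - 1)]
```

full_factorial(3) yields all 8 runs of a 2^3 design, while half_fraction(3) yields the 4-run half fraction in which the third factor is aliased with the two-factor interaction.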

  3. arXiv Inelastic Boosted Dark Matter at Direct Detection Experiments

    CERN Document Server

    Giudice, Gian F.; Park, Jong-Chul; Shin, Seodong

    2018-05-10

    We explore a novel class of multi-particle dark sectors, called Inelastic Boosted Dark Matter (iBDM). These models are constructed by combining properties of particles that scatter off matter by making transitions to heavier states (Inelastic Dark Matter) with properties of particles that are produced with a large Lorentz boost in annihilation processes in the galactic halo (Boosted Dark Matter). This combination leads to new signals that can be observed at ordinary direct detection experiments, but require unconventional searches for energetic recoil electrons in coincidence with displaced multi-track events. Related experimental strategies can also be used to probe MeV-range boosted dark matter via their interactions with electrons inside the target material.

  4. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments.We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  5. Review of experience with plutonium exposure assessment methodologies at the nuclear fuel reprocessing site of British Nuclear Fuels plc

    International Nuclear Information System (INIS)

    Strong, R.

    1988-01-01

    British Nuclear Fuels plc and its predecessors have provided a complete range of nuclear fuel services to utilities in the UK and elsewhere for more than 30 years. Over 30,000 tonnes of Magnox and oxide fuel have been reprocessed at Sellafield. During this time, substantial experience has been accumulated with methodologies for the assessment of exposure to actinides, mainly isotopes of plutonium. For most of the period, monitoring of personnel included assessment of systemic uptake deduced from plutonium-in-urine results. The purpose of the paper is to present some conclusions of contemporary work in this area

  6. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    Science.gov (United States)

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...

  7. Experience and benefits from using the EPRI MOV Performance Prediction Methodology in nuclear power plants

    International Nuclear Information System (INIS)

    Walker, T.; Damerell, P.S.

    1999-01-01

    The EPRI MOV Performance Prediction Methodology (PPM) is an effective tool for evaluating design basis thrust and torque requirements for MOVs. Use of the PPM has become more widespread in US nuclear power plants as they close out their Generic Letter (GL) 89-10 programs and address MOV periodic verification per GL 96-05. The PPM has also been used at plants outside the US, many of which are implementing programs similar to US plants' GL 89-10 programs. The USNRC Safety Evaluation of the PPM and the USNRC's discussion of the PPM in GL 96-05 make the PPM an attractive alternative to differential pressure (DP) testing, which can be costly and time-consuming. Significant experience and benefits, which are summarized in this paper, have been gained using the PPM. Although use of PPM requires a commitment of resources, the benefits of a solidly justified approach and a reduced need for DP testing provide a substantial safety and economic benefit. (author)

  8. Multimodal imaging and detection approach to 18F-FDG-directed surgery for patients with known or suspected malignancies: a comprehensive description of the specific methodology utilized in a single-institution cumulative retrospective experience

    Directory of Open Access Journals (Sweden)

    Povoski Stephen P

    2011-11-01

    Full Text Available Abstract Background 18F-FDG PET/CT is widely utilized in the management of cancer patients. The aim of this paper was to comprehensively describe the specific methodology utilized in our single-institution cumulative retrospective experience with a multimodal imaging and detection approach to 18F-FDG-directed surgery for known/suspected malignancies. Methods From June 2005-June 2010, 145 patients were injected with 18F-FDG in anticipation of surgical exploration, biopsy, and possible resection of known/suspected malignancy. Each patient underwent one or more of the following: (1 same-day preoperative patient diagnostic PET/CT imaging, (2 intraoperative gamma probe assessment, (3 clinical PET/CT specimen scanning of whole surgically resected specimens (WSRS, research designated tissues (RDT, and/or sectioned research designated tissues (SRDT, (4 micro PET/CT specimen scanning of WSRS, RDT, and/or SRDT, (5 total radioactivity counting of each SRDT piece by an automatic gamma well counter, and (6 same-day postoperative patient diagnostic PET/CT imaging. Results Same-day 18F-FDG injection dose was 15.1 (± 3.5, 4.6-26.1 mCi. Fifty-five same-day preoperative patient diagnostic PET/CT scans were performed. One hundred forty-two patients were taken to surgery. Three of the same-day preoperative patient diagnostic PET/CT scans led to the cancellation of the anticipated surgical procedure. One hundred forty-one cases utilized intraoperative gamma probe assessment. Sixty-two same-day postoperative patient diagnostic PET/CT scans were performed. WSRS, RDT, and SRDT were scanned by clinical PET/CT imaging and micro PET/CT imaging in 109 and 32 cases, 33 and 22 cases, and 49 and 26 cases, respectively. Time from 18F-FDG injection to same-day preoperative patient diagnostic PET/CT scan, intraoperative gamma probe assessment, and same-day postoperative patient diagnostic PET/CT scan were 73 (± 9, 53-114, 286 (± 93, 176-532, and 516 (± 134, 178-853 minutes

  9. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
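As a minimal illustration of the kind of calculation involved (a textbook formula for estimating a population mean with known standard deviation, not an excerpt from the book):

```python
import math

def sample_size_mean(sigma, margin, z=1.96):
    """Smallest n for which a confidence interval for a population mean
    (known standard deviation sigma, normal critical value z) has
    half-width at most `margin`:  n >= (z * sigma / margin)**2."""
    return math.ceil((z * sigma / margin) ** 2)
```

For sigma = 15 and a desired margin of 3 at 95% confidence (z = 1.96), this gives n = 97.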

  10. Measurement of direct CP-violation with the NA48 experiment at the CERN SPS

    CERN Document Server

    Blümer, H

    1999-01-01

    The NA48 experiment at the CERN SPS uses simultaneous, nearly collinear beams of long-lived and short-lived neutral kaons to measure the direct CP-violation parameter ε′/ε using the double ratio method to an overall accuracy of 2×10⁻⁴, three times better than previous results. The detector has been installed and commissioned in 1995 and 1996. First physics data were recorded during 42 days in fall 1997 yielding more events than the previous experiment NA31. The talk presents the apparatus performance, data quality, the current status of the physics analysis and ongoing activities. The experiment has performed another data run from May to September 1998, which has given a substantial increase in statistics. (11 refs).
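In the double ratio method, R = [Γ(KL→π⁰π⁰)/Γ(KS→π⁰π⁰)] / [Γ(KL→π⁺π⁻)/Γ(KS→π⁺π⁻)] and, to first order, R ≈ 1 − 6·Re(ε′/ε). A minimal sketch of that standard relation (taken from the general literature, not spelled out in this abstract):

```python
def re_eps_prime_over_eps(double_ratio):
    """Extract Re(eps'/eps) from the measured double ratio R of the
    four K -> pi pi decay rates, using the first-order relation
    R = 1 - 6 * Re(eps'/eps)."""
    return (1.0 - double_ratio) / 6.0
```

A double ratio of exactly 1 corresponds to no direct CP violation.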

  11. The opinions and experiences of family physicians regarding direct-to-consumer advertising.

    Science.gov (United States)

    Lipsky, M S; Taylor, C A

    1997-12-01

    The use of direct-to-consumer advertising (DTCA) by pharmaceutical companies is increasing. Our study examines the opinions and experiences of family physicians concerning DTCA. A survey instrument designed to elicit the opinions, experiences, and perceptions of family physicians about DTCA was sent to a 2% (N = 880) systematic sampling of active physician members of the American Academy of Family Physicians. Descriptive statistics were used to analyze responses, with t tests and chi-square tests for independence used to examine subgroup response differences. Four hundred fifty-four (52%) physicians responded to the survey. Most physicians (95%) had encountered DTCA personally, and had been approached by an average of 7 patients over the previous 6 months with requests for specific prescription drugs. Prescription antihistamines and antihypertensive drugs were the most commonly requested. Overall, 80% of the physician respondents believed that print DTCA was not a good idea, while 84% expressed negative feelings about television and radio advertising. Both groups cited "misleading biased view" and "increased costs" as the most common disadvantages. Some reported benefits included "better informed patients" and "promoting physician-patient communication." Overall, the study group physicians had negative feelings about DTCA in both print and electronic media. Studies directly examining patient perspectives, as well as cost benefits, are necessary to test the validity of the physicians' perceptions about DTCA.
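For a 2×2 contingency table such as those behind the subgroup comparisons, the chi-square statistic for independence has a simple closed form; a generic statistics sketch with made-up counts, not the survey's data:

```python
def chi2_independence_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]], testing independence of rows and columns:
    chi2 = n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

The statistic is compared against the chi-square critical value with 1 degree of freedom (3.84 at the 5% level).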

  12. An objective methodology for the evaluation of the air quality stations positioning

    International Nuclear Information System (INIS)

    Benassi, A.; Marson, G.; Baraldo, E.; Dalan, F.; Lorenzet, K.; Bellasio, R.; Bianconi, R.

    2006-01-01

    This work describes a methodology for the evaluation of the correct positioning of the monitoring stations of an air quality network. The methodology is based on the Italian legislation, the European Directives and on some technical documents used as guidelines at the European level. The paper describes all the assumptions on which the methodology is based and the results of its application to the air quality network of the Veneto Region (Italy)

  13. Fuel-pellet-fabrication experience using direct-denitration-recycle-PuO2-coprecipitated mixed oxide

    International Nuclear Information System (INIS)

    Rasmussen, D.E.; Schaus, P.S.

    1980-01-01

    The fuel pellet fabrication experience described in this paper involved three different feed powders: coprecipitated PuO2-UO2 which was flash calcined in a fluidized bed; co-direct denitrated PuO2-UO2; and direct denitrated LWR recycle PuO2 which was mechanically blended with natural UO2. The objectives of this paper are twofold: first, to demonstrate that acceptable quality fuel pellets were fabricated using feed powders manufactured by processes other than the conventional oxalate process; and second, to highlight some pellet fabrication difficulties experienced with the direct denitration LWR recycle PuO2 feed material, which did not produce acceptable pellets. The direct denitration LWR recycle PuO2 was available as a by-product and was not specifically produced for use in fuel pellet fabrication. Nevertheless, its characteristics and pellet fabrication behavior serve to re-emphasize the importance of continued process development involving both powder suppliers and fuel fabricators to close the fuel cycle in the future

  14. Sustainability of sunflower cultivation for biodiesel production in central Italy according to the Renewable Energy Directive methodology

    Directory of Open Access Journals (Sweden)

    Daniele Duca

    2014-02-01

    The use of renewable energies as an alternative to fossil fuels has value from different points of view, with effects at the environmental, social and economic levels. These aspects are often connected to each other and together define the overall sustainability of bioenergy. At the European level, Directive 2009/28/EC gives the basic criteria for estimating the sustainability of biofuels and indicates a minimum threshold of 35% greenhouse gas saving for a biofuel to be considered sustainable. The Directive allows the identification of standard regional values for the cultivation steps that could be utilized for certification. This paper aims to contribute to the definition of these values by applying the RED methodology to sunflower cropped in central Italy, which is characterized by a hilly landscape and non-irrigated crops. To determine the inputs and outputs of sunflower cultivation in central Italy, the results of the PROBIO project, carried out by the Authors, were used. The sustainability of biodiesel produced from sunflower grown in central Italy is variable and depends on the nitrogen input and on the seasonal climatic conditions that affect yields. The greenhouse gas saving of the Italian chain is 40% on average, greater than the required 35%, and it would be possible to assign this value as a standard to the chain of biodiesel from sunflower cultivated in central Italy. Using an averaged regional standard value guards against considering harvests in unfavourable years unsustainable while overestimating those in favourable ones.
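The Directive's saving criterion compares the biofuel chain emissions E_B against a fossil fuel comparator E_F as (E_F − E_B) / E_F; a minimal sketch (83.8 g CO2eq/MJ is the RED fossil fuel comparator for transport fuels, assumed here as the default):

```python
def ghg_saving(e_biofuel, e_fossil=83.8):
    """Greenhouse gas saving fraction of a biofuel chain per the
    RED formula (E_F - E_B) / E_F, with emissions in g CO2eq/MJ."""
    return (e_fossil - e_biofuel) / e_fossil

def is_sustainable(e_biofuel, threshold=0.35):
    """RED sustainability check against the 35% minimum saving."""
    return ghg_saving(e_biofuel) >= threshold
```

A chain emitting 50.28 g CO2eq/MJ saves 40%, matching the average reported here for the Italian sunflower chain.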

  15. Theory of mind and Verstehen (understanding) methodology.

    Science.gov (United States)

    Kumazaki, Tsutomu

    2016-09-01

    Theory of mind is a prominent, but highly controversial, field in psychology, psychiatry, and philosophy of mind. Simulation theory, theory-theory and other views have been presented in recent decades, none of which are monolithic. In this article, various views on theory of mind are reviewed, and methodological problems within each view are investigated. The relationship between simulation theory and Verstehen (understanding) methodology in traditional human sciences is an intriguing issue, although the latter is not a direct ancestor of the former. From that perspective, lessons for current clinical psychiatry are drawn. © The Author(s) 2016.

  16. Assimilation of wind speed and direction observations: results from real observation experiments

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2015-06-01

    The assimilation of wind observations in the form of speed and direction (asm_sd) by the Weather Research and Forecasting Model Data Assimilation System (WRFDA) was performed using real data and employing a series of cycling assimilation experiments for a 2-week period, as a follow-up to an idealised post hoc assimilation experiment. The satellite-derived Atmospheric Motion Vectors (AMV) and the surface dataset in the Meteorological Assimilation Data Ingest System (MADIS) were assimilated. This new method takes into account the observation errors of both wind speed (spd) and direction (dir), and WRFDA background quality control (BKG-QC) influences the choice of wind observations, due to data conversions between (u, v) and (spd, dir). The impacts of BKG-QC, as well as of the new method, on the wind analysis were analysed separately. Because the dir observational errors produced by different platforms are not known or tuned well in WRFDA, a practical method, which uses similar assimilation weights in comparative trials, was employed to estimate the spd and dir observation errors. The asm_sd produces positive impacts on analyses and short-range forecasts of spd and dir, with smaller root-mean-square errors than the u,v-based system. The bias of the spd analysis decreases by 54.8%. These improvements result partly from BKG-QC screening of spd and dir observations in a direct way, but mainly from the independent impact of spd (dir) data assimilation on the spd (dir) analysis, which is the primary distinction from the standard WRFDA method. The potential impacts of asm_sd on precipitation forecasts were evaluated. Results demonstrate that asm_sd is able to indirectly improve precipitation forecasts by improving the prediction accuracies of key wind-related factors leading to precipitation (e.g. warm moist advection and frontogenesis).
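The (u, v) ↔ (spd, dir) conversions underlying this comparison follow the standard meteorological convention (direction is where the wind blows from, in degrees clockwise from north); a sketch of the usual formulas:

```python
import math

def uv_to_spddir(u, v):
    """Convert (u, v) wind components to (speed, meteorological direction):
    direction is the bearing the wind blows FROM, 0 deg = north."""
    spd = math.hypot(u, v)
    direction = math.degrees(math.atan2(-u, -v)) % 360.0
    return spd, direction

def spddir_to_uv(spd, direction):
    """Inverse conversion: u = -spd*sin(dir), v = -spd*cos(dir)."""
    rad = math.radians(direction)
    return -spd * math.sin(rad), -spd * math.cos(rad)
```

A pure southward wind (u, v) = (0, -10) is a 10 m/s wind from the north (dir = 0°), and a 5 m/s westerly (dir = 270°) has components (u, v) = (5, 0).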

  17. Review of direct electrical heating experiments on irradiated mixed-oxide fuel

    International Nuclear Information System (INIS)

    Fenske, G.R.; Bandyopadhyay, G.

    1982-01-01

    Results of approximately 50 out-of-reactor experiments that simulated various stages of a loss-of-flow event with irradiated fuel are presented. The tests, which utilized the direct electrical heating technique to simulate nuclear heating, were performed either on fuel segments with their original cladding intact or on fuel segments that were extruded into quartz tubes. The test results demonstrated that the macro- and microscopic fuel behavior was dependent on a number of variables including fuel heating rate, thermal history prior to a transient, the number of heating cycles, type of cladding (quartz vs stainless steel), and fuel burnup

  18. Toward a Multi-scale Phase Transition Kinetics Methodology: From Non-Equilibrium Statistical Mechanics to Hydrodynamics

    Science.gov (United States)

    Belof, Jonathan; Orlikowski, Daniel; Wu, Christine; McLaughlin, Keith

    2013-06-01

    Shock and ramp compression experiments are allowing us to probe condensed matter under extreme conditions where phase transitions and other non-equilibrium aspects can now be directly observed, but first principles simulation of kinetics remains a challenge. A multi-scale approach is presented here, with non-equilibrium statistical mechanical quantities calculated by molecular dynamics (MD) and then leveraged to inform a classical nucleation and growth kinetics model at the hydrodynamic scale. Of central interest is the free energy barrier for the formation of a critical nucleus, with direct NEMD presenting the challenge of relatively long timescales necessary to resolve nucleation. Rather than attempt to resolve the time-dependent nucleation sequence directly, the methodology derived here is built upon the non-equilibrium work theorem in order to bias the formation of a critical nucleus and thus construct the nucleation and growth rates. Having determined these kinetic terms from MD, a hydrodynamics implementation of Kolmogorov-Johnson-Mehl-Avrami (KJMA) kinetics and metastability is applied to the dynamic compressive freezing of water and compared with recent ramp compression experiments [Dolan et al., Nature (2007)]. Lawrence Livermore National Laboratory is operated by Lawrence Livermore National Security, LLC, for the U.S. Department of Energy, National Nuclear Security Administration under Contract DE-AC52-07NA27344.

  19. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, ''Classified Computer Security,'' requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  20. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, ''Classified Computer Security,'' which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  1. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  2. Residual radioactive material guidelines: Methodology and applications

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III.

    1989-01-01

    A methodology to calculate residual radioactive material guidelines was developed for the US Department of Energy (DOE). This methodology is coded in a menu-driven computer program, RESRAD, which can be run on IBM or IBM-compatible microcomputers. Seven pathways of exposure are considered: external radiation, inhalation, and ingestion of plant foods, meat, milk, aquatic foods, and water. The RESRAD code has been applied to several DOE sites to calculate soil cleanup guidelines. This experience has shown that the computer code is easy to use and very user-friendly. 3 refs., 8 figs

  3. Lessons learned on probabilistic methodology for precursor analyses

    Energy Technology Data Exchange (ETDEWEB)

    Babst, Siegfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Berlin (Germany); Wielenberg, Andreas; Gaenssmantel, Gerhard [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    Based on its experience in precursor assessment of operating experience from German NPPs and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  4. Lessons learned on probabilistic methodology for precursor analyses

    International Nuclear Information System (INIS)

    Babst, Siegfried; Wielenberg, Andreas; Gaenssmantel, Gerhard

    2016-01-01

    Based on its experience in precursor assessment of operating experience from German NPPs and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  5. Statistical characteristics of falling-film flows: A synergistic approach at the crossroads of direct numerical simulations and experiments

    Science.gov (United States)

    Charogiannis, Alexandros; Denner, Fabian; van Wachem, Berend G. M.; Kalliadasis, Serafim; Markides, Christos N.

    2017-12-01

    We scrutinize the statistical characteristics of liquid films flowing over an inclined planar surface based on film height and velocity measurements that are recovered simultaneously by application of planar laser-induced fluorescence (PLIF) and particle tracking velocimetry (PTV), respectively. Our experiments are complemented by direct numerical simulations (DNSs) of liquid films simulated for different conditions so as to expand the parameter space of our investigation. Our statistical analysis builds upon a Reynolds-like decomposition of the time-varying flow rate that was presented in our previous research effort on falling films in [Charogiannis et al., Phys. Rev. Fluids 2, 014002 (2017), 10.1103/PhysRevFluids.2.014002], and which reveals that the dimensionless ratio of the unsteady term to the mean flow rate increases linearly with the product of the coefficients of variation of the film height and bulk velocity, as well as with the ratio of the Nusselt height to the mean film height, both at the same upstream PLIF/PTV measurement location. Based on relations that are derived to describe these results, a methodology for predicting the mass-transfer capability (through the mean and standard deviation of the bulk flow speed) of these flows is developed in terms of the mean and standard deviation of the film thickness and the mean flow rate, which are considerably easier to obtain experimentally than velocity profiles. The errors associated with these predictions are estimated at ≈1.5% and 8%, respectively, in the experiments and at <1% and <2%, respectively, in the DNSs. Beyond the generation of these relations for the prediction of important film flow characteristics based on simple flow information, the data provided can be used to design improved heat- and mass-transfer equipment, reactors, or other process operation units that exploit film flows, but also to develop and validate multiphase flow models in other physical and technological settings.

  6. Story-Making as Methodology: Disrupting Dominant Stories through Multimedia Storytelling.

    Science.gov (United States)

    Rice, Carla; Mündel, Ingrid

    2018-05-01

    In this essay, we discuss multimedia story-making methodologies developed through Re•Vision: The Centre for Art and Social Justice that investigates the power of the arts, especially story, to positively influence decision makers in diverse sectors. Our story-making methodology brings together majority and minoritized creators to represent previously unattended experiences (e.g., around mind-body differences, queer sexuality, urban Indigenous identity, and Inuit cultural voice) with the aim of building understanding and shifting policies/practices that create barriers to social inclusion and justice. We analyze our ongoing efforts to rework our storytelling methodology, spotlighting acts of revising carried out by facilitators and researchers as they/we redefine methodological terms for each storytelling context, by researcher-storytellers as they/we rework material from our lives, and by receivers of the stories as we revise our assumptions about particular embodied histories and how they are defined within dominant cultural narratives and institutional structures. This methodology, we argue, contributes to the existing qualitative lexicon by providing innovative approaches not only for chronicling marginalized/misrepresented experiences and critically researching selves, but also for scaffolding intersectional alliances and for imagining more just futures. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.

  7. Interplay and Characterization of Dark Matter Searches at Colliders and in Direct Detection Experiments

    CERN Document Server

    Malik, Sarah A.; Araujo, Henrique; Belyaev, A.; Bœhm, Céline; Brooke, Jim; Buchmueller, Oliver; Davies, Gavin; De Roeck, Albert; de Vries, Kees; Dolan, Matthew J.; Ellis, John; Fairbairn, Malcolm; Flaecher, Henning; Gouskos, Loukas; Khoze, Valentin V.; Landsberg, Greg; Newbold, Dave; Papucci, Michele; Sumner, Timothy; Thomas, Marc; Worm, Steven

    2015-01-01

    In this White Paper we present and discuss a concrete proposal for the consistent interpretation of Dark Matter searches at colliders and in direct detection experiments. Based on a specific implementation of simplified models of vector and axial-vector mediator exchanges, this proposal demonstrates how the two search strategies can be compared on an equal footing.

  8. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules…

  9. Visual experience and blindsight: A methodological review

    DEFF Research Database (Denmark)

    Overgaard, Morten

    2011-01-01

    Blindsight is classically defined as residual visual capacity, e.g., to detect and identify visual stimuli, in the total absence of perceptual awareness following lesions to V1. However, whereas most experiments have investigated what blindsight patients can and cannot do, the literature contains...

  10. New well pattern optimization methodology in mature low-permeability anisotropic reservoirs

    Science.gov (United States)

    Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei

    2018-02-01

    In China, many well patterns were designed before the principal permeability direction in low-permeability anisotropic reservoirs was known. After several years of production, it turns out that the well line direction is not parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, the well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) will be distorted and reconstructed due to permeability anisotropy. In this paper, we investigate the destruction and reconstruction of the WP when the principal permeability direction and the well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm on the basis of coordinate transformation (i.e. rotating and stretching). For a mature oilfield, the large-scale WP has settled, so it is not economically viable to carry out further infill drilling. This paper circumvents this difficulty by combining the WPO algorithm with well status (open or shut-in) and schedule adjustment. Finally, this methodology is applied to an example. Cumulative oil production of the optimized WP is higher, and the water-cut is lower, which highlights the potential of the WPO methodology in mature large-scale field development projects.
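    The rotate-and-stretch coordinate transformation at the heart of such a WPO algorithm can be sketched as follows (a minimal illustration assuming a 2-D pattern and known principal permeabilities; the function name and interface are hypothetical, not the paper's implementation). Rotating the pattern into the principal axes and stretching the low-permeability direction by sqrt(k_max/k_min) maps the anisotropic reservoir onto an equivalent isotropic one, where conventional spacing rules apply before mapping back.

```python
import numpy as np

def to_isotropic(xy, theta, k_max, k_min):
    """Map well coordinates into an equivalent isotropic space.

    xy           : (N, 2) array of well coordinates
    theta        : angle of the maximum-permeability direction [rad]
    k_max, k_min : principal permeabilities
    """
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, s], [-s, c]])      # rotate the principal axis onto x
    out = xy @ rot.T
    out[:, 1] *= np.sqrt(k_max / k_min)    # stretch the low-permeability axis
    return out

# Example: a 4:1 permeability contrast with the principal axis along x
# doubles apparent distances perpendicular to the high-permeability direction.
wells = np.array([[1.0, 0.0], [0.0, 1.0]])
mapped = to_isotropic(wells, 0.0, 4.0, 1.0)
```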

  11. Consensus methodology to determine minor ailments appropriate to be directed for management within community pharmacy.

    Science.gov (United States)

    Nazar, Hamde; Nazar, Zachariah; Yeung, Andre; Maguire, Mike; Connelly, Alex; Slight, Sarah P

    2018-01-04

    National Health Service (NHS) 111, a medical helpline for urgent care used within England and Scotland, receives significant numbers of patient calls yearly for a range of clinical conditions. Some are considered high acuity and mainly directed to urgent and emergency care. Low acuity conditions are also directed to these costly, overburdened services. Community pharmacy is a recognised setting for effective low acuity condition management and could offer an alternative. The objective was to design and evaluate a new NHS111 pathway re-directing patients with low acuity conditions to community pharmacy. Two consensus development stakeholder workshops were undertaken. A "low acuity" condition was defined as one that can be clinically assessed by a community pharmacist and requires a treatment and/or advice available within a community pharmacy. Retrospective NHS111 patient data (February-August 2016) from the North East of England and access to the NHS Pathways clinical decision support software were available to stakeholders. The NHS111 data demonstrated the volume of patient calls for these conditions that could have been redirected to community pharmacy. Stakeholders reached consensus that 64 low acuity conditions could be safely redirected to community pharmacy via NHS111. This represented approximately 35,000 patients (11.5% of the total) being shifted away from the higher cost settings in the North East region alone during February-August 2016. The stakeholder group discussions provided the rationale behind their classifications of conditions to ensure patient safety, the care experience and added value. The resulting definitive list of low acuity conditions that could be directed to community pharmacy via NHS111 could result in a shift of workload from urgent and emergency care settings. Future work needs to evaluate the cost, clinical outcomes, patient satisfaction of a community pharmacy referral service that has the potential to improve integration of community pharmacy in the

  12. An Experiment on Graph Analysis Methodologies for Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brothers, Alan J.; Whitney, Paul D.; Wolf, Katherine E.; Kuchar, Olga A.; Chin, George

    2005-09-30

    Visual graph representations are increasingly used to represent, display, and explore scenarios and the structure of organizations. The graph representations of scenarios are readily understood, and commercial software is available to create and manage these representations. The purpose of the research presented in this paper is to explore whether these graph representations support quantitative assessments of the underlying scenarios. The experiment targets the underlying structure of the scenarios and the extent to which the scenarios are similar in content. An experiment was designed that incorporated both the contents of the scenarios and analysts' graph representations of the scenarios. Analysts represented the scenarios' content graphically, and the experiment attempted to use both the structure and the semantics of the graph representations to understand the content. In this experiment the structural information was not found to discriminate between the scenarios' contents, but the semantic information was.

  13. Effect of gravitational focusing on annual modulation in dark-matter direct-detection experiments.

    Science.gov (United States)

    Lee, Samuel K; Lisanti, Mariangela; Peter, Annika H G; Safdi, Benjamin R

    2014-01-10

    The scattering rate in dark-matter direct-detection experiments should modulate annually due to Earth's orbit around the Sun. The rate is typically thought to be extremized around June 1, when the relative velocity of Earth with respect to the dark-matter wind is maximal. We point out that gravitational focusing can alter this modulation phase. Unbound dark-matter particles are focused by the Sun's gravitational potential, affecting their phase-space density in the lab frame. Gravitational focusing can result in a significant overall shift in the annual-modulation phase, which is most relevant for dark matter with low scattering speeds. The induced phase shift for light O(10) GeV dark matter may also be significant, depending on the threshold energy of the experiment.
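    The modulation discussed here can be written generically as a cosine with an adjustable phase; a minimal sketch under that standard assumption (parameter values are illustrative, and the focusing-induced shift described in the abstract would enter through the phase t0):

```python
import numpy as np

def modulated_rate(t_days, r0, a, t0=152.0, period=365.25):
    """Annually modulated event rate R(t) = R0 + A*cos(2*pi*(t - t0)/T).

    Without gravitational focusing the phase t0 falls near June 1
    (day ~152); focusing by the Sun's potential shifts t0, most strongly
    for dark matter with low scattering speeds.
    """
    return r0 + a * np.cos(2 * np.pi * (t_days - t0) / period)

peak = modulated_rate(152.0, 1.0, 0.1)              # rate at the phase peak
trough = modulated_rate(152.0 + 365.25 / 2, 1.0, 0.1)
```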

  14. Direct Down-scale Experiments of Concentration Column Designs for SHINE Process

    Energy Technology Data Exchange (ETDEWEB)

    Youker, Amanda J. [Argonne National Lab. (ANL), Argonne, IL (United States); Stepinski, Dominique C. [Argonne National Lab. (ANL), Argonne, IL (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-01

    Argonne is assisting SHINE Medical Technologies in their efforts to become a domestic Mo-99 producer. The SHINE accelerator-driven process uses a uranyl-sulfate target solution for the production of fission-product Mo-99. Argonne has developed a molybdenum recovery and purification process for this target solution. The process includes an initial Mo recovery column followed by a concentration column to reduce the product volume from 15-25 L to < 1 L prior to entry into the LEU Modified Cintichem (LMC) process for purification [1]. This report discusses direct down-scale experiments of the plant-scale concentration column design, where the effects of loading velocity and temperature were investigated.

  15. Reliability evaluation methodologies for ensuring container integrity of stored transuranic (TRU) waste

    International Nuclear Information System (INIS)

    Smith, K.L.

    1995-06-01

    This report presents methodologies for producing defensible estimates of expected transuranic waste storage container lifetimes at the Radioactive Waste Management Complex. These methodologies can be used to estimate transuranic waste container reliability (with respect to integrity and degradation) and as an analytical tool to optimize waste container integrity. Container packaging and storage configurations, which directly affect waste container integrity, are also addressed. The methodologies presented provide a means for demonstrating compliance with Resource Conservation and Recovery Act waste storage requirements.

  16. Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations

    Directory of Open Access Journals (Sweden)

    Giuseppe Palmiotti

    2012-01-01

    Full Text Available Sensitivity methodologies have had a remarkable history of success when adopted in the reactor physics field. Sensitivity coefficients can be used for different objectives such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of nuclear basic parameters using integral experiments, is also described.
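    A minimal numerical sketch of the sensitivity coefficients referred to here, in their standard relative form S = (p/R) dR/dp computed by central differences (the toy response function is an assumption for illustration, not a reactor physics model):

```python
def sensitivity_coefficient(response, param, rel_step=1e-4):
    """Relative sensitivity S = (p/R) * dR/dp via central finite difference.

    response : callable R(p), the integral response of interest
    param    : nominal value of the input parameter p
    """
    dp = rel_step * param
    dR = response(param + dp) - response(param - dp)
    return (param / response(param)) * dR / (2.0 * dp)

# Toy response R(p) = p**2: the exact relative sensitivity is 2 everywhere,
# i.e. a 1% parameter change produces a 2% response change.
s = sensitivity_coefficient(lambda p: p * p, 3.0)
```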

  17. An Integrated Methodology for Emulsified Formulated Product Design

    DEFF Research Database (Denmark)

    Mattei, Michele

    The consumer oriented chemical based products are used every day by millions of people. They are structured products constituted of numerous chemicals, and many of them, especially household and personal care products, are emulsions where active ingredients, solvents, additives and surfactants are mixed together to determine the desired emulsified product. They are still mainly designed and analysed through trial-and-error based experimental techniques; therefore a systematic approach, integrating model-based as well as experiment-based techniques, for the design of these products could significantly reduce both time and cost connected to product development by doing only the necessary experiments, and ensuring chances for innovation. The main contribution of this project is the development of an integrated methodology for the design of emulsified formulated products. The methodology…

  18. Development of a super-resolution optical microscope for directional dark matter search experiment

    International Nuclear Information System (INIS)

    Alexandrov, A.; Asada, T.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; Di Crescenzo, A.; Di Marco, N.; Furuya, S.; Hakamata, K.; Ishikawa, M.; Katsuragawa, T.; Kuwabara, K.; Machii, S.; Naka, T.; Pupilli, F.; Sirignano, C.; Tawara, Y.; Tioukov, V.; Umemoto, A.; Yoshimoto, M.

    2016-01-01

    Nuclear emulsion is well suited as a detector for directional DM searches because of its high density and excellent position accuracy. The minimal detectable track length of a recoil nucleus in emulsion is required to be at least 100 nm, making the resolution of conventional optical microscopes insufficient to resolve such tracks. Here we report on the R&D on a super-resolution optical microscope to be used in future directional DM search experiments with nuclear emulsion as the detector medium. The microscope will be fully automatic, will use novel image acquisition and analysis techniques, will achieve a spatial resolution of the order of a few tens of nm, and will be capable of reconstructing recoil tracks with lengths of at least 100 nm with high angular resolution.

  19. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: The following research methods were used in writing the article: analysis and synthesis, the target-oriented method, and the monographic method. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of the strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (for sectors, regions, etc.), as its defined directions and guidelines will increase the level of investment in the country and support the national strategy "Ukraine-2020".

  20. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged to determine TRLs. Originally TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology with the aim of assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which provides a user-friendly and easily accessible way to monitor progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.
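    The questionnaire-plus-multidimensional-metric idea can be sketched as follows (dimension names, answers, and scoring are illustrative assumptions, not the authors' scheme): each dimension is scored separately from its yes/no answers and reported as a profile, so a weak dimension is not averaged away by strong ones.

```python
# Hypothetical questionnaire answers, grouped by assessment dimension.
answers = {
    "technology": [True, True, True, False],
    "manufacturing": [True, True, False, False],
    "market": [True, False, False, False],
}

# Score each dimension as the fraction of criteria met, then flag the
# weakest dimension as the one that should direct further development.
profile = {dim: sum(a) / len(a) for dim, a in answers.items()}
weakest = min(profile, key=profile.get)
```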

  1. Scrum Methodology in Higher Education: Innovation in Teaching, Learning and Assessment

    Science.gov (United States)

    Jurado-Navas, Antonio; Munoz-Luna, Rosa

    2017-01-01

    The present paper aims to detail the experience developed in a classroom of English Studies at the Spanish University of Málaga, where an alternative project-based learning methodology has been implemented. This methodology is inspired by the scrum sessions widely used in technology companies, where staff members work in teams and are assigned…

  2. ISAAC: A REXUS Student Experiment to Demonstrate an Ejection System with Predefined Direction

    Science.gov (United States)

    Balmer, G.; Berquand, A.; Company-Vallet, E.; Granberg, V.; Grigore, V.; Ivchenko, N.; Kevorkov, R.; Lundkvist, E.; Olentsenko, G.; Pacheco-Labrador, J.; Tibert, G.; Yuan, Y.

    2015-09-01

    ISAAC — Infrared Spectroscopy to Analyse the middle Atmosphere Composition — was a student experiment launched from SSC's Esrange Space Centre, Sweden, on 29th May 2014, on board the sounding rocket REXUS 15 in the frame of the REXUS/BEXUS programme. The main focus of the experiment was to implement an ejection system for two large Free Falling Units (FFUs) (240 mm x 80 mm) to be ejected from a spinning rocket into a predefined direction. The design relied on a spring-based ejection system. Sun and angular rate sensors were used to control and time the ejection. The flight data include telemetry from the Rocket Mounted Unit (RMU), received and saved during flight, as well as video footage from the GoPro camera mounted inside the RMU and recovered after the flight. The FFUs' direction, speed and spin frequency as well as the rocket spin frequency were determined by analyzing the video footage. The FFU-Rocket-Sun angles were 64.3° and 104.3°, within the required margin of 90°±45°. The FFU speeds were 3.98 m/s and 3.74 m/s, lower than the expected 5±1 m/s. The FFUs' spin frequencies were 1.38 Hz and 1.60 Hz, approximately half the rocket's spin frequency. The rocket spin rate changed slightly from 3.163 Hz before the ejection to 3.117 Hz after the ejection of the two FFUs. The angular rate, sun sensor data and temperature on the inside of the rocket module skin were also recorded. The experiment design and results of the data analysis are presented in this paper.

  3. Utility of radiotracer methodology in scientific research of industrial relevancy

    International Nuclear Information System (INIS)

    Kolar, Z.I.

    1990-01-01

    Utilization of radiotracer methodology in industrial research provides substantial scientific, rather than directly demonstrable economic, benefits. These benefits include a better understanding of industrial processes and, subsequently, the development of new ones. Examples are given of the use of radiotracers in technological studies, and the significance of the results obtained is highlighted. Creative application of radiotracer methodology may contribute to the economic development and technological advancement of all countries, including the developing ones. (orig.) [de]

  4. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    …to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing the plant along functional lines is that of chemical unit operations and transport processes, plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly…
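    The step from node functions to deviation variables can be sketched as a simple cross product of each node's process variables with the standard HAZOP guide words (a minimal illustration; the per-function variable lists are assumptions, not the paper's tables):

```python
# Standard HAZOP guide words crossed with the process variables implied by
# each node's function to enumerate candidate deviations.
guide_words = ["no", "more", "less", "reverse"]
node_variables = {
    "liquid transport": ["flow", "pressure"],
    "liquid storage": ["level", "temperature"],
}

deviations = [
    (node, f"{gw} {var}")
    for node, variables in node_variables.items()
    for var in variables
    for gw in guide_words
]
```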

  5. Direction of the methodological work in the Technical and Professional Education: the department bosses' preparation

    Directory of Open Access Journals (Sweden)

    Rolando Rodríguez Delgado

    2017-09-01

    Full Text Available The present work deals with the process of preparing the heads of teaching departments to direct methodological work in Technical and Professional Education, with the purpose of improving their professional performance with the collective of teachers and specialists in the continuous training of the worker. The results of the diagnosis made of this process revealed shortcomings: the process is unsystematic and poorly contextualized, which prevents a positive impact on the training of subordinates and of students joining the world of work. For the development of the work we used systemic-structural methods, analysis and synthesis, induction and deduction, surveys, interviews, observation, and statistical techniques such as percentage calculation and frequency and contingency tables, which allowed us to deepen the study of the object and to elaborate a preparation strategy for the heads of department as the main result presented in this work. In order to verify the practical effectiveness of the proposed strategy, 12 heads of teaching departments of the polytechnic centers of the municipality of La Palma were studied during the academic year 2014-2015, together with 5 managers, 69 teachers and 18 specialists in production and services. The evaluation of the results shows advances in the preparation of the heads of departments and in their professional performance with teachers and specialists, in line with the current requirements of the continuing technical and professional education of the worker in training.

  6. Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges

    Science.gov (United States)

    Huang, Bo-Cin; Chan, Hui-Ju; Hong, Jian-Wei; Lo, Cheng-Yao

    2016-06-01

    A methodology for quantifying and qualifying pattern transfer completeness in inkjet printing through examining both pattern dimensions and pattern contour deviations from reference design is proposed, which enables scientifically identifying and evaluating inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges of bulging, necking, and unpredictable distortions resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also indicates the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and is effective for both optical and electrical microscopy in direct and indirect lithography or lithography-free patterning.
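    One plausible way to make such an evaluation quantitative is sketched below (synthetic edge data; the two metrics and their names are assumptions for illustration, not the paper's definitions): a dimensional error captures systematic bulging or necking of a printed circle, while the mean contour deviation captures edge irregularity relative to the reference design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference circle and a synthetic "measured" printed edge with irregularity.
r_ref = 100.0                                     # reference radius [um]
phi = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)
r_meas = r_ref + rng.normal(0.0, 2.0, phi.size)   # irregular printed edge [um]

# Dimensional error: positive for bulging, negative for necking.
dimension_error = r_meas.mean() - r_ref
# Contour deviation: mean absolute departure of the edge from the design.
contour_deviation = np.abs(r_meas - r_ref).mean()
```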

  7. Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges

    International Nuclear Information System (INIS)

    Huang, Bo-Cin; Chan, Hui-Ju; Lo, Cheng-Yao; Hong, Jian-Wei

    2016-01-01

    A methodology for quantifying and qualifying pattern transfer completeness in inkjet printing through examining both pattern dimensions and pattern contour deviations from reference design is proposed, which enables scientifically identifying and evaluating inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges of bulging, necking, and unpredictable distortions resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also indicates the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and is effective for both optical and electrical microscopy in direct and indirect lithography or lithography-free patterning. (paper)

  8. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties due to significantly negative biases, which are quite large. The XSCU methodology, however, gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.
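    The difference between a deterministic and a statistical combination of setpoint uncertainties can be illustrated as follows (hypothetical penalty values, not the plant figures from this report, and the report's MSCU/XSCU treatments differ in detail): a deterministic treatment sums individual penalties linearly, while an SCU-type treatment root-sum-squares the independent random terms and carries any bias separately.

```python
import numpy as np

penalties = np.array([0.02, 0.015, 0.03, 0.01])   # independent DNBR penalties
bias = -0.005                                     # net bias term

# Deterministic: linear sum of all penalties (bias taken at its magnitude).
deterministic = penalties.sum() + abs(bias)

# Statistical: root-sum-square of independent terms, bias added algebraically.
statistical = np.sqrt((penalties ** 2).sum()) + bias
```

The statistical combination recovers margin because independent penalties are unlikely to all reach their extremes simultaneously.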

  9. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid measures for comparing the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and the probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.

  10. Gamma ray auto absorption correction evaluation methodology

    International Nuclear Information System (INIS)

    Gugiu, Daniela; Roth, Csaba; Ghinescu, Alecse

    2010-01-01

    Neutron activation analysis (NAA) is a well established nuclear technique, suited to investigating microstructural or elemental composition, and can be applied to studies of a large variety of samples. Work with large samples involves, besides the development of large irradiation devices with well-known neutron field characteristics, knowledge of perturbing phenomena and adequate evaluation of correction factors such as neutron self-shielding, extended-source corrections, and gamma ray auto absorption. The objective of the work presented in this paper is to validate an appropriate methodology for evaluating gamma ray auto absorption corrections for large inhomogeneous samples. For this purpose a benchmark experiment has been defined: a simple gamma ray transmission experiment, easy to reproduce. The gamma ray attenuation in pottery samples has been measured and computed using the MCNP5 code. The results show a good agreement between the computed and measured values, proving that the proposed methodology is able to evaluate the correction factors. (authors)
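    The physics behind such a transmission benchmark is the Beer-Lambert law; a minimal sketch (the attenuation coefficient and thickness values are illustrative assumptions, not measured pottery values):

```python
from math import exp

def transmission(mu, thickness):
    """Fraction of gammas transmitted through a slab (Beer-Lambert law).

    mu        : linear attenuation coefficient [1/cm]
    thickness : slab thickness [cm]
    """
    return exp(-mu * thickness)

def self_absorption_correction(mu, thickness):
    """Mean self-absorption factor for a uniformly emitting slab.

    Averaging exp(-mu*x) over emission depths x in [0, t] gives
    (1 - exp(-mu*t)) / (mu*t): the correction for gammas produced
    throughout the sample rather than transmitted from behind it.
    """
    mt = mu * thickness
    return (1.0 - exp(-mt)) / mt

# Illustrative numbers: mu = 0.2 /cm through a 1 cm slab.
tr = transmission(0.2, 1.0)
corr = self_absorption_correction(0.2, 1.0)
```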

  11. One common way - The strategic and methodological influence on environmental planning across Europe

    International Nuclear Information System (INIS)

    Jiricka, Alexandra; Proebstl, Ulrike

    2009-01-01

    In the last decades the European Union has exerted influence on precautionary environmental planning through the establishment of several Directives. The most relevant are the Habitats Directive, the EIA Directive, the SEA Directive and the Water Framework Directive. Comparing these EU policies in the area of environmental precaution, it becomes obvious that there is a lot of common ground. Thus it seems likely that the European Union intended to establish general planning concepts by introducing several methodological steps indicated by the regulations. The goal of this article is firstly to point out the common planning principles embodied in these methodological elements, and secondly to examine how these planning concepts are considered in the implementation and application in the member states. In this context it is analysed whether the connections and divergences between the directives lead to significant differences in the implementation process. To this end the directives are briefly introduced and the significant steps of the processes they regulate are outlined. In a second step the national legal implementation in the Alpine states and its consequences for practical application are discussed. The results show a heterogeneous application of the EU principles. Within the comparative view of the four directives, influences and causalities between the national implementation and the practical application were identified, which can be simplified into four types. Since a coherent strategic and methodological concept for improving environmental precautionary planning on the part of the EU is noticeable, more unity and comparability within the implementation is desirable, particularly in areas with comparable habitats such as the Alpine space. Beyond this, the trade-off between the directives poses an important task for the future.

  12. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  13. Directional spectrum of ocean waves

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A; Gouveia, A; Nagarajan, R.

    This paper describes a methodology for obtaining the directional spectrum of ocean waves from time series measurement of wave elevation at several gauges arranged in linear or polygonal arrays. Results of simulated studies using sinusoidal wave...

  14. The phase-space structure of a dark-matter halo: Implications for dark-matter direct detection experiments

    International Nuclear Information System (INIS)

    Helmi, Amina; White, Simon D.M.; Springel, Volker

    2002-01-01

    We study the phase-space structure of a dark-matter halo formed in a high-resolution simulation of a ΛCDM cosmology. Our goal is to quantify how much substructure is left over from the inhomogeneous growth of the halo, and how it may affect the signal in experiments aimed at detecting the dark matter particles directly. If we focus on the equivalent of 'solar vicinity', we find that the dark matter is smoothly distributed in space. The probability of detecting particles bound within dense lumps of individual mass less than 10^7 M⊙ h^-1 is small, less than 10^-2. The velocity ellipsoid in the solar neighborhood deviates only slightly from a multivariate Gaussian, and can be thought of as a superposition of thousands of kinematically cold streams. The motions of the most energetic particles are, however, strongly clumped and highly anisotropic. We conclude that experiments may safely assume a smooth multivariate Gaussian distribution to represent the kinematics of dark-matter particles in the solar neighborhood. Experiments sensitive to the direction of motion of the incident particles could exploit the expected anisotropy to learn about the recent merging history of our Galaxy
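
    The smooth-halo approximation the authors recommend can be sketched as a draw from an anisotropic multivariate Gaussian. The dispersion values below are illustrative placeholders, not numbers from the simulation:

```python
import random

def sample_halo_velocity(dispersions=(150.0, 120.0, 100.0)):
    """Draw one dark-matter particle velocity (km/s) along the principal
    axes of the velocity ellipsoid, modeled as independent Gaussians with
    zero mean - the smooth multivariate-Gaussian approximation.
    The dispersion values are hypothetical, for illustration only."""
    return tuple(random.gauss(0.0, s) for s in dispersions)

random.seed(0)
sample = sample_halo_velocity()  # one (vx, vy, vz) draw
```

A direction-sensitive experiment would replace the zero means with the stream or solar-motion offsets it wants to probe.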

  15. Acquisition procedures, processing methodologies and preliminary results of magnetic and ROV data collected during the TOMO-ETNA experiment

    Directory of Open Access Journals (Sweden)

    Danilo Cavallaro

    2016-09-01

    Full Text Available The TOMO-ETNA experiment was devised to investigate the continental and oceanic crust beneath Mt. Etna volcano and northeastern Sicily up to the Aeolian Islands through an active-source study. In this experiment, a large amount of geophysical data was collected both inland and in the Ionian and Tyrrhenian Seas to identify the major geological and structural features offshore Mt. Etna and NE Sicily. One of the oceanographic cruises organized within the TOMO-ETNA experiment was carried out on the hydrographic vessel “Galatea” of the Italian Navy. During the cruise a detailed magnetic survey and a set of ROV (remotely operated vehicle) dives were performed offshore Mt. Etna. The magnetic survey allowed the compilation of a preliminary magnetic map revealing a clear direct relationship between volcanic structures and high-frequency magnetic anomalies. Significant positive magnetic anomalies were identified offshore the Timpa area and along the easternmost portion of the Riposto Ridge, and were correlated to a primitive volcanic edifice and to shallow volcanic bodies, respectively. On the whole, the magnetic anomaly map highlights a clear SW-NE decreasing trend, in which the high-amplitude positive magnetic anomaly pattern of the SW sector gives way, northeastward, to a mainly negative one. The ROV dives made it possible to directly explore the shallowest sectors of the Riposto Ridge and to collect several videos and seafloor samples, allowing us to identify some locally developed volcanic manifestations.

  16. Experiments with a methodology to model the role of R and D expenditures in energy technology learning processes; first results

    International Nuclear Information System (INIS)

    Miketa, Asami; Schrattenholzer, Leo

    2004-01-01

    This paper presents the results of using a stylized optimization model of the global electricity supply system to analyze the optimal research and development (R and D) support for an energy technology. The model takes into account the dynamics of technological progress as described by a so-called two-factor learning curve (2FLC). The two factors are cumulative experience ('learning by doing') and accumulated knowledge ('learning by searching'); the formulation is a straightforward expansion of conventional one-factor learning curves, in which only cumulative experience is included as a factor, which aggregates the effects of accumulated knowledge and cumulative experience, among others. The responsiveness of technological progress to the two factors is quantified using learning parameters, which are estimated using empirical data. Sensitivities of the model results to the parameters are also tested. The model results also address the effect of competition between technologies and of CO2 constraints. The results are mainly methodological; one of the most interesting is that, at least up to a point, competition between technologies - in terms of both market share and R and D support - need not lead to 'lock-in' or 'crowding-out'
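
    The two-factor learning curve described above can be sketched numerically. The power-law form and the idea of a learning rate per doubling are standard; the elasticities and baseline cost below are invented placeholders, not the paper's estimated parameters:

```python
def two_factor_learning_curve(c0, cum_capacity, knowledge_stock, b, c):
    """Specific cost under a two-factor learning curve (2FLC):
    cost = c0 * cum_capacity**(-b) * knowledge_stock**(-c),
    where b is the learning-by-doing elasticity (cumulative experience)
    and c the learning-by-searching elasticity (accumulated R&D knowledge).
    All parameter values used below are illustrative assumptions."""
    return c0 * cum_capacity ** (-b) * knowledge_stock ** (-c)

b, c = 0.20, 0.10   # hypothetical elasticities
c0 = 1000.0         # cost at unit experience and unit knowledge stock

cost_early = two_factor_learning_curve(c0, 1.0, 1.0, b, c)
# Three doublings of capacity and two doublings of knowledge later:
cost_late = two_factor_learning_curve(c0, 8.0, 4.0, b, c)
```

The one-factor curve is recovered by setting c = 0, which folds the knowledge effect into the experience term.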

  17. Experiments with a methodology to model the role of R and D expenditures in energy technology learning processes: first results

    International Nuclear Information System (INIS)

    Miketa, A.; Schrattenholzer, L.

    2004-01-01

    This paper presents the results of using a stylized optimization model of the global electricity supply system to analyze the optimal research and development (R and D) support for an energy technology. The model takes into account the dynamics of technological progress as described by a so-called two-factor learning curve (2FLC). The two factors are cumulative experience ('learning by doing') and accumulated knowledge ('learning by searching'); the formulation is a straightforward expansion of conventional one-factor learning curves, in which only cumulative experience is included as a factor, which aggregates the effects of accumulated knowledge and cumulative experience, among others. The responsiveness of technological progress to the two factors is quantified using learning parameters, which are estimated using empirical data. Sensitivities of the model results to the parameters are also tested. The model results also address the effect of competition between technologies and of CO2 constraints. The results are mainly methodological; one of the most interesting is that, at least up to a point, competition between technologies - in terms of both market share and R and D support - need not lead to 'lock-in' or 'crowding-out'. (author)

  18. Application of precursor methodology in initiating frequency estimates

    International Nuclear Information System (INIS)

    Kohut, P.; Fitzpatrick, R.G.

    1991-01-01

    The precursor methodology developed in recent years provides a consistent technique to identify important accident sequence precursors. It relies on operational events (extracting information from actual experience) and infers core damage scenarios based on expected safety system responses. The ranking or categorization of each precursor is determined by considering the full spectrum of potential core damage sequences. The methodology estimates the frequency of severe core damage based on the approach suggested by Apostolakis and Mosleh, which may lead to a potential overestimation of the severe-accident sequence frequency due to the inherent dependencies between the safety systems and the initiating events. The methodology is an encompassing attempt to incorporate most of the operating information available from nuclear power plants and is an attractive tool from the point of view of risk management. In this paper, a further extension of this methodology is discussed with regard to the treatment of initiating frequency of the accident sequences

  19. Peaked signals from dark matter velocity structures in direct detection experiments

    Science.gov (United States)

    Lang, Rafael F.; Weiner, Neal

    2010-06-01

    In direct dark matter detection experiments, conventional elastic scattering of WIMPs results in exponentially falling recoil spectra. In contrast, theories of WIMPs with excited states can lead to nuclear recoil spectra that peak at finite recoil energies ER. The peaks of such signals are typically fairly broad, with ΔER/Epeak ~ 1. We show that in the presence of dark matter structures with low velocity dispersion, such as streams or clumps, peaks from up-scattering can become extremely narrow with FWHM of a few keV only. This differs dramatically from the conventionally expected WIMP spectrum and would, once detected, open the possibility to measure the dark matter velocity structure with high accuracy. As an intriguing example, we confront the observed cluster of 3 events near 42 keV from the CRESST commissioning run with this scenario. Inelastic dark matter particles with a wide range of parameters are capable of producing such a narrow peak. We calculate the possible signals at other experiments, and find that such particles could also give rise to the signal at DAMA, although not from the same stream. Over some range of parameters, a signal would be visible at xenon experiments. We show that such dark matter peaks are a very clear signal and can be easily disentangled from potential backgrounds, both terrestrial or due to WIMP down-scattering, by an enhanced annual modulation in both the amplitude of the signal and its spectral shape.
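
    The contrast between the two spectral shapes discussed above can be sketched numerically. The shapes are illustrative stand-ins for the full kinematic calculation: an exponential for elastic scattering, and a Gaussian for the stream-induced peak, with the 42 keV position echoing the CRESST cluster mentioned in the abstract:

```python
import math

def elastic_rate(e_r, e0=20.0):
    """Conventional elastic WIMP scattering: exponentially falling recoil
    spectrum, dR/dE ~ exp(-E_R / E0). The scale E0 (keV) is a placeholder."""
    return math.exp(-e_r / e0)

def stream_peak_rate(e_r, e_peak=42.0, fwhm=3.0):
    """Up-scattering of inelastic dark matter off a cold stream: a narrow
    peak at finite recoil energy, modeled here as a Gaussian with a
    few-keV FWHM (a simplified shape, not the full calculation)."""
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-0.5 * ((e_r - e_peak) / sigma) ** 2)

# The elastic spectrum falls monotonically with recoil energy, while the
# stream signal is negligible everywhere except a few keV around E_peak.
peak_height = stream_peak_rate(42.0)
```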

  20. Peaked signals from dark matter velocity structures in direct detection experiments

    International Nuclear Information System (INIS)

    Lang, Rafael F.; Weiner, Neal

    2010-01-01

    In direct dark matter detection experiments, conventional elastic scattering of WIMPs results in exponentially falling recoil spectra. In contrast, theories of WIMPs with excited states can lead to nuclear recoil spectra that peak at finite recoil energies ER. The peaks of such signals are typically fairly broad, with ΔER/Epeak ∼ 1. We show that in the presence of dark matter structures with low velocity dispersion, such as streams or clumps, peaks from up-scattering can become extremely narrow with FWHM of a few keV only. This differs dramatically from the conventionally expected WIMP spectrum and would, once detected, open the possibility to measure the dark matter velocity structure with high accuracy. As an intriguing example, we confront the observed cluster of 3 events near 42 keV from the CRESST commissioning run with this scenario. Inelastic dark matter particles with a wide range of parameters are capable of producing such a narrow peak. We calculate the possible signals at other experiments, and find that such particles could also give rise to the signal at DAMA, although not from the same stream. Over some range of parameters, a signal would be visible at xenon experiments. We show that such dark matter peaks are a very clear signal and can be easily disentangled from potential backgrounds, both terrestrial or due to WIMP down-scattering, by an enhanced annual modulation in both the amplitude of the signal and its spectral shape

  1. Design of formulated products: a systematic methodology

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Ng, K.M.

    2011-01-01

    /or verifies a specified set through a sequence of predefined activities (work-flow). Stage-2 and stage-3 (not presented here) deal with the planning and execution of experiments, for product validation. Four case studies have been developed to test the methodology. The computer-aided design (stage-1...

  2. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering the effect metrics of a component provides the hierarchy of processes in that component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
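
    The effect metric Ω = ωt defined above lends itself to a compact numerical sketch. The process names, transport rates and contents below are invented for illustration only:

```python
def effect_metric(transport_rate, content, t_char):
    """FSM effect metric Omega = omega * t, where omega is the specific
    rate of fractional change: the process transport rate divided by the
    content of the preserved quantity in the component."""
    omega = transport_rate / content
    return omega * t_char

# Hypothetical processes acting on one component over t_char = 10 s,
# with a preserved-quantity content of 1000 units.
processes = {
    "break flow":    effect_metric(50.0, 1000.0, 10.0),
    "ECC injection": effect_metric(30.0, 1000.0, 10.0),
    "wall heat":     effect_metric(5.0, 1000.0, 10.0),
}

# Ordering the effect metrics ranks the processes from dominant to minor.
ranking = sorted(processes, key=processes.get, reverse=True)
```

The same ordering, applied across all components, yields the system-level hierarchy the abstract describes.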

  3. Theoretical and Methodological Foundations of Reverse Inclusion: The Experience of Moscow State University of Humanities and Economics

    Directory of Open Access Journals (Sweden)

    Bairamov V.D.,

    2017-08-01

    Full Text Available The article substantiates the model of “reverse inclusion” in the interconnection of sociostructural, sociocultural and spatial aspects. In addition to these aspects, the paper describes the socio-legal and socio-pedagogical foundations of the model. Along with the key category of inclusion the following categories are revealed: “disability”, “disabled person”, “social barrier”, “inclusive social strategy”, and “inclusive strategy in education”. “Reverse inclusion” is opposed to the dominant model of direct inclusion. Due to the fact that the article is of a theoretical and methodological nature, factual data play an illustrative role. The empirical base is represented by secondary data, as well as by some references to the authors’ research of 2016 conducted by the staff of the research laboratory of the Moscow State University of Humanities and Economics for purposes of vocational guidance; in this research a series of 27 in-depth interviews were carried out with students with musculoskeletal disorders studying at MSUHE.

  4. Comparing a simple methodology to evaluate hydrodynamic parameters with rainfall simulation experiments

    Science.gov (United States)

    Di Prima, Simone; Bagarello, Vincenzo; Bautista, Inmaculada; Burguet, Maria; Cerdà, Artemi; Iovino, Massimo; Prosdocimi, Massimo

    2016-04-01

    Studying soil hydraulic properties is necessary for interpreting and simulating many hydrological processes of environmental and economic importance, such as the partition of rainfall into infiltration and runoff. The saturated hydraulic conductivity, Ks, exerts a dominating influence on the partitioning of rainfall into vertical and lateral flow paths. Therefore, estimates of Ks are essential for describing and modeling hydrological processes (Zimmermann et al., 2013). According to several investigations, Ks data collected by ponded infiltration tests could be expected to be unusable for interpreting field hydrological processes, and particularly infiltration. In fact, infiltration measured by ponding gives us information about the soil's maximum or potential infiltration rate (Cerdà, 1996). Moreover, especially for the hydrodynamic parameters, many replicated measurements have to be carried out to characterize an area of interest, since these parameters are known to vary widely in both space and time (Logsdon and Jaynes, 1996; Prieksat et al., 1994). Therefore, the technique to be applied at the near-point scale should be simple and rapid. Bagarello et al. (2014) and Alagna et al. (2015) suggested that the Ks values determined by an infiltration experiment carried out by applying water at a relatively large distance from the soil surface could be more appropriate than those obtained with a low height of water pouring for explaining surface-runoff generation phenomena during intense rainfall events. These authors used the Beerkan Estimation of Soil Transfer parameters (BEST) procedure for complete soil hydraulic characterization (Lassabatère et al., 2006) to analyze the field infiltration experiment. This methodology, combining low and high heights of water pouring, seems appropriate for testing the effect of intense and prolonged rainfall events on the hydraulic characteristics of the surface soil layer. In fact, an intense and prolonged rainfall event has a perturbing effect on the soil surface

  5. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  6. Methodological foundations of the modern training system of skilled handballers

    Directory of Open Access Journals (Sweden)

    V.A. Tyshchenko

    2014-01-01

    Full Text Available Purpose: to consider the directions of training of handball teams in the annual macrocycles of the Ukrainian Superleague game seasons of 2006-2013. Material: 125 highly qualified handball players took part in the experiment. An analysis of more than 50 references on the multi-year training of athletes was conducted. Results: the advisability of constructing the training process of qualified handball players on the basis of the structural components of preparation was confirmed. In accordance with the requirements of the systems approach, the technology of preparation is presented and the methodology for managing the training process over long-term training is disclosed. Conclusions: it is necessary to compile and optimize long-term training programs for qualified handball players; to raise the level of preparedness of the various parties in strict accordance with the objective laws of the formation of their constituents and the calendar of events; and to consider the specific features of the occurrence of adaptive reactions in improving the various components of sportsmanship.

  7. Direct CP violation results in $K^{\\pm} \\rightarrow 3\\pi^{\\pm}$ decays from NA48/2 experiment at CERN

    CERN Document Server

    Biino, Cristina

    2006-01-01

    After firmly establishing direct CP violation in two-pion decays of neutral kaons, the NA48 experiment, during the 2003 run at the CERN SPS, collected more than 1.6 billion charged-kaon decays into three charged pions, using a unique double-beam technique which allows a high level of control over systematic effects. The measurement of the direct CP violation Dalitz-plot linear slope asymmetry parameter A$_{g}$ is reported. This result corresponds to more than an order of magnitude improvement in precision with respect to previous experiments and is limited by the statistics of the data sample.

  8. Vibration-Driven Microrobot Positioning Methodologies for Nonholonomic Constraint Compensation

    Directory of Open Access Journals (Sweden)

    Kostas Vlachos

    2015-03-01

    Full Text Available This paper presents the formulation and practical implementation of positioning methodologies that compensate for the nonholonomic constraints of a mobile microrobot that is driven by two vibrating direct current (DC micromotors. The open-loop and closed-loop approaches described here add the capability for net sidewise displacements of the microrobotic platform. A displacement is achieved by the execution of a number of repeating steps that depend on the desired displacement, the speed of the micromotors, and the elapsed time. Simulation and experimental results verified the performance of the proposed methodologies.

  9. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available Creating target software as a component system has been a very strong requirement in software development for the last 20 years. Architectural components are self-contained units that present not only partial and overall system behavior but also cooperate with each other on the basis of their interfaces. Among other things, components allow flexible modification of the processes on which component behavior is founded, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus to create modified system behaviors. All this enables company management to perform, at design time, the required behavioral changes of processes in accordance with the requirements of changing production and markets. Software development, generally referred to as SDP (Software Development Process), contains two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems (CBS, Component-Based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development in its object-oriented methodologies. The requirement today is to carry out the development of component-based systems within developed object-oriented methodologies precisely as a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  10. Learning to Support Learning Together: An Experience with the Soft Systems Methodology

    Science.gov (United States)

    Sanchez, Adolfo; Mejia, Andres

    2008-01-01

    An action research approach called soft systems methodology (SSM) was used to foster organisational learning in a school regarding the role of the learning support department within the school and its relation with the normal teaching-learning activities. From an initial situation of lack of coordination as well as mutual misunderstanding and…

  11. Direct block scheduling technology: Analysis of Avidity

    Directory of Open Access Journals (Sweden)

    Felipe Ribeiro Souza

    Full Text Available Abstract This study is focused on testing the Direct Block Scheduling (Direct Multi-Period Scheduling) methodology, which schedules mine production considering the correct discount factor of each mining block, resulting in the final pit. Each block is analyzed individually in order to define the best target period. This methodology presents an improvement over the classical methodology derived from Lerchs-Grossmann's initial proposition as improved by Whittle. This paper presents the differences between these methodologies, with a special focus on the algorithms' avidity. Avidity is classically defined by voracious search algorithms; some of the most famous greedy algorithms are Branch and Bound, Brute Force and Randomized. Strategies based on heuristics can accentuate the voracity of the optimizer system. The applied algorithm uses simulated annealing combined with Tabu Search. The most avid algorithm can select the most profitable blocks in early periods, leading to a higher present value in the first periods of mine operation. The application of discount factors to blocks on the Lerchs-Grossmann final pit has an effect that becomes accentuated with time, and this effect may make blocks scheduled for the end of the mine life unfeasible, representing a trend toward a decrease in reported reserves.
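
    The per-block discounting that drives this comparison can be sketched as follows; the block profits and the discount rate are invented for illustration:

```python
def discounted_value(profit, period, rate=0.10):
    """Contribution of a block mined in the given period, discounted back
    to present value at a per-period rate (rate is a placeholder)."""
    return profit / (1.0 + rate) ** period

# Hypothetical undiscounted block profits, one block mined per period.
profits = [120.0, 80.0, 40.0]
npv_early = sum(discounted_value(p, t) for t, p in enumerate(profits))

# The same blocks postponed by five periods contribute far less, which is
# why late-scheduled marginal blocks can become unfeasible and reported
# reserves can shrink.
npv_late = sum(discounted_value(p, t + 5) for t, p in enumerate(profits))
```

An avid scheduler exploits exactly this asymmetry: pulling the most profitable blocks into the earliest feasible periods maximizes their discounted contribution.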

  12. Mikhail Geraskov (1874-1957 Methodological Concepts of Learning Physics.

    Directory of Open Access Journals (Sweden)

    Mariyana Ilieva

    2014-02-01

    Full Text Available Mikhail Geraskov is a distinguished Bulgarian educator from the first half of the twentieth century who developed the scientific foundations of didactics and the methodology of training. His work contributed greatly to the development of Bulgarian pedagogy. The subject of this scientific research is his didactical and methodological conceptions of learning. The aim of the paper is to present his ideas about particular methods of teaching Physics in high school. Geraskov assumes a direct correlation between didactics and methodology. This paper focuses on his ideas about the design, technology and methodological requirements of Physics lessons. He believes that the appropriate methods are determined by the curriculum, the set of educational goals, and the age characteristics and capabilities of adolescents. In his methodical recommendations he focuses on teaching methods and forms that provoke students’ activity. A comparative analysis is made with publications on these issues in the development of Bulgarian pedagogical science and their relevance to the modern education system.

  13. Case Study Methodology and Homelessness Research

    Directory of Open Access Journals (Sweden)

    Jill Pable

    2013-10-01

    Full Text Available This paper describes the potential suitability of case study methodology for inquiry with the homeless population. It references a research study that uses case study research method to build theory. This study's topic is the lived experience of destitute individuals who reside in homeless shelters, and explores the homeless shelter built environment's potential influence on resident satisfaction and recovery. Case study methodology may be appropriate because it explores real-life contextual issues that characterize homelessness and can also accommodate the wide range of homeless person demographics that make this group difficult to study in a generalized fashion. Further, case study method accommodates the need within research in this area to understand individualized treatments as a potential solution for homelessness.

  14. First results from the NEWS-G direct dark matter search experiment at the LSM

    Science.gov (United States)

    Arnaud, Q.; Asner, D.; Bard, J.-P.; Brossard, A.; Cai, B.; Chapellier, M.; Clark, M.; Corcoran, E. C.; Dandl, T.; Dastgheibi-Fard, A.; Dering, K.; Di Stefano, P.; Durnford, D.; Gerbier, G.; Giomataris, I.; Gorel, P.; Gros, M.; Guillaudin, O.; Hoppe, E. W.; Kamaha, A.; Katsioulas, I.; Kelly, D. G.; Martin, R. D.; McDonald, J.; Muraz, J.-F.; Mols, J.-P.; Navick, X.-F.; Papaevangelou, T.; Piquemal, F.; Roth, S.; Santos, D.; Savvidis, I.; Ulrich, A.; Vazquez de Sola Fernandez, F.; Zampaolo, M.

    2018-01-01

    New Experiments With Spheres-Gas (NEWS-G) is a direct dark matter detection experiment using Spherical Proportional Counters (SPCs) with light noble gases to search for low-mass Weakly Interacting Massive Particles (WIMPs). We report the results from the first physics run taken at the Laboratoire Souterrain de Modane (LSM) with SEDINE, a 60 cm diameter prototype SPC operated with a mixture of Ne + CH4 (0.7%) at 3.1 bars for a total exposure of 9.6 kg · days. New constraints are set on the spin-independent WIMP-nucleon scattering cross-section in the sub-GeV/c2 mass region. We exclude cross-sections above 4.4 ×10-37cm2 at 90% confidence level (C.L.) for a 0.5 GeV/c2 WIMP. The competitive results obtained with SEDINE are promising for the next phase of the NEWS-G experiment: a 140 cm diameter SPC to be installed at SNOLAB by summer 2018.
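
    The way a counting experiment turns an exposure into a cross-section limit can be sketched with a classical Poisson upper limit. This is a generic textbook construction, not the collaboration's actual statistical treatment:

```python
import math

def poisson_cdf(n_obs, mu):
    """P(N <= n_obs) for a Poisson distribution with mean mu."""
    return sum(mu**k * math.exp(-mu) / math.factorial(k)
               for k in range(n_obs + 1))

def poisson_upper_limit(n_obs, cl=0.90):
    """Classical upper limit mu_up solving P(N <= n_obs | mu_up) = 1 - cl,
    found by bisection on the Poisson CDF."""
    lo, hi = 0.0, 100.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return lo

# With zero observed events the 90% C.L. limit is ln(10) ~ 2.30 expected
# events; the cross-section limit then scales as mu_up / (efficiency * exposure).
mu_up = poisson_upper_limit(0)
```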

  15. Production of direct drive cylindrical targets for inertial confinement fusion experiments

    International Nuclear Information System (INIS)

    Elliott, N.E.; Day, R.D.; Hatch, D.J.; Sandoval, D.L.; Gomez, V.M.; Pierce, T.H.; Elliott, J.E.; Manzanares, R.

    2002-01-01

    We have made targets with cylindrical geometry for Inertial Confinement Fusion (ICF) experiments. These targets are used in hydrodynamic experiments on the OMEGA laser at the University of Rochester. The cylindrical design allows the study of three-dimensional hydrodynamic effects in a pseudo-2D mode, simplifying data gathering and analysis. Direct drive refers to the fact that the target is illuminated directly by approximately 50 laser beams and is imploded by the material pressure generated from ablation of the outside of the target. The production of cylindrical targets involves numerous steps. These steps are shared in common with many other types of ICF targets, but no other single target type encompasses such a wide range of fabrication techniques. These targets consist of a large number of individual parts, all fabricated from commercially purchased raw material, requiring many machining, assembly, electroplating and chemical process steps. Virtually every manufacturing and assembly process we currently possess is involved in the production of these targets. The generic target consists of a plastic cylinder (ablator) that is roughly 1 mm in diameter by 2.25 mm long. The wall of the cylinder is roughly 0.07 mm thick. There is an aluminum cylinder 0.5 mm wide and 0.01 mm thick centered on the inside of the plastic cylinder and coaxial with the outer plastic cylinder. The outside of this aluminum band has surface finishes of differing random average roughness. The required average surface roughness is determined in advance by experimental design, based on the amount of turbulent mix to be observed. The interior of the cylinder is filled with low-density polystyrene foam that is made in house. To produce a finished target, additional features are added to each target. X-ray backlighters are cantilevered off the target, allowing time-resolved x-ray images of the imploding target to be recorded during the experiment. The x-ray backlighters are driven by additional

  16. METHODOLOGICAL PROPOSAL FOR COMPILING THE ILO UNEMPLOYMENT WITH MONTHLY PERIODICITY

    Directory of Open Access Journals (Sweden)

    Silvia PISICĂ

    2011-08-01

    Full Text Available Development of a methodology for deriving monthly unemployment statistics directly from the quarterly Labour Force Survey (LFS) results by econometric modeling meets the requirement of ensuring the short-term information needed for employment policies aimed at achieving the objectives of Europe 2020. Monthly data series estimated according to the methodology allow assessment of short-term trends in unemployment measured according to the criteria of the International Labour Organisation (ILO), in terms of comparability with European statistics.

  17. Similarity principles for seismic qualification of equipment by experience data

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1989-01-01

    A methodology is developed for seismic qualification of nuclear plant equipment by applying similarity principles to existing experience data. Experience data is that available from previous qualifications by analysis or testing, or from actual earthquake events. Similarity principles are defined in terms of excitation, equipment physical characteristics, and equipment response. Physical similarity is further defined in terms of a critical transfer function for response at a location on a primary structure, whose response can be assumed directly related to fragility of the item under elevated levels of excitation. Procedures are developed for combining experience data into composite specifications for qualification of equipment that can be shown to be physically similar to the reference equipment. Other procedures are developed for extending qualifications beyond the original specifications under certain conditions. Some examples for application and verification of the procedures are given for actual test data available from previous qualifications. The developments are intended to elaborate on the rather broad guidelines by the IEEE 344 Standards Committee for equipment qualification in new nuclear plants. The results also contribute to filling a gap that exists between the IEEE 344 methodology and that which was previously developed for equipment in existing plants by the Seismic Qualification Utilities Group. 10 refs., 9 figs., 1 tab

  18. Direct containment heating experiments in Zion Nuclear Power Plant geometry using prototypic materials

    International Nuclear Information System (INIS)

    Binder, J.L.; McUmber, L.M.; Spencer, B.W.

    1993-01-01

    Direct Containment Heating (DCH) experiments have been completed which utilize prototypic core materials. The experiments reported here are a continuation of the Integral Effects Testing (IET) DCH program. The experiments incorporated a 1/40 scale model of the Zion Nuclear Power Plant containment structures. The model included representations of the primary system volume, RPV lower head, cavity and instrument tunnel, and the lower containment structures. The experiments were steam driven. Iron-alumina thermite with chromium was used as a core melt simulant in the earlier IET experiments. These earlier IET experiments at Argonne National Laboratory (ANL) and Sandia National Laboratories (SNL) provided useful data on the effect of scale on DCH phenomena; however, a significant question concerns the potential experiment distortions introduced by the use of non-prototypic iron/alumina thermite. Therefore, further testing with prototypic materials has been carried out at ANL. Three tests have been completed, DCH-U1A, U1B and U2. DCH-U1A and U1B employed an inerted containment atmosphere and are counterpart to the IET-1RR test with iron/alumina thermite. DCH-U2 employed nominally the same atmosphere composition as its counterpart iron/alumina test, IET-6. All tests with prototypic material have produced lower peak containment pressure rises: 45, 111 and 185 kPa in U1A, U1B and U2, compared to 150 and 250 kPa in IET-1RR and IET-6. Hydrogen production, due to metal-steam reactions, was 33% larger in U1B and U2 compared to IET-1RR and IET-6. The pressurization efficiency was consistently lower for the corium tests compared to the IET tests

  19. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
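The survival-analysis approach described above can be illustrated with a plain Kaplan-Meier estimator over session times. The abstract does not give the authors' exact model, so the following is a minimal pure-Python sketch; the session durations and censoring flags are invented for illustration:

```python
# Minimal Kaplan-Meier estimator over user session times (seconds).
# A session still running when measurement stopped is right-censored
# (observed=False): it leaves the risk set without counting as an event.

def kaplan_meier(durations, observed):
    """Return [(time, survival_probability)] at each distinct event time."""
    n = len(durations)
    order = sorted(range(n), key=lambda i: durations[i])
    at_risk, surv, curve, i = n, 1.0, [], 0
    while i < n:
        t = durations[order[i]]
        events = removed = 0
        while i < n and durations[order[i]] == t:
            events += observed[order[i]]
            removed += 1
            i += 1
        if events:                 # censored-only times do not change S(t)
            surv *= 1.0 - events / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Toy data: six sessions, two of them censored.
times = [30, 60, 60, 120, 180, 240]
seen  = [True, True, False, True, False, True]
for t, s in kaplan_meier(times, seen):
    print(t, round(s, 3))   # 30 0.833 / 60 0.667 / 120 0.444 / 240 0.0
```

In the spirit of the paper, a longer-surviving session-time curve under one service condition than another would indicate higher user satisfaction.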

  20. A new methodology for fluorescence analysis of composite resins used in anterior direct restorations.

    Science.gov (United States)

    de Lima, Liliane Motta; Abreu, Jessica Dantas; Cohen-Carneiro, Flavia; Regalado, Diego Ferreira; Pontes, Danielson Guedes

    2015-01-01

    The aim of this study was to use a new methodology to evaluate the fluorescence of composite resins for direct restorations. Microhybrid (group 1, Amelogen; group 2, Opallis; group 3, Filtek Z250) and nanohybrid (group 4, Filtek Z350 XT; group 5, Brilliant NG; group 6, Evolu-X) composite resins were analyzed in this study. A prefabricated matrix was used to prepare 60 specimens of 7.0 × 3.0 mm (n = 10 per group); the composite resin discs were prepared in 2 increments (1.5 mm each) and photocured for 20 seconds. To establish a control group of natural teeth, 10 maxillary central incisor crowns were horizontally sectioned to create 10 discs of dentin and enamel tissues with the same dimensions as the composite resin specimens. The specimens were placed in a box with ultraviolet light, and photographs were taken. Aperture 3.0 software was used to quantify the central portion of the image of each specimen in shades of red (R), green (G), and blue (B) of the RGB color space. The brighter the B shade in the evaluated area of the image, the greater the fluorescence shown by the specimen. One-way analysis of variance revealed significant differences between the groups. The fluorescence achieved in group 1 was statistically similar to that of the control group and significantly different from those of the other groups (Bonferroni test). Groups 3 and 4 had the lowest fluorescence values, which were significantly different from those of the other groups. According to the results of this study, neither the size nor the amount of inorganic particles in the evaluated composite resin materials predicts if the material will exhibit good fluorescence.
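The central-region blue-channel measurement (done in the study with Aperture 3.0) can be mimicked in a few lines of code. This is an illustrative re-implementation, not the authors' procedure; the toy pixel array and the 50% central window are assumptions:

```python
def mean_blue(pixels, frac=0.5):
    """Mean blue (B) value over the central frac x frac window of an image
    given as a list of rows of (R, G, B) tuples with 0-255 components."""
    h, w = len(pixels), len(pixels[0])
    y0, y1 = int(h * (1 - frac) / 2), int(h * (1 + frac) / 2)
    x0, x1 = int(w * (1 - frac) / 2), int(w * (1 + frac) / 2)
    window = [pixels[y][x][2] for y in range(y0, y1) for x in range(x0, x1)]
    return sum(window) / len(window)

# Toy 4x4 "specimen photo": dark border, bright-blue 2x2 centre.
img = [[(0, 0, 0)] * 4 for _ in range(4)]
for y in (1, 2):
    for x in (1, 2):
        img[y][x] = (30, 40, 200)
print(mean_blue(img))  # 200.0 -- a brighter B value indicates greater fluorescence
```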

  1. Paramedics' experiences and expectations concerning advance directives: a prospective, questionnaire-based, bi-centre study.

    Science.gov (United States)

    Taghavi, Mahmoud; Simon, Alfred; Kappus, Stefan; Meyer, Nicole; Lassen, Christoph L; Klier, Tobias; Ruppert, David B; Graf, Bernhard M; Hanekop, Gerd G; Wiese, Christoph H R

    2012-10-01

    Advance directives and palliative crisis cards are means by which palliative care patients can exert their autonomy in end-of-life decisions. To examine paramedics' attitudes towards advance directives and end-of-life care. Questionnaire-based investigation using a self-administered survey instrument. Paramedics of two cities (Hamburg and Goettingen, Germany) were included. Participants were questioned as to (1) their attitudes about advance directives, (2) their clinical experiences in connection with end-of-life situations (e.g. resuscitation), (3) their suggestions in regard to advance directives, 'Do not attempt resuscitation' orders and palliative crisis cards. Questionnaires were returned by 728 paramedics (response rate: 81%). The majority of paramedics (71%) had dealt with advance directives and end-of-life decisions in emergency situations. Most participants (84%) found that cardiopulmonary resuscitation in end-of-life patients is not useful, and 75% stated that they would withhold cardiopulmonary resuscitation where legally permitted. Participants also mentioned that more extensive discussion of legal aspects concerning advance directives should be included in paramedic training curricula. They suggested that palliative crisis cards should be integrated into end-of-life care. Decision making in prehospital end-of-life care is a challenge for all paramedics. The present investigation demonstrates that a dialogue bridging emergency medical and palliative care issues is necessary. The paramedics indicated that improved guidelines on end-of-life decisions and the termination of cardiopulmonary resuscitation in palliative care patients may be essential. Participants do not feel adequately trained in end-of-life care and the content of advance directives. Other recent studies have also demonstrated that there is a need for training curricula in end-of-life care for paramedics.

  2. Design Methodology of a Sensor Network Architecture Supporting Urgent Information and Its Evaluation

    Science.gov (United States)

    Kawai, Tetsuya; Wakamiya, Naoki; Murata, Masayuki

    Wireless sensor networks are expected to become an important social infrastructure which helps make our lives safe, secure, and comfortable. In this paper, we propose a design methodology for an architecture for fast and reliable transmission of urgent information in wireless sensor networks. In this methodology, instead of establishing a single complicated monolithic mechanism, several simple and fully distributed control mechanisms which function at different spatial and temporal levels are incorporated in each node. These mechanisms work autonomously and independently, responding to the surrounding situation. We also show an example of a network architecture designed following the methodology. We evaluated the performance of the architecture by extensive simulation and practical experiments, and our claims were supported by the results of these experiments.

  3. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, optimisation of certain experimental parameters allows reliable results to be obtained. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies are discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.)

  4. Direct flow in 10.8 GeV/nucleon Au+Au collisions measured in experiment E917 at the AGS

    International Nuclear Information System (INIS)

    Back, B. B.; Betts, R. R.; Britt, H. C.; Chang, J.; Chang, W. C.; Gillitzer, A.; Henning, W. F.; Hofman, D. J.; Nanal, V.; Wuosmaa, A. H.

    1999-01-01

    Analysis of the directed flow observable for protons and pions from Au+Au collisions at 10.8 GeV/nucleon from experiment E917 at the AGS is presented. Using a Fourier series expansion, the first Fourier component, ν₁, was extracted as a function of rapidity for mid-central collisions (17-24%). Clear evidence for positive directed flow is found in the proton data, and a weak, possibly negative directed flow signal is observed for π⁺ and π⁻
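In such a Fourier decomposition the azimuthal distribution is dN/dφ ∝ 1 + 2ν₁cos φ + 2ν₂cos 2φ, so the first coefficient is simply ν₁ = ⟨cos φ⟩ with φ measured relative to the reaction plane. The toy sketch below (not the E917 analysis code; the input ν₁ and sample size are invented) samples such a distribution and recovers the coefficient:

```python
import math
import random

def flow_coefficient(phis, n=1):
    """n-th Fourier flow coefficient v_n = <cos(n*phi)>;
    phi is the azimuthal angle relative to the reaction plane (radians)."""
    return sum(math.cos(n * p) for p in phis) / len(phis)

def sample_azimuthal(v1, size, rng):
    """Accept-reject sampling from dN/dphi proportional to 1 + 2*v1*cos(phi)."""
    out = []
    while len(out) < size:
        p = rng.uniform(-math.pi, math.pi)
        if rng.uniform(0, 1 + 2 * v1) < 1 + 2 * v1 * math.cos(p):
            out.append(p)
    return out

rng = random.Random(0)
phis = sample_azimuthal(0.10, 20000, rng)
print(round(flow_coefficient(phis, 1), 2))  # statistically close to the input 0.10
```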

  5. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage where break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are broadly acceptable to society. There are three crucial viewpoints for measuring acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks JET, TFTR and JT-60 started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology that resolves safety-related issues in harmony with technological evolution. The most promising fusion system toward reactors is not yet settled. The objective of this study is to develop an adequate methodology which promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the data base. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, safety analysis based on function, and the application of the methodology are discussed. As a result of this study, a methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  6. Strategic directions of computing at Fermilab

    Science.gov (United States)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  7. Experience with novel technologies for direct measurement of atmospheric NO2

    Science.gov (United States)

    Hueglin, Christoph; Hundt, Morten; Mueller, Michael; Schwarzenbach, Beat; Tuzson, Bela; Emmenegger, Lukas

    2017-04-01

    Nitrogen dioxide (NO2) is an air pollutant that has a large impact on human health and ecosystems, and it plays a key role in the formation of ozone and secondary particulate matter. Consequently, legal limit values for NO2 are set in the EU and elsewhere, and atmospheric observation networks typically include NO2 in their measurement programmes. Atmospheric NO2 is principally measured by chemiluminescence detection, an indirect measurement technique that requires conversion of NO2 into nitrogen monoxide (NO) and finally calculation of NO2 from the difference between total nitrogen oxides (NOx) and NO. Consequently, NO2 measurements with the chemiluminescence method have a relatively high measurement uncertainty and can be biased depending on the selectivity of the applied NO2 conversion method. In the past years, technologies for direct and selective measurement of NO2 have become available, e.g. cavity attenuated phase shift spectroscopy (CAPS), cavity enhanced laser absorption spectroscopy and quantum cascade laser absorption spectrometry (QCLAS). These technologies offer clear advantages over the indirect chemiluminescence method. We tested the above-mentioned direct measurement techniques for NO2 over extended time periods at atmospheric measurement stations and report on our experience, including comparisons with co-located chemiluminescence instruments equipped with molybdenum as well as photolytic NO2 converters. A still-open issue related to the direct measurement of NO2 is instrument calibration. Accurate and traceable reference standards and NO2 calibration gases are needed. We present results from the application of different calibration strategies based on the use of static NO2 calibration gases as well as dynamic NO2 calibration gases produced by permeation and by gas-phase titration (GPT).

  8. A Modeling Approach for Plastic-Metal Laser Direct Joining

    Science.gov (United States)

    Lutey, Adrian H. A.; Fortunato, Alessandro; Ascari, Alessandro; Romoli, Luca

    2017-09-01

    Laser processing has been identified as a feasible approach to direct joining of metal and plastic components without the need for adhesives or mechanical fasteners. The present work develops a modeling approach for conduction and transmission laser direct joining of these materials based on multi-layer optical propagation theory and numerical heat flow simulation. The aim of this methodology is to predict process outcomes based on the calculated joint interface and upper surface temperatures. Three representative cases are considered for model verification, including conduction joining of PBT and aluminum alloy, transmission joining of optically transparent PET and stainless steel, and transmission joining of semi-transparent PA 66 and stainless steel. Conduction direct laser joining experiments are performed on black PBT and 6082 anticorodal aluminum alloy, achieving shear loads of over 2000 N with specimens of 2 mm thickness and 25 mm width. Comparison with simulation results shows that consistently high strength is achieved where the peak interface temperature is above the plastic degradation temperature. Comparison of transmission joining simulations and published experimental results confirms these findings and highlights the influence of plastic layer optical absorption on process feasibility.

  9. Computational modeling of direct-drive fusion pellets and KrF-driven foil experiments

    International Nuclear Information System (INIS)

    Gardner, J.H.; Schmitt, A.J.; Dahlburg, J.P.; Pawley, C.J.; Bodner, S.E.; Obenschain, S.P.; Serlin, V.; Aglitskiy, Y.

    1998-01-01

    FAST is a radiation transport hydrodynamics code that simulates laser matter interactions of relevance to direct-drive laser fusion target design. FAST solves the Euler equations of compressible flow using the Flux-Corrected Transport finite volume method. The advection algorithm provides accurate computation of flows from nearly incompressible vortical flows to those that are highly compressible and dominated by strong pressure and density gradients. In this paper we describe the numerical techniques and physics packages. FAST has also been benchmarked with Nike laser facility experiments in which linearly perturbed, low adiabat planar plastic targets are ablatively accelerated to velocities approaching 10⁷ cm/s. Over a range of perturbation wavelengths, the code results agree with the measured Rayleigh–Taylor growth from the linear through the deeply nonlinear regimes. FAST has been applied to the two-dimensional spherical simulation design to provide surface finish and laser bandwidth tolerances for a promising new direct-drive pellet that uses a foam ablator

  10. A methodology for flood risk appraisal in Lithuania

    Directory of Open Access Journals (Sweden)

    Kriščiukaitienė Irena

    2015-06-01

    Full Text Available This paper presents a methodology for flood risk mapping as envisaged by the Directive on the Assessment and Management of Flood Risks [Directive 2007/60/EC]. Specifically, we aimed at identifying the types of flood damage that can be estimated given data availability in Lithuania. Furthermore, we present the main sources of data and the associated cost functions. The methodology covers the following main types of flood threats: risk to inhabitants, risk to economic activity, and social risk. A multi-criteria framework for aggregation of different risks is proposed to provide a comprehensive appraisal of flood risk. On the basis of the proposed research, flood risk maps have been prepared for Lithuania. These maps are available for each type of flood risk (i.e. inhabitants, economic losses, social risk) as well as for aggregate risk. The results indicate that flood risk management is crucial for western and central Lithuania, whereas other parts of the country are not likely to suffer from significant losses due to flooding.
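A multi-criteria aggregation of the kind described can, under one common convention, be a normalized weighted sum of per-criterion risk scores. The abstract does not specify the aggregation rule the authors used, so the function below and the criterion weights are purely illustrative:

```python
def aggregate_risk(scores, weights):
    """Weighted-sum aggregation of normalized (0-1) per-criterion risk scores."""
    total = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total

# Hypothetical flood-risk scores for one map cell, with illustrative weights
# over the three risk types named in the abstract.
cell = {"inhabitants": 0.8, "economic": 0.4, "social": 0.6}
weights = {"inhabitants": 0.5, "economic": 0.3, "social": 0.2}
print(round(aggregate_risk(cell, weights), 2))  # 0.64
```

Dividing by the weight total keeps the aggregate on the same 0-1 scale as the inputs, so cells remain comparable across maps.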

  11. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions

  12. Study of methodology for low power/shutdown fire PSA

    International Nuclear Information System (INIS)

    Yan Zhen; Li Zhaohua; Li Lin; Song Lei

    2014-01-01

    As a probability-based risk assessment technology, fire PSA is accepted abroad by the nuclear industry for application in risk assessment for nuclear power plants. Based on industry experience, the fire-induced impact on plant safety during low power and shutdown operation cannot be neglected; therefore fire PSA can be used to assess the corresponding fire risk. However, there is to date no corresponding domestic guidance/standard, nor an accepted analysis methodology. By investigating the latest developments in fire PSA during low power and shutdown operation, and integrating its characteristics with the corresponding engineering experience, an engineering methodology to evaluate the fire risk during low power and shutdown operation for nuclear power plants is established in this paper. In addition, an analysis demonstration is given as an example. (authors)

  13. Thermal and orbital analysis of Earth monitoring Sun-synchronous space experiments

    Science.gov (United States)

    Killough, Brian D.

    1990-01-01

    The fundamentals of an Earth monitoring Sun-synchronous orbit are presented. A Sun-synchronous Orbit Analysis Program (SOAP) was developed to calculate orbital parameters for an entire year. The output from this program provides the required input data for the TRASYS thermal radiation computer code, which in turn computes the infrared, solar and Earth albedo heat fluxes incident on a space experiment. Direct incident heat fluxes can be used as input to a generalized thermal analyzer program to size radiators and predict instrument operating temperatures. The SOAP computer code and its application to the thermal analysis methodology presented, should prove useful to the thermal engineer during the design phases of Earth monitoring Sun-synchronous space experiments.
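The defining condition of a Sun-synchronous orbit is that the J2-driven nodal precession matches the Sun's apparent motion, about 360° per year. A minimal sketch of that condition (standard textbook formula and constants; this is not the SOAP code) computing the required inclination of a circular orbit from its altitude:

```python
import math

MU = 398600.4418        # km^3/s^2, Earth's gravitational parameter
RE = 6378.137           # km, Earth's equatorial radius
J2 = 1.08263e-3         # Earth oblateness coefficient
OMEGA_SS = 2 * math.pi / (365.2422 * 86400)   # rad/s, one revolution per year

def sun_sync_inclination(altitude_km, ecc=0.0):
    """Inclination (degrees) at which J2 nodal precession equals OMEGA_SS."""
    a = RE + altitude_km                       # semi-major axis, km
    cos_i = (-2.0 / 3.0) * OMEGA_SS * a**3.5 * (1 - ecc**2)**2 \
            / (J2 * RE**2 * math.sqrt(MU))
    return math.degrees(math.acos(cos_i))

print(round(sun_sync_inclination(700), 2))  # ~98.19 deg for a 700 km circular orbit
```

The slightly retrograde inclination (just above 90°) is what keeps the orbit plane locked to the Sun, giving the constant local solar time that Earth-monitoring thermal analyses rely on.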

  14. Final Progress Report: Direct Experiments on the Ocean Disposal of Fossil Fuel CO2.

    Energy Technology Data Exchange (ETDEWEB)

    James P. Barry; Peter G. Brewer

    2004-05-25

    OAK-B135 This report summarizes activities and results of investigations of the potential environmental consequences of direct injection of carbon dioxide into the deep-sea as a carbon sequestration method. Results of field experiments using small scale in situ releases of liquid CO2 are described in detail. The major conclusions of these experiments are that mortality rates of deep sea biota will vary depending on the concentrations of CO2 in deep ocean waters that result from a carbon sequestration project. Large changes in seawater acidity and carbon dioxide content near CO2 release sites will likely cause significant harm to deep-sea marine life. Smaller changes in seawater chemistry at greater distances from release sites will be less harmful, but may result in significant ecosystem changes.

  15. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    Maya, R.S.

    1995-01-01

    The UNEP Greenhouse Gas Abatement Costing Studies, carried out under the management of the UNEP Collaborating Centre on Energy and Environment at Risoe National Laboratory in Denmark, have devoted effort to generating methodological approaches to assessing the cost of abatement activities to reduce CO2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most important aspect of the UNEP study, which involved teams of researchers from ten countries, is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries, particularly those from developing countries, namely Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from the interactions with methodological experiences from the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling of tools to analyze economic trends and the various factors studied in order to determine the unit cost of CO2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  16. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    World-wide, much attention has been given to cadastral development. As a consequence of experiences made during the last decades, several authors have stated the need of research in the domain of cadastre and proposed methodologies to be used. The purpose of this paper is to contribute to the acceptance of research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion of this paper is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions, as it relates to land, and that cadastral systems are shaped

  17. Effects of direct social experience on trust decisions and neural reward circuitry

    Directory of Open Access Journals (Sweden)

    Dominic S. Fareri

    2012-10-01

    Full Text Available The human striatum is integral for reward-processing and supports learning by linking experienced outcomes with prior expectations. Recent endeavors implicate the striatum in processing outcomes of social interactions, such as social approval/rejection, as well as in learning reputations of others. Interestingly, social impressions often influence our behavior with others during interactions. Information about an interaction partner’s moral character acquired from biographical information hinders updating of expectations after interactions via top down modulation of reward circuitry. An outstanding question is whether initial impressions formed through experience similarly modulate the ability to update social impressions at the behavioral and neural level. We investigated the role of experienced social information on trust behavior and reward-related BOLD activity. Participants played a computerized ball tossing game with three fictional partners manipulated to be perceived as good, bad or neutral. Participants then played an iterated trust game as investors with these same partners while undergoing fMRI. Unbeknownst to participants, partner behavior in the trust game was random and unrelated to their ball-tossing behavior. Participants’ trust decisions were influenced by their prior experience in the ball tossing game, investing less often with the bad partner compared to the good and neutral. Reinforcement learning models revealed that participants were more sensitive to updating their beliefs about good and bad partners when experiencing outcomes consistent with initial experience. Increased striatal and anterior cingulate BOLD activity for positive versus negative trust game outcomes emerged, which further correlated with model-derived prediction-error (PE) learning signals. These results suggest that initial impressions formed from direct social experience can be continually shaped by consistent information through reward learning mechanisms.

  18. Effects of Direct Social Experience on Trust Decisions and Neural Reward Circuitry

    Science.gov (United States)

    Fareri, Dominic S.; Chang, Luke J.; Delgado, Mauricio R.

    2012-01-01

    The human striatum is integral for reward-processing and supports learning by linking experienced outcomes with prior expectations. Recent endeavors implicate the striatum in processing outcomes of social interactions, such as social approval/rejection, as well as in learning reputations of others. Interestingly, social impressions often influence our behavior with others during interactions. Information about an interaction partner’s moral character acquired from biographical information hinders updating of expectations after interactions via top down modulation of reward circuitry. An outstanding question is whether initial impressions formed through experience similarly modulate the ability to update social impressions at the behavioral and neural level. We investigated the role of experienced social information on trust behavior and reward-related BOLD activity. Participants played a computerized ball-tossing game with three fictional partners manipulated to be perceived as good, bad, or neutral. Participants then played an iterated trust game as investors with these same partners while undergoing fMRI. Unbeknownst to participants, partner behavior in the trust game was random and unrelated to their ball-tossing behavior. Participants’ trust decisions were influenced by their prior experience in the ball-tossing game, investing less often with the bad partner compared to the good and neutral. Reinforcement learning models revealed that participants were more sensitive to updating their beliefs about good and bad partners when experiencing outcomes consistent with initial experience. Increased striatal and anterior cingulate BOLD activity for positive versus negative trust game outcomes emerged, which further correlated with model-derived prediction error learning signals. These results suggest that initial impressions formed from direct social experience can be continually shaped by consistent information through reward learning mechanisms. PMID:23087604

  19. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    PURPOSE: The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. METHODS: Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. RESULTS: A general pathway through the case studies...

  20. Q and you: The application of Q methodology in recreation research

    Science.gov (United States)

    Whitney. Ward

    2010-01-01

    Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...

  1. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, the management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.

  2. Theoretical and Methodological Considerations on the Public Offers

    OpenAIRE

    Claudia Catalina SAVA

    2013-01-01

    This paper describes the most important characteristics of public offers, from both a theoretical and a methodological point of view. The European Union emphasizes clarity and transparency. The author focuses on specific provisions of the European Directive and of Romanian law and regulations related to voluntary and mandatory takeover bids, covering characteristics such as price, offeror and offeree rights, and the offer timetable.

  3. Stepping up for Childhood: A Contextual Critical Methodology

    Science.gov (United States)

    Kyriacopoulos, Konstantine; Sánchez, Marta

    2017-01-01

    In this paper, we theorize a critical methodology for education centering community experiences of systemic injustice, drawing upon Critical Race Theory, critical educational leadership studies, Chicana feminism, participant action research and political theory, to refocus our work on the human relationships at the center of the learning and…

  4. 42 CFR 493.649 - Methodology for determining fee amount.

    Science.gov (United States)

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support... full time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary... 42 Public Health 5 2010-10-01 2010-10-01 false Methodology for determining fee amount. 493.649...

  5. Propensity score methodology for confounding control in health care utilization databases

    Directory of Open Access Journals (Sweden)

    Elisabetta Patorno

    2013-06-01

    Propensity score (PS) methodology is a common approach to control for confounding in nonexperimental studies of treatment effects using health care utilization databases. This methodology offers researchers many advantages compared with conventional multivariate models: it directly focuses on the determinants of treatment choice, facilitating the understanding of the clinical decision-making process by the researcher; it allows for graphical comparisons of the distribution of propensity scores and truncation of subjects without overlapping PS, indicating a lack of equipoise; it allows transparent assessment of the confounder balance achieved by the PS at baseline; and it offers a straightforward approach to reduce the dimensionality of sometimes large arrays of potential confounders in utilization databases, directly addressing the “curse of dimensionality” in the context of rare events. This article provides an overview of the use of propensity score methodology for pharmacoepidemiologic research with large health care utilization databases, covering recent discussions on covariate selection, the role of automated techniques for addressing unmeasurable confounding via proxies, strategies to maximize clinical equipoise at baseline, and the potential of machine-learning algorithms for optimized propensity score estimation. The appendix discusses the available software packages for PS methodology. Propensity scores are a frequently used and versatile tool for transparent and comprehensive adjustment of confounding in pharmacoepidemiology with large health care databases.
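The truncation-to-overlap step the abstract highlights can be sketched in a few lines. A minimal, self-contained illustration: the logistic coefficients and the simulated cohort are invented for demonstration; in a real analysis the propensity score would be estimated from the data (e.g. by logistic regression on measured confounders).

```python
import math
import random

random.seed(0)

# Illustrative logistic PS model; in practice the coefficients are estimated
# from the data, not fixed by hand.
def propensity(age, severity, b=(-2.0, 0.03, 1.2)):
    """P(treatment | covariates) under a toy logistic model."""
    z = b[0] + b[1] * age + b[2] * severity
    return 1.0 / (1.0 + math.exp(-z))

# Simulated cohort: (age, disease severity), with treatment assigned
# stochastically according to the propensity.
cohort = [(random.uniform(20, 90), random.random()) for _ in range(1000)]
scores = [(propensity(a, s), random.random() < propensity(a, s)) for a, s in cohort]

# Trim to the region of common support ("overlap"): discard subjects whose
# PS lies outside the range observed in BOTH treatment groups, where there
# is no clinical equipoise.
treated_ps = [p for p, treated in scores if treated]
control_ps = [p for p, treated in scores if not treated]
lo = max(min(treated_ps), min(control_ps))
hi = min(max(treated_ps), max(control_ps))
trimmed = [(p, t) for p, t in scores if lo <= p <= hi]
print(f"{len(trimmed)} of {len(scores)} subjects retained in the overlap region")
```

The same trimmed score can then be used for matching, stratification, or weighting, which is where the dimensionality reduction the article emphasizes pays off.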

  6. Methodological considerations in the use of audio diaries in work psychology: Adding to the qualitative toolkit.

    Science.gov (United States)

    Crozier, Sarah E; Cassell, Catherine M

    2016-06-01

    The use of longitudinal methodology as a means of capturing the intricacies in complex organizational phenomena is well documented, and many different research strategies for longitudinal designs have been put forward from both a qualitative and quantitative stance. This study explores a specific emergent qualitative methodology, audio diaries, and assesses their utility for work psychology research drawing on the findings from a four-stage study addressing transient working patterns and stress in UK temporary workers. Specifically, we explore some important methodological, analytical and technical issues for practitioners and researchers who seek to use these methods and explain how this type of methodology has much to offer when studying stress and affective experiences at work. We provide support for the need to implement pluralistic and complementary methodological approaches in unearthing the depth in sense-making and assert their capacity to further illuminate the process orientation of stress. This study illustrates the importance of verbalization in documenting stress and affective experience as a mechanism for accessing cognitive processes in making sense of such experience. This study compares audio diaries with more traditional qualitative methods to assess applicability to different research contexts. This study provides practical guidance and a methodological framework for the design of audio diary research and design, taking into account challenges and solutions for researchers and practitioners.

  7. EDELWEISS-II, direct Dark Matter search experiment: first data analysis and results

    International Nuclear Information System (INIS)

    Scorza, Silvia

    2009-01-01

    One of the greatest mysteries of the universe, one that still puzzles most astronomers, cosmologists and physicists, is the question: 'What makes up our universe?'. Much of the speculation centres on a substance named Dark Matter. This enigmatic substance, of unknown type, is believed to account for almost three-quarters of the content of the universe; it could answer several questions raised by the models of the expanding universe that astronomers have created, and may even decide the fate of the expansion of the universe. There is strong observational evidence for the dominance of non-baryonic Dark Matter (DM) over baryonic matter in the universe. Such evidence comes from many independent observations over different length scales. The most stringent constraint on the abundance of DM comes from the analysis of the Cosmic Microwave Background (CMB) anisotropies. In particular, the WMAP (Wilkinson Microwave Anisotropy Probe) experiment restricts the abundance of matter and the abundance of baryonic matter in good agreement with predictions from Big Bang Nucleosynthesis. It is commonly believed that such a non-baryonic component could consist of new, as yet undiscovered, particles, usually referred to as WIMPs (Weakly Interacting Massive Particles). Some extensions of the standard model (SM) of particle physics predict the existence of particles that would be excellent DM candidates. In particular, great attention has been dedicated to candidates arising in supersymmetric theories: the Lightest Supersymmetric Particle (LSP). In most supersymmetric scenarios, the so-called neutralino seems to be a natural candidate, being stable in theories with conservation of R-parity and having masses and cross sections typical of WIMPs. The EDELWEISS collaboration is a direct dark matter search experiment, aiming to directly detect a WIMP interaction in a target material, a high-purity germanium crystal working at cryogenic temperatures. It

  8. Implications of the recent D-T μCF experiments at RIKEN-RAL and near-future directions

    International Nuclear Information System (INIS)

    Nagamine, K.; Matsuzaki, T.; Ishida, K.; Nakamura, S.N.; Kawamura, N.

    1999-01-01

    The paper describes the physics implications of the recent experimental results on D-T μCF at RIKEN-RAL. Smaller sticking and larger cycling rates in the solid/liquid D-T mixture than the theoretical predictions were observed, suggesting the need for further theoretical understanding. Some possible future directions in D-T μCF experiments are also described

  9. The EIA Directive of the European Union - some experiences

    Energy Technology Data Exchange (ETDEWEB)

    Verheem, R. [EIA Commission (Netherlands)

    1995-12-01

    Information is presented on the provisions of the existing European Council Directive on EIA for projects 85/337, some of the main findings of the report from the European Commission on the implementation of the Directive, in particular as regards involvement of the public, and a short discussion of the proposed modification of the Directive. The Directive has the characteristics of a 'framework law'. It establishes basic assessment principles and procedural requirements, and then allows Member States considerable discretion with regard to the transposition of their details into national legislation, provided that these basics are respected. The information in this article is solely intended to be an overview of the main provisions of the Directive.

  10. Controlled Nucleosynthesis Breakthroughs in Experiment and Theory

    CERN Document Server

    Adamenko, Stanislav; Merwe, Alwyn

    2007-01-01

    This book ushers in a new era of experimental and theoretical investigations into collective processes, structure formation, and self-organization of nuclear matter. It reports the results of experiments wherein for the first time the nuclei constituting our world (those displayed in Mendeleev's table as well as the super-heavy ones) have been artificially created. Pioneering breakthroughs are described, achieved at the "Proton-21" Laboratory, Kiev, Ukraine, in a variety of new physical and technological directions. A detailed description of the main experiments, their analyses, and the interpretation of copious experimental data are given, along with the methodology governing key measurements and the processing algorithms of the data that empirically confirm the occurrence of macroscopic self-organizing processes leading to the nuclear transformations of various materials. The basic concepts underlying the initiation of self-sustaining collective processes that result in the formation of nuclear structures a...

  11. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    Science.gov (United States)

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users to adopt daily routines to form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than specific individuals. These recommendations are general in nature and are fit for the community at a certain level, but they are not relevant to every individual based on specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user’s specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts’ past experiences, and PBR uses users’ personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations from the suggested ones using the user’s personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the
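The RBR → CBR → PBR pipeline described above can be sketched compactly. A minimal illustration only: the single guideline rule, the tiny case base, and the dislike list are invented stand-ins for the paper's knowledge sources, not its actual rules or data.

```python
# Sketch of chaining rule-based, case-based, and preference-based reasoning.
# All rules, cases, and preferences below are illustrative assumptions.

def rbr_plan(bmi):
    # Rule-based reasoning: an explicit guideline-style rule.
    return "weight-loss plan" if bmi >= 25 else "maintenance plan"

def cbr_activities(plan, case_base):
    # Case-based reasoning: rank past cases matching the recommended plan
    # by their recorded success, and keep the top three.
    matches = [c for c in case_base if c["plan"] == plan]
    return [c["activity"] for c in sorted(matches, key=lambda c: -c["success"])][:3]

def pbr_filter(activities, dislikes):
    # Preference-based reasoning: drop activities the user dislikes.
    return [a for a in activities if a not in dislikes]

case_base = [
    {"plan": "weight-loss plan", "activity": "jogging", "success": 0.9},
    {"plan": "weight-loss plan", "activity": "swimming", "success": 0.8},
    {"plan": "weight-loss plan", "activity": "cycling", "success": 0.7},
    {"plan": "maintenance plan", "activity": "walking", "success": 0.6},
]

plan = rbr_plan(bmi=27.4)
suggestions = pbr_filter(cbr_activities(plan, case_base), dislikes={"jogging"})
print(plan, suggestions)  # → weight-loss plan ['swimming', 'cycling']
```

Each stage narrows the previous one's output, which is the personalization effect the methodology aims at.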

  12. Application of Haddon’s matrix in qualitative research methodology: an experience in burns epidemiology

    Directory of Open Access Journals (Sweden)

    Deljavan R

    2012-07-01

    Reza Deljavan (1), Homayoun Sadeghi-Bazargani (2,3), Nasrin Fouladi (4), Shahnam Arshi (5), Reza Mohammadi (6). (1) Injury Epidemiology and Prevention Research Center, (2) Neuroscience Research Center, Department of Statistics and Epidemiology, Tabriz University of Medical Sciences, Tabriz, Iran; (3) Public Health Department, Karolinska Institute, Stockholm, Sweden; (4) Ardabil University of Medical Sciences, Ardabil, Iran; (5) Shahid Beheshti University of Medical Sciences, Tehran, Iran; (6) Public Health Department, Karolinska Institute, Stockholm, Sweden. Background: Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon’s matrix) through qualitative research methods to better understand people’s perceptions about burn injuries. Methods: This study applied Haddon’s matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon’s matrix was used to develop an interview guide and also through the analysis phase. Results: The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment, and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Conclusion: Haddon’s matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon’s matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries

  13. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA- Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed by using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparison with classical instrumental neutron activation analysis (INAA) methods and international inter-comparison exercise have been performed to validate the new methodology. (authors)

  14. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  15. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  16. Evolution of teaching and evaluation methodologies: The experience in the computer programming course at the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Jonatan Gomez Perdomo

    2014-05-01

    In this paper, we present the evolution of a computer-programming course at the Universidad Nacional de Colombia (UNAL). The teaching methodology has evolved from a linear and non-standardized methodology to a flexible, non-linear and student-centered methodology. Our methodology uses an e-learning platform that supports the learning process by offering students and professors custom navigation between the content and material in an interactive way (book chapters, exercises, videos). Moreover, the platform is open access, and approximately 900 students from the university take this course each term. Our evaluation methodology, in turn, has evolved from static evaluations based on paper tests to an online process based on computer adaptive testing (CAT) that chooses the questions to ask a student and assigns the student a grade according to the student’s ability.
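The adaptive-testing loop mentioned above, choosing each question from the student's current ability estimate, can be sketched under a simple item response model. This assumes a Rasch (1-parameter logistic) model with an invented item bank and a crude gradient-style ability update; operational CAT systems typically use maximum-likelihood or Bayesian estimation instead.

```python
import math

def p_correct(theta, difficulty):
    """Rasch (1PL) probability that a student of ability theta answers correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def item_information(theta, difficulty):
    """Fisher information of an item at ability theta (p * (1 - p) for 1PL)."""
    p = p_correct(theta, difficulty)
    return p * (1.0 - p)

def next_item(theta, unused):
    # Adaptive step: pick the unused item that is most informative at the
    # current ability estimate (difficulty closest to theta under 1PL).
    return max(unused, key=lambda d: item_information(theta, d))

bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]    # illustrative item difficulties
theta, asked = 0.0, []
for response in (1, 1, 0):                        # hypothetical right/right/wrong
    d = next_item(theta, [b for b in bank if b not in asked])
    asked.append(d)
    theta += 0.5 * (response - p_correct(theta, d))   # crude ability update
print(asked, round(theta, 2))
```

After two correct answers the selector moves to harder items, and the wrong answer pulls the ability estimate back down, which is exactly the tailoring behavior the course's CAT grading relies on.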

  17. CAGE IIIA Distributed Simulation Design Methodology

    Science.gov (United States)

    2014-05-01

    ... Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to: design it; define the ... operation; and be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  18. Direct Experiments on the Ocean Disposal of Fossil Fuel CO2

    Energy Technology Data Exchange (ETDEWEB)

    Barry, James, P.

    2010-05-26

    Funding from DoE grant # FG0204-ER63721, Direct Experiments on the Ocean Disposal of Fossil Fuel CO2, supported several postdoctoral fellows and research activities at MBARI related to ocean CO2 disposal and the biological consequences of high ocean CO2 levels on marine organisms. Postdocs supported on the project included Brad Seibel, now an associate professor at the University of Rhode Island, Jeff Drazen, now an associate professor at the University of Hawaii, and Eric Pane, who continues as a research associate at MBARI. Thus, the project contributed significantly to the professional development of young scientists. In addition, we made significant progress in several research areas. We continued several deep-sea CO2 release experiments using support from DoE and MBARI, along with several collaborators. These CO2 release studies had the goal of broadening our understanding of the effects of high ocean CO2 levels on deep-sea animals in the vicinity of potential release sites for direct deep-ocean carbon dioxide sequestration. Using MBARI ships and ROVs, we performed these experiments at depths of 3000 to 3600 m, where liquid CO2 is heavier than seawater. CO2 was released into small pools (sections of PVC pipe) on the seabed, where it dissolved and drifted downstream, bathing any caged animals and sediments in a CO2-rich, low-pH plume. We assessed the survival of organisms nearby. Several publications arose from these studies (Barry et al. 2004, 2005; Carman et al. 2004; Thistle et al. 2005, 2006, 2007; Fleeger et al. 2006, 2010; Barry and Drazen 2007; Bernhard et al. 2009; Sedlacek et al. 2009; Ricketts et al. in press; Barry et al., in revision) concerning the sensitivity of animals to low pH waters. Using funds from DoE and MBARI, we designed and fabricated a hyperbaric trap-respirometer to study metabolic rates of deep-sea fishes under high CO2 conditions (Drazen et al., 2005), as well as a gas-control aquarium system to support laboratory studies of the

  19. Language barriers and qualitative nursing research: methodological considerations.

    Science.gov (United States)

    Squires, A

    2008-09-01

    This review of the literature synthesizes methodological recommendations for the use of translators and interpreters in cross-language qualitative research. Cross-language qualitative research involves the use of interpreters and translators to mediate a language barrier between researchers and participants. Qualitative nurse researchers successfully address language barriers between themselves and their participants when they systematically plan for how they will use interpreters and translators throughout the research process. Experienced qualitative researchers recognize that translators can generate qualitative data through translation processes and by participating in data analysis. Failure to address language barriers and the methodological challenges they present threatens the credibility, transferability, dependability and confirmability of cross-language qualitative nursing research. Through a synthesis of the cross-language qualitative methods literature, this article reviews the basics of language competence, translator and interpreter qualifications, and roles for each kind of qualitative research approach. Methodological and ethical considerations are also provided. By systematically addressing the methodological challenges cross-language research presents, nurse researchers can produce better evidence for nursing practice and policy making when working across different language groups. Findings from qualitative studies will also accurately represent the experiences of the participants without concern that the meaning was lost in translation.

  20. Methodology for energy audits in the framework of the energy efficiency directive

    OpenAIRE

    Méchaussie, Elfie; Maréchal, François; Van Eetvelde, Greet

    2015-01-01

    The Energy Efficiency Directive 2012/27/EU (EED) was released in October 2012 and transposed in June 2014 by Member States. The Directive requires large companies to carry out an energy audit before December 2015, which has to be repeated every 4 years. A possibility for companies to be exempted from regular energy audits is to be or become certified by an approved energy management system (EnMS), most likely the international standard ISO 50001. In both cases it means that companies have to ...

  1. Urban metabolism: A review of research methodologies

    International Nuclear Information System (INIS)

    Zhang, Yan

    2013-01-01

    Urban metabolism analysis has become an important tool for the study of urban ecosystems. The problems of large metabolic throughput, low metabolic efficiency, and disordered metabolic processes are a major cause of unhealthy urban systems. In this paper, I summarize the international research on urban metabolism, and describe the progress that has been made in terms of research methodologies. I also review the methods used in accounting for and evaluating material and energy flows in urban metabolic processes, simulation of these flows using a network model, and practical applications of these methods. Based on this review of the literature, I propose directions for future research, and particularly the need to study the urban carbon metabolism because of the modern context of global climate change. Moreover, I recommend more research on the optimal regulation of urban metabolic systems. Highlights: • Urban metabolic processes can be analyzed by regarding cities as superorganisms. • Urban metabolism methods include accounting, assessment, modeling, and regulation. • Research methodologies have improved greatly since this field began in 1965. • Future research should focus on carbon metabolism and optimal regulation. -- The author reviews research progress in the field of urban metabolism, and based on her literature review, proposes directions for future research

  2. Experiments to investigate direct containment heating phenomena with scaled models of the Surry Nuclear Power Plant

    International Nuclear Information System (INIS)

    Blanchat, T.K.; Allen, M.D.; Pilch, M.M.

    1994-01-01

    The Containment Technology Test Facility (CTTF) and the Surtsey Test Facility at Sandia National Laboratories (SNL) are used to perform scaled experiments for the Nuclear Regulatory Commission (NRC) that simulate High Pressure Melt Ejection (HPME) accidents in a nuclear power plant (NPP). These experiments are designed to investigate the effects of direct containment heating (DCH) phenomena on the containment load. High-temperature, chemically reactive melt is ejected by high-pressure steam into a scale model of a reactor cavity. Debris is entrained by the steam blowdown into a containment model where specific phenomena, such as the effect of subcompartment structures, prototypic atmospheres, and hydrogen generation and combustion, can be studied

  3. A novel porous Ffowcs-Williams and Hawkings acoustic methodology for complex geometries

    Science.gov (United States)

    Nitzkorski, Zane Lloyd

    flows are investigated at Re = 3900, 10000 and 89000 in order to evaluate the method and investigate the physical sources of noise production. The Re = 3900 case was chosen due to its highly validated flow-field and to serve as a basis of comparison. The Re = 10000 cylinder is used to validate the noise production at turbulent Reynolds numbers against other simulations. Finally, the Re = 89000 simulations are compared to experiment, serving as a rigorous test of the method's predictive ability. The proposed approach demonstrates better performance than other commonly used approaches with the added benefit of computational efficiency and the ability to query independent volumes. This gives the added benefit of discovering how much noise production is directly associated with volumetric noise contributions. These capabilities allow for a thorough investigation of the sources of noise production and a means to evaluate proposed theories. A physical description of the source of sound for subcritical Reynolds number cylinders is established. A 45° beveled trailing edge configuration is investigated due to its relevance to hydrofoil and propeller noise. This configuration also allows for the evaluation of the assumption associated with the free-space Green's function, since the half-plane Green's function can be used to represent the solution to the wave equation for this geometry. Similar results for directivity and amplitudes of the two formulations confirm the flexibility of the porous surface implementation. Good agreement with experiment is obtained. The effect of boundary layer thickness is investigated. The noise produced in the upper half plane is significantly decreased for the thinner boundary layer, while the noise production in the lower half plane is only slightly decreased.

  4. Managing Complex Battlespace Environments Using Attack the Network Methodologies

    DEFF Research Database (Denmark)

    Mitchell, Dr. William L.

    This paper examines the last 8 years of development and application of Attack the Network (AtN) intelligence methodologies for creating shared situational understanding of complex battlespace environments and the development of deliberate targeting frameworks. It presents a short history of their development and how they are integrated into operational planning through strategies of deliberate targeting for modern operations. The paper draws on experience and case studies from Iraq, Syria, and Afghanistan, and offers some lessons learned as well as insight into the future of these methodologies, including their possible application at the national security level for managing longer strategic endeavors.

  5. An enhanced topologically significant directed random walk in cancer classification using gene expression datasets

    Directory of Open Access Journals (Sweden)

    Choon Sen Seah

    2017-12-01

    Microarray technology has become one of the elementary tools for researchers to study the genome of organisms. As the complexity and heterogeneity of cancer is being increasingly appreciated through genomic analysis, cancer classification is an emerging and important trend. Significant directed random walk is proposed as a cancer classification approach with higher sensitivity of risk-gene prediction and higher accuracy of cancer classification. In this paper, the methodology and material used for the experiment are presented. A tuning-parameter selection method and weight as a parameter are applied in the proposed approach. Gene expression datasets are used as input, while a pathway dataset is used as reference to build a directed graph that biases the random walk. In addition, we demonstrate that our approach can improve sensitivity of prediction with higher accuracy and biologically meaningful classification results. A comparison between significant directed random walk and directed random walk shows the improvement in terms of sensitivity of prediction and accuracy of cancer classification.
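The pathway-biased walk underlying DRW-style methods is essentially a directed random walk with restart. A minimal sketch of that general mechanism, not the paper's exact algorithm: the toy gene graph (a simplified TP53/MDM2/CDKN1A/CDK2 wiring), the restart probability, and the seed set are all illustrative assumptions.

```python
# Directed random walk with restart over a pathway graph: mass repeatedly
# flows along directed edges and is pulled back to the seed genes, yielding
# topology-aware gene weights. Graph and parameters are illustrative.

def random_walk_with_restart(graph, seeds, restart=0.7, iters=100):
    nodes = sorted(graph)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    for _ in range(iters):
        nxt = {n: restart * p0[n] for n in nodes}
        for n in nodes:
            out = graph[n]                      # downstream neighbours
            for m in out:
                nxt[m] += (1 - restart) * p[n] / len(out)
        p = nxt
    return p

# Toy directed pathway graph: gene -> downstream genes (the CDK2 self-loop
# avoids a dangling node that would leak probability mass).
graph = {"TP53": ["MDM2", "CDKN1A"], "MDM2": ["TP53"],
         "CDKN1A": ["CDK2"], "CDK2": ["CDK2"]}
weights = random_walk_with_restart(graph, seeds={"TP53"})
print({g: round(w, 3) for g, w in weights.items()})
```

The resulting weights concentrate on the seed and its immediate pathway neighbours; in DRW-based classifiers such weights are then combined with expression values to score pathway activity.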

  6. Drug-targeting methodologies with applications: A review

    Science.gov (United States)

    Kleinstreuer, Clement; Feng, Yu; Childress, Emily

    2014-01-01

    Targeted drug delivery to solid tumors is a very active research area, focusing mainly on improved drug formulation and associated best delivery methods/devices. Drug-targeting has the potential to greatly improve drug-delivery efficacy, reduce side effects, and lower the treatment costs. However, the vast majority of drug-targeting studies assume that the drug-particles are already at the target site or at least in its direct vicinity. In this review, drug-delivery methodologies, drug types and drug-delivery devices are discussed with examples in two major application areas: (1) inhaled drug-aerosol delivery into human lung-airways; and (2) intravascular drug-delivery for solid tumor targeting. The major problem addressed is how to efficiently deliver the drug-particles from the entry/infusion point to the target site. So far, most experimental results are based on animal studies. Concerning pulmonary drug delivery, the focus is on the pros and cons of three inhaler types, i.e., pressurized metered dose inhaler, dry powder inhaler and nebulizer, in addition to drug-aerosol formulations. Computational fluid-particle dynamics techniques and the underlying methodology for a smart inhaler system are discussed as well. Concerning intravascular drug-delivery for solid tumor targeting, passive and active targeting are reviewed as well as direct drug-targeting, using optimal delivery of radioactive microspheres to liver tumors as an example. The review concludes with suggestions for future work, considering both pulmonary drug targeting and direct drug delivery to solid tumors in the vascular system. PMID:25516850

  7. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    Science.gov (United States)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
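
    A minimal illustration of the kind of Bayesian consistency check described here compares the marginal likelihoods of "both detectors share one line frequency" against "each detector has its own frequency", under uniform priors. The detection counts are invented for illustration and are not the BATSE/Ginga results:

    ```python
    import math

    def log_choose(n, k):
        return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

    def log_marginal_common(k1, n1, k2, n2):
        """Both detectors share one line frequency f ~ Uniform(0, 1)."""
        k, m = k1 + k2, n1 + n2
        return (log_choose(n1, k1) + log_choose(n2, k2)
                + math.lgamma(k + 1) + math.lgamma(m - k + 1) - math.lgamma(m + 2))

    def log_marginal_indep(n1, n2):
        """Independent frequencies: each binomial marginal integrates to 1/(n+1)."""
        return -math.log(n1 + 1) - math.log(n2 + 1)

    # Hypothetical counts: a Ginga-like detector sees 3 lines in 20 bursts,
    # a BATSE-like detector sees 0 in 50.
    lbf = log_marginal_common(3, 20, 0, 50) - log_marginal_indep(20, 50)
    print(f"log Bayes factor (consistent vs inconsistent): {lbf:.2f}")
    ```

    A positive log Bayes factor favours consistency; the published analysis additionally folds in detector-specific line detectability, which this sketch omits.
    
    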

  8. Experiments for Multi-Stage Processes

    DEFF Research Database (Denmark)

    Tyssedal, John; Kulahci, Murat

    2015-01-01

    Multi-stage processes are very common in both process and manufacturing industries. In this article we present a methodology for designing experiments for multi-stage processes. Typically in these situations the design is expected to involve many factors from different stages. To minimize...... the required number of experimental runs, we suggest using mirror image pairs of experiments at each stage following the first. As the design criterion, we consider their projectivity and mainly focus on projectivity 3 designs. We provide the methodology for generating these designs for processes with any...
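
    The mirror-image-pair idea resembles a classical foldover: each run at a later stage is paired with its sign-reversed twin. A sketch under that assumption (the article's full multi-stage construction and projectivity analysis are more involved):

    ```python
    import itertools
    import numpy as np

    def mirror_image_pairs(design):
        """Augment a two-level (+/-1) design with its mirror image (foldover).

        Each run is followed by its sign-reversed twin, illustrating the
        'mirror image pairs' idea used to keep run counts low across stages.
        """
        design = np.asarray(design)
        paired = np.empty((2 * design.shape[0], design.shape[1]), dtype=int)
        paired[0::2] = design
        paired[1::2] = -design
        return paired

    # Base design for 3 factors: the full 2^3 factorial (8 runs)
    base = np.array(list(itertools.product([-1, 1], repeat=3)))
    folded = mirror_image_pairs(base)
    print(folded.shape)  # (16, 3)
    ```

    Note the column balance: every factor column of the paired design sums to zero, since each run cancels against its mirror image.
    
    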

  9. Methodological Potential of Computer Experiment in Teaching Mathematics at University

    Science.gov (United States)

    Lin, Kequan; Sokolova, Anna Nikolaevna; Vlasova, Vera K.

    2017-01-01

    The study is relevant due to the opportunity of increasing the efficiency of teaching mathematics at university by integrating into this process computer experiments conducted by students with the use of IT. The research problem is defined by a contradiction between the great potential of the mathematics experiment for motivating and…

  10. Qualitative experiments in psychology

    DEFF Research Database (Denmark)

    Wagoner, Brady

    2015-01-01

    In this article, I explore the meaning of experiments in early twentieth century psychology, focusing on the qualitative experimental methodology of psychologist Frederic BARTLETT. I begin by contextualizing BARTLETT's experiments within the continental research tradition of his time, which...... was in a state of transition from a focus on elements (the concern of psychophysics) to a focus on wholes (the concern of Gestalt psychology). The defining feature of BARTLETT's early experiments is his holistic treatment of human responses, in which the basic unit of analysis is the active person relating...... to some material within the constraints of a social and material context. This manifests itself in a number of methodological principles that contrast with contemporary understandings of experimentation in psychology. The contrast is further explored by reviewing the history of "replications...

  11. Strategic directions of computing at Fermilab

    International Nuclear Information System (INIS)

    Wolbers, S.

    1997-04-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing

  12. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  13. Researching experiences

    DEFF Research Database (Denmark)

    Gjedde, Lisa; Ingemann, Bruno

    In the beginning was - not the word - but the experience. This phenomenological approach provides the basis for this book, which focuses on how a person-in-situation experiences and constructs meaning from a variety of cultural visual events. This book presents video-based processual methods......, dialogue, moods, values and narratives have been investigated qualitatively with more than sixty informants in a range of projects. The processual methodological insights are put into a theoretical perspective and also presented as pragmatic dilemmas. Researching Experiences is relevant not only...

  14. Methodological Aspects of Modelling and Simulation of Robotized Workstations

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2018-05-01

    Full Text Available From the point of view of the development of application and program products, the key directions that need to be respected in computer support for project activities are quite clearly specified. User interfaces with a high degree of graphical interactive convenience and two- and three-dimensional computer graphics contribute greatly to streamlining project methodologies and procedures in particular. This is mainly because a large share of the tasks solved in the modern design of robotic systems is graphical in nature. Automation of graphical tasks is therefore a significant development direction for the subject area. The authors present the results of their research in the area of automation and computer-aided design of robotized systems. A new methodical approach to modelling robotic workstations, consisting of ten steps incorporated into the four phases of the logistics process of creating and implementing a robotic workplace, is presented. The emphasis is placed on the modelling and simulation phase, with verification of the elaborated methodologies on specific projects or elements of a robotized welding plant in automotive production.

  15. PICTURE: a sounding rocket experiment for direct imaging of an extrasolar planetary environment

    Science.gov (United States)

    Mendillo, Christopher B.; Hicks, Brian A.; Cook, Timothy A.; Bifano, Thomas G.; Content, David A.; Lane, Benjamin F.; Levine, B. Martin; Rabin, Douglas; Rao, Shanti R.; Samuele, Rocco; Schmidtlin, Edouard; Shao, Michael; Wallace, J. Kent; Chakrabarti, Supriya

    2012-09-01

    The Planetary Imaging Concept Testbed Using a Rocket Experiment (PICTURE 36.225 UG) was designed to directly image the exozodiacal dust disk of ε Eridani (K2V, 3.22 pc) down to an inner radius of 1.5 AU. PICTURE carried four key enabling technologies on board a NASA sounding rocket at 4:25 MDT on October 8th, 2011: a 0.5 m light-weight primary mirror (4.5 kg), a visible nulling coronagraph (VNC) (600-750 nm), a 32×32-element MEMS deformable mirror and a milliarcsecond-class fine pointing system. Unfortunately, due to a telemetry failure, the PICTURE mission did not achieve scientific success. Nonetheless, this flight validated the flight-worthiness of the lightweight primary and the VNC. The fine pointing system, a key requirement for future planet-imaging missions, demonstrated 5.1 mas RMS in-flight pointing stability. We describe the experiment, its subsystems and flight results. We outline the challenges we faced in developing this complex payload and our technical approaches.

  16. Transcranial Direct Current Stimulation in Epilepsy.

    Science.gov (United States)

    San-Juan, Daniel; Morales-Quezada, León; Orozco Garduño, Adolfo Josué; Alonso-Vanegas, Mario; González-Aragón, Maricarmen Fernández; Espinoza López, Dulce Anabel; Vázquez Gregorio, Rafael; Anschel, David J; Fregni, Felipe

    2015-01-01

    Transcranial direct current stimulation (tDCS) is an emerging non-invasive neuromodulation therapy in epilepsy with conflicting results in terms of efficacy and safety. Review the literature about the efficacy and safety of tDCS in epilepsy in humans and animals. We searched studies in PubMed, MedLine, Scopus, Web of Science and Google Scholar (January 1969 to October 2013) using the keywords 'transcranial direct current stimulation' or 'tDCS' or 'brain polarization' or 'galvanic stimulation' and 'epilepsy' in animals and humans. Original articles that reported tDCS safety and efficacy in epileptic animals or humans were included. Four review authors independently selected the studies, extracted data and assessed the methodological quality of the studies using the recommendations of the Cochrane Handbook for Systematic Reviews of Interventions, PRISMA guidelines and Jadad Scale. A meta-analysis was not possible due to methodological, clinical and statistical heterogeneity of included studies. We analyzed 9 articles with different methodologies (3 animals/6 humans) with a total of 174 stimulated individuals; 109 animals and 65 humans. In vivo and in vitro animal studies showed that direct current stimulation can successfully induce suppression of epileptiform activity without neurological injury and 4/6 (67%) clinical studies showed an effective decrease in epileptic seizures and 5/6 (83%) reduction of inter-ictal epileptiform activity. All patients tolerated tDCS well. tDCS trials have demonstrated preliminary safety and efficacy in animals and patients with epilepsy. Further larger studies are needed to define the best stimulation protocols and long-term follow-up. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. [Small scale direct oxide reduction (DOR) experiments

    International Nuclear Information System (INIS)

    1987-01-01

    Objectives were to provide process design information to the Plutonium Recovery Project and to produce DOR (direct oxide reduction) product which meets Foundry purity specifications and Oh-0 Foundry specifications

  18. Action research methodology in clinical pharmacy

    DEFF Research Database (Denmark)

    Nørgaard, Lotte Stig; Sørensen, Ellen Westh

    2016-01-01

    Introduction The focus in clinical pharmacy practice is, and has for the last 30-35 years been, on changing the role of pharmacy staff towards service orientation and patient counselling. One way of doing this is by involving staff in the change process and, as a researcher, taking part in the change process...... by establishing partnerships with staff. Against the background of the authors' widespread action research (AR)-based experience, recommendations and comments on how to conduct an AR-study are described, and one of their AR-based studies illustrates the methodology and the research methods used. Methodology AR...... is defined as an approach to research which is based on a problem-solving relationship between researchers and clients, and which aims both at solving a problem and at collaboratively generating new knowledge. Research questions relevant in AR-studies are: what was the working process in this change oriented...

  19. Directly assessing interpersonal RSA influences in the frequency domain: An illustration with generalized partial directed coherence.

    Science.gov (United States)

    Liu, Siwei; Gates, Kathleen M; Blandon, Alysia Y

    2018-06-01

    Despite recent research indicating that interpersonal linkage in physiology is a common phenomenon during social interactions, and the well-established role of respiratory sinus arrhythmia (RSA) in socially facilitative physiological regulation, little research has directly examined interpersonal influences in RSA, perhaps due to methodological challenges in analyzing multivariate RSA data. In this article, we aim to bridge this methodological gap by introducing a new method for quantifying interpersonal RSA influences. Specifically, we show that a frequency-domain statistic, generalized partial directed coherence (gPDC), can be used to capture lagged relations in RSA between social partners without first estimating RSA for each person. We illustrate its utility by examining the relation between gPDC and marital conflict in a sample of married couples. Finally, we discuss how gPDC complements existing methods in the time domain and provide guidelines for choosing among these different statistical techniques. © 2018 Society for Psychophysiological Research.
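
    Generalized partial directed coherence is computed in the frequency domain from vector-autoregressive (VAR) coefficients. A minimal sketch with a hypothetical two-channel VAR(1) model, not fitted RSA data:

    ```python
    import numpy as np

    def gpdc(A, sigma2, freqs):
        """Generalized partial directed coherence from VAR coefficients.

        A: array (p, n, n) of VAR lag matrices; sigma2: innovation variances (n,).
        Returns array (len(freqs), n, n); entry [f, i, j] is the influence j -> i.
        """
        p, n, _ = A.shape
        out = np.empty((len(freqs), n, n))
        for fi, f in enumerate(freqs):
            # Abar(f) = I - sum_k A_k exp(-i 2 pi f k)
            Abar = np.eye(n, dtype=complex)
            for k in range(p):
                Abar -= A[k] * np.exp(-2j * np.pi * f * (k + 1))
            num = np.abs(Abar) / np.sqrt(sigma2)[:, None]    # (1/sigma_i)|Abar_ij|
            out[fi] = num / np.sqrt((num ** 2).sum(axis=0))  # normalise per source j
        return out

    # Toy 2-channel VAR(1): channel 0 drives channel 1, but not vice versa.
    A = np.array([[[0.5, 0.0],
                   [0.4, 0.3]]])
    g = gpdc(A, sigma2=np.array([1.0, 1.0]), freqs=np.linspace(0.01, 0.5, 8))
    assert np.all(g[:, 1, 0] > g[:, 0, 1])  # influence 0 -> 1 exceeds 1 -> 0
    ```

    In the interpersonal setting, the two channels would be the partners' respiration-band heart period signals, so an asymmetry in gPDC indicates a directed influence from one partner to the other.
    
    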

  20. Environmental management of the OSBAT 24'' oil pipeline: methodological and conceptual innovations; Gestao ambiental do Oleoduto OSBAT 24'': inovacoes metodologicas e conceituais

    Energy Technology Data Exchange (ETDEWEB)

    Garibaldi, Celia Maria; Serra, Ricardo Novaes; Martiniano, Flavio [LENC - Laboratorio de Engenharia e Consultoria Ltda., Cotia, SP (Brazil); Masumoto, Cinthia; Frazao, Luciana Rocha [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2008-07-01

    The objective of this article is to present considerations about the design, systematics and methodology used in conducting the environmental management of the maintenance work on the OSBAT 24'' pipeline, located in the stretch between the centre of Sao Sebastiao City and the Camburi district, in Sao Paulo State. It presents a set of criteria, concepts, techniques, ideas and practices that stand out for their innovative character and contribute effectively to the challenge of sustainable development and to new ways of reconciling environmental responsibility with investment in the oil and gas sector. The general aim of the article is to reflect on this experience, seeking to disseminate the conceptual and methodological aspects responsible for the successes of the OSBAT 24'' environmental management, and to point out obstacles found in its implementation. (author)

  1. Digraph-fault tree methodology for the assessment of material control systems

    International Nuclear Information System (INIS)

    Lambert, H.E.; Lim, J.J.; Gilman, F.M.

    1979-01-01

    The Lawrence Livermore Laboratory, under contract to the United States Nuclear Regulatory Commission, is developing a procedure to assess the effectiveness of material control and accounting systems at nuclear fuel cycle facilities. The purpose of a material control and accounting system is to prevent the theft of special nuclear material such as plutonium or highly enriched uranium. This report presents the use of a directed graph and fault tree analysis methodology in the assessment procedure. This methodology is demonstrated by assessing a simulated material control system design, the Test Bed
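
    The fault-tree side of such a methodology can be illustrated by brute-force enumeration of minimal cut sets, the smallest combinations of basic failures that defeat the system. The toy events below are invented for illustration and are not the Test Bed design:

    ```python
    from itertools import combinations

    # Hypothetical top event: theft succeeds if (guard_fails AND door_alarm_fails)
    # OR (inventory_check_fails AND records_tampered) — a toy material-control tree.
    BASIC = ["guard_fails", "door_alarm_fails", "inventory_check_fails", "records_tampered"]

    def top_event(state):
        return (state["guard_fails"] and state["door_alarm_fails"]) or \
               (state["inventory_check_fails"] and state["records_tampered"])

    def minimal_cut_sets(basic, top):
        """Brute-force minimal cut sets: smallest event sets triggering the top event."""
        cuts = []
        for r in range(1, len(basic) + 1):
            for combo in combinations(basic, r):
                state = {b: b in combo for b in basic}
                # Keep only combos not containing an already-found (smaller) cut.
                if top(state) and not any(set(c) <= set(combo) for c in cuts):
                    cuts.append(combo)
        return cuts

    cuts = minimal_cut_sets(BASIC, top_event)
    print(cuts)
    ```

    Brute force is exponential in the number of basic events; real assessments use the digraph structure and fault-tree algorithms to stay tractable.
    
    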

  2. Directed Forgetting of Recently Recalled Autobiographical Memories

    Science.gov (United States)

    Barnier, Amanda J.; Conway, Martin A.; Mayoh, Lyndel; Speyer, Joanne; Avizmil, Orit; Harris, Celia B.

    2007-01-01

    In 6 experiments, the authors investigated list-method directed forgetting of recently recalled autobiographical memories. Reliable directed forgetting effects were observed across all experiments. In 4 experiments, the authors examined the impact of memory valence on directed forgetting. The forget instruction impaired recall of negative,…

  3. Experiments to Improve Power Conversion Parameters in a Traveling Wave Direct Energy Converter Simulator

    International Nuclear Information System (INIS)

    Takeno, Hiromasa; Kiriyama, Yuusuke; Yasaka, Yasuyoshi

    2005-01-01

    An experimental study of direct power conversion for D-³He fusion is presented. In a small-scale simulator of a direct energy converter, based on the principle of decelerating 14.7 MeV protons with a traveling wave field, a new structure for the external transmission circuit is proposed to enhance the deceleration-electrode voltages. A prototype circuit was designed and constructed, yielding an order-of-magnitude improvement in voltage amplitude. A more practical circuit, in which the inductor elements were manufactured from coaxial cables, was also constructed and tested. An excitation of the third harmonic frequency with significant amplitude was observed. The cause of this problem is attributed to the modulated ion beam, which has a third-harmonic component, and to the fact that the inductance of the element depends nonlinearly on frequency. This problem is serious for a practical-scale energy converter, and a careful design of the circuit could avoid it

  4. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs
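
    An event tree, the first of the methodologies listed, can be quantified by multiplying branch probabilities along each path from an initiating event. The events and numbers below are illustrative only, not from the SKB studies:

    ```python
    # Minimal event-tree quantification for a repository scenario sketch.
    INITIATOR_FREQ = 1e-3           # canister breach per year (illustrative)
    BRANCHES = [                    # (event, P(success)); failure prob = 1 - P
        ("buffer intact", 0.95),
        ("groundwater flow low", 0.80),
    ]

    def event_tree(freq, branches):
        """Return {outcome path: frequency} for all success/failure combinations."""
        paths = {(): freq}
        for name, p in branches:
            paths = {path + ((name, ok),): f * (p if ok else 1 - p)
                     for path, f in paths.items() for ok in (True, False)}
        return paths

    outcomes = event_tree(INITIATOR_FREQ, BRANCHES)
    total = sum(outcomes.values())
    assert abs(total - INITIATOR_FREQ) < 1e-12  # outcomes partition the initiator
    ```

    Influence diagrams and RES matrices instead encode which FEPs affect which, so the combined approach mentioned in the report couples this path quantification with an interaction structure.
    
    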

  5. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki

    2009-01-01

    A new computational methodology for the sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with the phase of sodium as a reactant. One is the surface reaction model, in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat leads to vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium; this is designated the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent-gas, multiphase thermal hydraulics simulation method with compressibility (the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, a maximum gas temperature of approximately 1,300°C is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum mass-weighted-average temperature in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing theoretical and mechanistic modelling of secondary failure propagation of heat transfer tubes due to, for example, overheating rupture and wastage. (author)

  6. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    OpenAIRE

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence...

  7. Sharp and blunt force trauma concealment by thermal alteration in homicides: an in-vitro experiment for methodology and protocol development in forensic anthropological analysis of burnt bones

    OpenAIRE

    Macoveciuc, I; Marquez-Grant, N; Horsfall, I; Zioupos, P

    2017-01-01

    Burning of human remains is one method used by perpetrators to conceal fatal trauma and expert opinions regarding the degree of skeletal evidence concealment are often disparate. This experiment aimed to reduce this incongruence in forensic anthropological interpretation of burned human remains and implicitly contribute to the development of research methodologies sufficiently robust to withstand forensic scrutiny in the courtroom. We have tested the influence of thermal alteration on pre-exi...

  8. A methodology to assess the economic impact of power storage technologies.

    Science.gov (United States)

    El-Ghandour, Laila; Johnson, Timothy C

    2017-08-13

    We present a methodology for assessing the economic impact of power storage technologies. The methodology is founded on classical approaches to the optimal stopping of stochastic processes but involves an innovation that circumvents the need to, ex ante, identify the form of a driving process and works directly on observed data, avoiding model risks. Power storage is regarded as a complement to the intermittent output of renewable energy generators and is therefore important in contributing to the reduction of carbon-intensive power generation. Our aim is to present a methodology suitable for use by policy makers that is simple to maintain, adaptable to different technologies and easy to interpret. The methodology has benefits over current techniques and is able to value, by identifying a viable optimal operational strategy, a conceived storage facility based on compressed air technology operating in the UK. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
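
    The data-driven flavour of such a valuation can be illustrated by back-testing a simple charge/discharge threshold rule directly on an observed (here synthetic) price series, with no assumed price model. Thresholds, round-trip efficiency and prices are all hypothetical:

    ```python
    import numpy as np

    def storage_value(prices, low, high, capacity=1.0, efficiency=0.75):
        """Back-test a threshold operating rule on a price series.

        Charge one unit when the price drops below `low`, discharge when it
        rises above `high`, losing (1 - efficiency) of the energy round-trip.
        Returns the cumulative cash flow of the strategy.
        """
        level, cash = 0.0, 0.0
        for p in prices:
            if p < low and level < capacity:
                level += 1.0
                cash -= p
            elif p > high and level > 0:
                level -= 1.0
                cash += p * efficiency
        return cash

    rng = np.random.default_rng(0)
    prices = 50 + 10 * rng.standard_normal(1000)  # synthetic half-hourly prices
    profit = storage_value(prices, low=40, high=60)
    print(f"back-tested profit: {profit:.1f}")
    ```

    The paper's contribution is to choose such an operating boundary optimally from the data via optimal-stopping arguments; the fixed thresholds here stand in for that optimised strategy.
    
    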

  9. Measurement of direct CP-violation with the experiments NA31 and NA48 at CERN

    International Nuclear Information System (INIS)

    Renk, B.

    1992-01-01

    The NA31 experiment has measured the CP violation parameter ε′/ε. The result from data collected in 1988 is Re(ε′/ε) = (1.7±1.0)×10⁻³. A preliminary result from data collected in 1989 is Re(ε′/ε) = (2.1±0.9)×10⁻³. Combining these two results with the original result from the 1986 data set, we obtain Re(ε′/ε) = (2.3±0.7)×10⁻³, which is more than three-standard-deviation evidence for direct CP violation. A new experiment, NA48, is under construction, which aims for a significant reduction of the statistical and systematic errors in order to reach a combined error not exceeding 2×10⁻⁴

  10. The Methodological Imperatives of Feminist Ethnography

    Directory of Open Access Journals (Sweden)

    Richelle D. Schrock

    2013-12-01

    Full Text Available Feminist ethnography does not have a single, coherent definition and is caught between struggles over the definition and goals of feminism and the multiple practices known collectively as ethnography. Towards the end of the 1980s, debates emerged that problematized feminist ethnography as a productive methodology and these debates still haunt feminist ethnographers today. In this article, I provide a concise historiography of feminist ethnography that summarizes both its promises and its vulnerabilities. I address the three major challenges I argue feminist ethnographers currently face, which include responding productively to feminist critiques of representing "others," accounting for feminisms' commitment to social change while grappling with poststructuralist critiques of knowledge production, and confronting the historical and ongoing lack of recognition for significant contributions by feminist ethnographers. Despite these challenges, I argue that feminist ethnography is a productive methodology and I conclude by delineating its methodological imperatives. These imperatives include producing knowledge about women's lives in specific cultural contexts, recognizing the potential detriments and benefits of representation, exploring women's experiences of oppression along with the agency they exercise in their own lives, and feeling an ethical responsibility towards the communities in which the researchers work. I argue that this set of imperatives enables feminist ethnographers to successfully navigate the challenges they face.

  11. Specifics of methodology historical and philosophical researh of hesychasm

    Directory of Open Access Journals (Sweden)

    A. B. Klimenko

    2014-03-01

    Full Text Available The article is devoted to the research methodology of Hesychasm, one of the most important schools of Byzantine philosophy, which played a significant role in the development of modern civilization. To date, however, it remains a kind of «terra incognita» for world historical and philosophical thought. Hesychasm is a form of Christian mystical worldview embodied in particular spiritual practices that form the basis of Orthodox asceticism. Even half a century ago, the history of philosophy paid no attention to the philosophical and theological teachings of the authors of late antiquity and the early Middle Ages, be they Christian thinkers or Neoplatonists. The era of the post-Plotinus philosophers, Neoplatonists or commentators on Aristotle, was considered a period of decline of philosophy and a time of the rise of irrationality. For the same reason it was held that the systems of Christian thinkers cannot and should not be subjects of historical and philosophical science. This fully applies to Hesychasm. However, drawing on the works of the French philosopher P. Hadot, the paper argues that philosophy in late antiquity, when Hesychasm emerged, was first of all a way of life, and therefore Hesychasm can be considered a specific philosophical school of Christian asceticism. The main modern method of historical and philosophical study is the hermeneutical reconstruction of the cultural meaning of philosophical texts; however, Hesychasm cannot be reduced to a «sum of texts» or to rational philosophical discourses. When studying it, one cannot ignore the lived experience behind the texts: the experience of inner purification and of «noetic prayer», which often has a verbal reflection. Therefore, along with the use of hermeneutic and semiotic principles in working with the texts, there is the problem of analyzing the experience of spiritual practices. This requires the use

  12. Methodology for qualification of wood-based ash according to REACH - prestudy

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeblom, Rolf [Tekedo AB, Nykoeping (Sweden); Tivegaard, Anna-Maria [SSAB Merox AB, Oxeloesund (Sweden)

    2010-02-15

    The new European Union framework directive on waste is to be implemented during the year 2010. According to this directive, much of what today is regarded as waste will instead be assessed as by-products and will in many cases fall under the new European Union regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). REACH applies in conjunction with the new European Union regulation CLP (Classification, Labelling and Packaging of substances and mixtures). There are introductory periods for both of these regulations, and in the case of CLP this concerns the transition from the present and previous rules under the dangerous substances and dangerous preparations directives (DSD and DPD, respectively). Similarly, the new framework directive on waste supersedes the previous directive and some other statements. There is a connection between the directives on waste and the rules for classification and labelling, in that the classification of waste (into the categories hazardous and non-hazardous) builds on (but is not identical to) the rules for labelling. Similarly, the national Swedish rules for acceptance of recycled material (waste) for use in geotechnical constructions relate to the provisions in REACH on assessment of chemical safety, in that both request that the risk be assessed to be small, and that the same or similar methodologies can be applied to verify this. There is a 'reference alternative' in REACH that implies substantial testing prior to registration. Registration is the key to use of a substance, whether the substance is used as such, in a mixture, or to be released from an article. However, REACH as well as CLP contains a number of provisions for using literature data, data on similar chemicals etc. in order to avoid unnecessary testing. This especially applies to testing on humans and vertebrate animals. Vaermeforsk, through its Programme on Environmentally Friendly Use of Non-Coal Ashes, has developed methodologies and

  14. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of the patient’s compliance to treatment. Also, the multiple actors involved in the patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  15. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in the patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  16. Ultra wideband antennas design, methodologies, and performance

    CERN Document Server

    Galvan-Tejada, Giselle M; Jardón Aguilar, Hildeberto

    2015-01-01

    Ultra Wideband Antennas: Design, Methodologies, and Performance presents the current state of the art of ultra wideband (UWB) antennas, from theory specific for these radiators to guidelines for the design of omnidirectional and directional UWB antennas. Offering a comprehensive overview of the latest UWB antenna research and development, this book: discusses the developed theory for UWB antennas in frequency and time domains; delivers a brief exposition of numerical methods for electromagnetics oriented to antennas; describes solid-planar equivalen

  17. Project management methodology in the public and private sector: The case of an emerging market

    Directory of Open Access Journals (Sweden)

    Azamat Oinarov

    2017-03-01

    Full Text Available Application of project management methodologies differs from country to country. The preference for a particular methodology largely depends on the specific features of the project management system in use. The aim of the paper is to draw the attention of project-involved readers to the need to develop, not a guide, but a specific project management methodology for projects in the public-private sector. The objective pursued by the paper is to provide useful recommendations for improving the existing methodologies on project management in the public-private sector. Kazakhstan’s experience in implementing project management methodologies in its public sector is sporadic, while its private sector’s use of modern methodologies builds on practices proven by external investors. Against the background of the public sector’s low exposure to the best project management methodologies, the paper reviews existing international project management methodologies and develops useful recommendations on the methodology most suitable for a developing country’s public-private sector, using Kazakhstan as an example.

  18. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  19. Poverty Assessment in the Philippines and Indonesia: A Methodological Comparison

    OpenAIRE

    David, Isidoro P.; Asra, Abuzar; Virola, Romulo A.

    1997-01-01

    Existing official poverty statistics cannot be directly utilized for cross-country comparison. This paper illustrates why. It presents an assessment of poverty measurement in the Philippines and Indonesia by examining the methodologies used and the disparity in their respective poverty statistics. More comparable poverty estimates for these countries are provided.

  20. Gas Detection for Experiments

    CERN Document Server

    Hay, D

    2001-01-01

    Flammable gases are often used in detectors for physics experiments. The storage, distribution and manipulation of such flammable gases present several safety hazards. As most flammable gases cannot be detected by human senses, specific well-placed gas detection systems must be installed. Following a request from the user group, and in collaboration with CERN safety officers, risk analyses are performed. An external contractor, who needs to receive detailed user requirements from CERN, performs the installations. The contract is awarded on a guaranteed-results basis. Co-ordination between all the CERN groups and verification of the technical installation is done by ST/AA/AS. This paper describes and focuses on the structured methodology applied to implement such installations, based on goal-directed project management (GDPM) techniques. This useful supervision tool, suited to small- and medium-sized projects, facilitates the task of co-ordinating numerous activities to achieve a completely functional system.

  1. Parameters calculation of a shielding experiment and evaluation of calculation methodology

    International Nuclear Information System (INIS)

    Gavazza, S.; Otto, A.C.; Gomes, I.C.; Maiorino, J.R.

    1986-01-01

    This text evaluates a radiation transport calculation methodology by comparing calculated reaction rates and dose rates, for neutrons and gamma rays, with experimental measurements obtained on an iron shield irradiated in the YAYOI reactor. The ENDF/B-IV and VITAMIN-C libraries and the AMPX-II modular system were employed for the generation of cross sections, which were collapsed by the ANISN code. The transport calculations were made using the DOT 3.5 code, adjusting the spectrum of the boundary source on the iron shield to the reactions and dose rates measured at the front of the shield. The distributions calculated for neutrons and gamma rays in the iron shield were consistent with the experimental measurements. (Author) [pt

  2. Experience of the application of direct democracy and mediate participation’s possibilities in territorial communities of Ukraine

    Directory of Open Access Journals (Sweden)

    N. M. Nahorna

    2016-10-01

    In view of this situation, the application of direct democracy and the possibilities of mediated participation in the territorial communities of Ukraine is increasingly relevant, both for scientific and theoretical study and for practical introduction into modern Ukrainian political life. We therefore consider that improving the mechanisms of direct democracy and mediated citizen participation should become an important task of state policy in raising the efficiency of the Ukrainian political system. At the present stage of political transformation, these mechanisms are a relevant instrument of public influence, aimed at increasing control over governmental institutions between election periods. Such experience with direct democracy and mediated participation in the territorial communities of Ukraine, together with the necessary improvement of the legal framework and more effective practical implementation of direct democracy’s mechanisms, will positively influence the development of social and political life, will allow the main means of direct democracy to be used effectively at both the local and the national level, and will stimulate the participation of the population in the political process.

  3. New Methodology For Use in Rotating Field Nuclear MagneticResonance

    Energy Technology Data Exchange (ETDEWEB)

    Jachmann, Rebecca C. [Univ. of California, Berkeley, CA (United States)

    2007-01-01

    High-resolution NMR spectra of samples with anisotropic broadening are simplified to their isotropic spectra by fast rotation of the sample at the magic angle, 54.7°. This dissertation concerns the development of novel Nuclear Magnetic Resonance (NMR) methodologies which would rotate the magnetic field instead of the sample: rotating-field NMR. It provides an overview of the NMR concepts, procedures, and experiments needed to understand the methodologies that will be used for rotating-field NMR. A simple two-dimensional shimming method based on harmonic corrector rings, which can provide arbitrary multiple-order shimming corrections, was developed for rotating-field systems, but could be used in shimming other systems as well. Those results demonstrate, for example, that quadrupolar-order shimming improves the linewidth by up to an order of magnitude. An additional order of magnitude reduction is in principle achievable by utilizing this shimming method for z-gradient correction and higher-order xy gradients. A specialized pulse sequence for the rotating-field NMR experiment is under development. The pulse sequence allows for spinning away from the magic angle and spinning slower than the anisotropic broadening. This pulse sequence is a combination of the projected magic angle spinning (p-MAS) and magic angle turning (MAT) pulse sequences. This will be useful to rotating-field NMR because there are limits on how fast a field can be spun, and spinning at the magic angle is difficult. One of the goals of this project is for rotating-field NMR to be used on biological systems. The p-MAS pulse sequence was successfully tested on bovine tissue samples, which suggests that it will be a viable methodology to use in a rotating-field setup. A side experiment on steering magnetic particles by MRI gradients was also carried out. Some movement was seen in these experiments, but further experiments would need to be done for total control over steering.

  4. New Methodology For Use in Rotating Field Nuclear MagneticResonance

    Energy Technology Data Exchange (ETDEWEB)

    Jachmann, Rebecca C. [Univ. of California, Berkeley, CA (United States)

    2007-05-18

    High-resolution NMR spectra of samples with anisotropic broadening are simplified to their isotropic spectra by fast rotation of the sample at the magic angle, 54.7°. This dissertation concerns the development of novel Nuclear Magnetic Resonance (NMR) methodologies which would rotate the magnetic field instead of the sample: rotating-field NMR. It provides an overview of the NMR concepts, procedures, and experiments needed to understand the methodologies that will be used for rotating-field NMR. A simple two-dimensional shimming method based on harmonic corrector rings, which can provide arbitrary multiple-order shimming corrections, was developed for rotating-field systems, but could be used in shimming other systems as well. Those results demonstrate, for example, that quadrupolar-order shimming improves the linewidth by up to an order of magnitude. An additional order of magnitude reduction is in principle achievable by utilizing this shimming method for z-gradient correction and higher-order xy gradients. A specialized pulse sequence for the rotating-field NMR experiment is under development. The pulse sequence allows for spinning away from the magic angle and spinning slower than the anisotropic broadening. This pulse sequence is a combination of the projected magic angle spinning (p-MAS) and magic angle turning (MAT) pulse sequences. This will be useful to rotating-field NMR because there are limits on how fast a field can be spun, and spinning at the magic angle is difficult. One of the goals of this project is for rotating-field NMR to be used on biological systems. The p-MAS pulse sequence was successfully tested on bovine tissue samples, which suggests that it will be a viable methodology to use in a rotating-field setup. A side experiment on steering magnetic particles by MRI gradients was also carried out. Some movement was seen in these experiments, but further experiments would need to be done for total control over steering.

  5. Empirical evidence of direct rebound effect in Catalonia

    International Nuclear Information System (INIS)

    Freire Gonzalez, Jaume

    2010-01-01

    This paper reviews the empirical literature concerning the direct rebound effect in households; it briefly analyzes the main theoretical and methodological aspects, and finally estimates the magnitude of direct rebound effect for all energy services using electricity in households of Catalonia (Spain) using econometric techniques. The main results show an estimated direct rebound effect of 35% in the short term and 49% in the long term. The existence of a rebound effect reduces the effectiveness of energy efficiency policies.
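
    The headline estimates above translate directly into effective savings via the standard definition of the direct rebound effect (a minimal sketch; the function name and the 100-unit example are illustrative, not taken from the paper):

```python
def effective_savings(expected_savings: float, rebound: float) -> float:
    """Energy actually saved after a direct rebound effect erodes part of
    the expected (engineering) savings. `rebound` is a fraction in [0, 1]."""
    return expected_savings * (1.0 - rebound)

# Using the abstract's short-term (35%) and long-term (49%) estimates,
# an expected saving of 100 units shrinks to roughly 65 and 51 units:
short_term = effective_savings(100.0, 0.35)  # ~65.0
long_term = effective_savings(100.0, 0.49)   # ~51.0
```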

  6. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    Energy Technology Data Exchange (ETDEWEB)

    Tarifeño-Saldivia, Ariel, E-mail: atarifeno@cchen.cl, E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo [Comisión Chilena de Energía Nuclear, Casilla 188-D, Santiago (Chile); Center for Research and Applications in Plasma Physics and Pulsed Power, P4, Santiago (Chile); Departamento de Ciencias Fisicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Republica 220, Santiago (Chile); Mayer, Roberto E. [Instituto Balseiro and Centro Atómico Bariloche, Comisión Nacional de Energía Atómica and Universidad Nacional de Cuyo, San Carlos de Bariloche R8402AGP (Argentina)

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
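
    The statistical idea described above can be sketched as follows (a hedged illustration, not the paper's actual model: the function names, the Poisson-style uncertainty propagation, and the example charges are assumptions made for this sketch): calibrate the mean charge per detected neutron in pulse mode, then estimate the number of detected events in a burst from the total accumulated charge.

```python
import statistics

def calibrate_mean_charge(single_event_charges):
    """Pulse-mode calibration: mean and spread of the charge deposited
    per single detected neutron."""
    mean_q = statistics.mean(single_event_charges)
    std_q = statistics.stdev(single_event_charges)
    return mean_q, std_q

def estimate_detected_events(total_charge, mean_q, std_q):
    """Estimate the number of detected events in a burst from the
    accumulated charge; the uncertainty combines counting statistics
    with the spread of the single-event charge distribution."""
    n = total_charge / mean_q
    # var(n) ≈ n * (1 + (std_q/mean_q)^2) for n independent events
    sigma_n = (n * (1.0 + (std_q / mean_q) ** 2)) ** 0.5
    return n, sigma_n

# Hypothetical calibration data (arbitrary charge units):
mean_q, std_q = calibrate_mean_charge([2.0, 2.2, 1.8])
n, sigma_n = estimate_detected_events(100.0, mean_q, std_q)  # n ~ 50 events
```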

  7. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    International Nuclear Information System (INIS)

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E.

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.

  8. Top Level Space Cost Methodology (TLSCM)

    Science.gov (United States)

    1997-12-02

    Contents include software tools (ACEIT), ground rules and assumptions, a typical life-cycle cost distribution, and methodologies such as cost/budget threshold and analogy. The methodology is based on real-time Air Force and space programs (Ref. 25:2-8, 2-9). ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. There is a way to use the ACEIT cost program to get a print-out of an expanded WBS. Therefore, find someone that has ACEIT experience and

  9. Phase-field simulation of peritectic solidification closely coupled with directional solidification experiments in an Al-36 wt% Ni alloy

    International Nuclear Information System (INIS)

    Siquieri, R; Emmerich, H; Doernberg, E; Schmid-Fetzer, R

    2009-01-01

    In this work we present experimental and theoretical investigations of the directional solidification of Al-36 wt% Ni alloy. A phase-field approach (Folch and Plapp 2005 Phys. Rev. E 72 011602) is coupled with the CALPHAD (calculation of phase diagrams) method to be able to simulate directional solidification of Al-Ni alloy including the peritectic phase Al3Ni. The model approach is calibrated by systematic comparison to microstructures grown under controlled conditions in directional solidification experiments. To illustrate the efficiency of the model it is employed to investigate the effect of temperature gradient on the microstructure evolution of Al-36 wt% Ni during solidification.

  10. Clinical Research with Transcranial Direct Current Stimulation (tDCS): Challenges and Future Directions

    Science.gov (United States)

    Brunoni, Andre Russowsky; Nitsche, Michael A.; Bolognini, Nadia; Bikson, Marom; Wagner, Tim; Merabet, Lotfi; Edwards, Dylan J.; Valero-Cabre, Antoni; Rotenberg, Alexander; Pascual-Leone, Alvaro; Ferrucci, Roberta; Priori, Alberto; Boggio, Paulo; Fregni, Felipe

    2011-01-01

    Background Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity, direct current to cortical areas, facilitating or inhibiting spontaneous neuronal activity. In the past ten years, the physiological mechanisms of action of tDCS have been intensively investigated, supporting the investigation of its applications in clinical neuropsychiatry and rehabilitation. However, new methodological, ethical, and regulatory issues emerge when translating the findings of preclinical and phase I studies into phase II and III clinical studies. The aim of this comprehensive review is to discuss the key challenges of this process and possible methods to address them. Methods We convened a workgroup of researchers in the field to review, discuss and provide updates and key challenges of neuromodulation use for clinical research. Main Findings/Discussion We reviewed several basic and clinical studies in the field and identified potential limitations, taking into account the particularities of the technique. We organize the findings into four topics: (i) mechanisms of action of tDCS, parameters of use and computer-based human brain modeling investigating electric current fields and magnitude induced by tDCS; (ii) methodological aspects related to the clinical research of tDCS as divided according to study phase (i.e., preclinical, phase I, phase II and phase III studies); (iii) ethical and regulatory concerns; (iv) future directions regarding novel approaches, novel devices, and future studies involving tDCS. Finally, we propose some alternative methods to facilitate clinical research on tDCS. PMID:22037126

  11. Methodology for plastic fracture - a progress report

    International Nuclear Information System (INIS)

    Wilkinson, J.P.D.; Smith, R.E.E.

    1977-01-01

    This paper describes the progress of a study to develop a methodology for plastic fracture. Such a fracture mechanics methodology, having application in the plastic region, is required to assess the margin of safety inherent in nuclear reactor pressure vessels. The initiation and growth of flaws in pressure vessels under overload conditions is distinguished by a number of unique features, such as large scale yielding, three-dimensional structural and flaw configurations, and failure instabilities that may be controlled by either toughness or plastic flow. In order to develop a broadly applicable methodology of plastic fracture, these features require the following analytical and experimental studies: development of criteria for crack initiation and growth under large scale yielding; the use of the finite element method to describe elastic-plastic behaviour of both the structure and the crack tip region; and extensive experimental studies on laboratory scale and large scale specimens, which attempt to reproduce the pertinent plastic flow and crack growth phenomena. This discussion centers on progress to date on the selection, through analysis and laboratory experiments, of viable criteria for crack initiation and growth during plastic fracture. (Auth.)

  12. New methodology for simultaneous volumetric and calorimetric measurements: Direct determination of α_p and C_p for liquids under pressure

    Energy Technology Data Exchange (ETDEWEB)

    Casas, L. M. [Departamento de Fisica Aplicada, Facultad de Ciencias Experimentales, Universidad de Vigo, Lagoas Marcosende s/n, 36310 Vigo (Spain); Plantier, F.; Bessieres, D. [Laboratoire de Thermodynamique et Energetique des Fluides Complexes-UMR 5150, Universite de Pau et des Pays de l' Adour, BP 1155, 64013 Pau (France)

    2009-12-15

    A new batch cell has been developed to measure simultaneously both the isobaric thermal expansion and the isobaric heat capacity from calorimetric measurements. The isobaric thermal expansion is directly proportional to the linear displacement of an inner flexible bellows, and the heat capacity is calculated from the calorimetric signal. The apparatus used was a commercial Setaram C-80 calorimeter, which together with this type of vessel can be operated up to 20 MPa and in the temperature range 303.15-523.15 K. In this work, calibration was carried out using 1-hexanol, and subsequently both thermophysical properties were determined for 3-pentanol, 3-ethyl-3-pentanol, and 1-octanol at atmospheric pressure, 5 and 10 MPa, and from 303.15 to 423.15 K. Finally, the experimental values were compared with the literature in order to validate this new methodology, which allows a very accurate determination of the isobaric thermal expansion and isobaric heat capacity.
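
    The volumetric side of such a measurement can be sketched from the definition of the isobaric thermal expansion, α_p = (1/V)(∂V/∂T)_p (a hedged illustration only: the bellows cross-section, the function name, and all numbers below are hypothetical, not values from the paper):

```python
def isobaric_expansion(volume: float, area: float,
                       displacement: float, delta_t: float) -> float:
    """alpha_p = (1/V) * (dV/dT)_p, approximating dV = A * dx for a
    bellows of effective cross-section A displaced by dx over a
    temperature step dT."""
    d_volume = area * displacement
    return d_volume / (volume * delta_t)

# Illustrative numbers: V = 5 cm^3, A = 1 cm^2, dx = 0.005 cm over dT = 1 K
alpha = isobaric_expansion(5.0, 1.0, 0.005, 1.0)  # ~1e-3 per kelvin
```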

  13. Direct identification of bacteria in blood culture by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry: a new methodological approach.

    Science.gov (United States)

    Kroumova, Vesselina; Gobbato, Elisa; Basso, Elisa; Mucedola, Luca; Giani, Tommaso; Fortina, Giacomo

    2011-08-15

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has recently been demonstrated to be a powerful tool for the rapid identification of bacteria from growing colonies. In order to speed up the identification of bacteria, several authors have evaluated the usefulness of MALDI-TOF MS technology for the direct and quick identification of bacteria from positive blood cultures. The results obtained so far have been encouraging but have also shown some limitations, mainly related to bacterial growth and to interfering substances present in the blood cultures. In this paper, we present a new methodological approach that we have developed to overcome these limitations, based mainly on enrichment of the sample in a growth medium before the extraction process, prior to mass spectrometric analysis. The proposed method shows important advantages for the identification of bacterial strains, yielding an increased identification score, which gives higher confidence in the results. Copyright © 2011 John Wiley & Sons, Ltd.

  14. The chatting gathering as a methodological strategy in in-service learning: moving along dialogical dynamics

    Directory of Open Access Journals (Sweden)

    María José Alonso

    2008-04-01

    Full Text Available This article focuses on an experience of in-service training carried out by a group of educators in literacy. The novelty of the undertaking lies in the methodological proposal analysed: using “chatting gatherings” as a methodological strategy, which supports critical reflection and the construction of knowledge, both in in-service training of professionals and in basic adult education. This experience reveals the nature of learning achieved through dialogical educational processes. Further, it allows us to observe the impact that they may have on the improvement of the professionals’ educational practices.

  15. Application of EIA/SEA system in land use planning: Experience from Serbia

    Directory of Open Access Journals (Sweden)

    Stojanović Božidar

    2005-01-01

    Full Text Available This paper discusses the experience and current status of EIA/SEA procedures and assessment methodologies in Serbia, aiming to propose strategies that can lead to effective integration of the SEA in spatial planning. Institutional and practical problems with regard to the regulations of EIA/SEA were considered. Experience from the past decade shows that implementation of the EIA system in Serbia has not been as effective as expected. New legislation on EIA and SEA is harmonized with the corresponding EU Directives. First steps in the application of the SEA show that the main issues are screening, scoping and decision making. According to the research results, it is suggested that extra evaluation processes should be incorporated into current assessment procedures to improve their scientific validity and integrity.

  16. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  17. Current Methodological Problems and Future Directions for Theory Development in the Psychology of Sport and Motor Behavior.

    Science.gov (United States)

    Bird, Anne Marie; Ross, Diane

    1984-01-01

    A brief history of research in sport psychology based on Lander's (1982) analysis is presented. A systematic approach to theory building is offered. Previous methodological inadequacies are identified using examples of observational learning and anxiety. (Author/DF)

  18. Thinkering through Experiments: Nurturing Transdisciplinary Approaches to the Design of Testing Tools

    Directory of Open Access Journals (Sweden)

    Kathryn B. Francis

    2017-11-01

    Full Text Available In order to assess and understand human behavior, traditional approaches to experimental design incorporate testing tools that are often artificial and devoid of corporeal features. Whilst these offer experimental control in situations in which, methodologically, real behaviors cannot be examined, there is increasing evidence that responses given in these contextually deprived experiments fail to trigger genuine responses. This may result from a lack of consideration regarding the material makeup and associations connected with the fabric of experimental tools. In a two-year collaboration, we began to experiment with the physicality of testing tools using the domain of moral psychology as a case study. This collaboration involved thinkering and prototyping methods that included direct contact and consideration of the materials involved in experimentation. Having explored the embodied nature of morality, we combined approaches from experimental psychology, moral philosophy, design thinking, and computer science to create a new testing tool for simulated moral behavior. Although the testing tool itself generated fruitful results, this paper considers the collaborative methodology through which it was produced as a route to highlight material questions within psychological research.

  19. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  20. Simulation Methodology in Nursing Education and Adult Learning Theory

    Science.gov (United States)

    Rutherford-Hemming, Tonya

    2012-01-01

    Simulation is often used in nursing education as a teaching methodology. Simulation is rooted in adult learning theory. Three learning theories, cognitive, social, and constructivist, explain how learners gain knowledge with simulation experiences. This article takes an in-depth look at each of these three theories as each relates to simulation.…

  1. Partnerships for the Design, Conduct, and Analysis of Effectiveness, and Implementation Research: Experiences of the Prevention Science and Methodology Group

    Science.gov (United States)

    Brown, C. Hendricks; Kellam, Sheppard G.; Kaupert, Sheila; Muthén, Bengt O.; Wang, Wei; Muthén, Linda K.; Chamberlain, Patricia; PoVey, Craig L.; Cady, Rick; Valente, Thomas W.; Ogihara, Mitsunori; Prado, Guillermo J.; Pantin, Hilda M.; Gallo, Carlos G.; Szapocznik, José; Czaja, Sara J.; McManus, John W.

    2012-01-01

    What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design, but they determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786

  2. Diretrizes metodológicas para investigar estados alterados de consciência e experiências anômalas Methodological guidelines to explore altered states of consciousness and anomalous experiences

    Directory of Open Access Journals (Sweden)

    Alexander Moreira de Almeida

    2003-01-01

    other set of data. This article presents some methodological guidelines for exploring these experiences, among them: avoiding prejudiced approaches and the "pathologizing" of the unusual; the value of theory and a comprehensive review of the literature; using various concepts of pathology and normality; investigating both clinical and non-clinical populations; developing new research instruments; care in choosing terms and in deciding on causal nexus; distinguishing experiences from their interpretations; taking into account the role of culture; evaluating the validity and reliability of reports; and, last but not least, creativity and diversity in the choice of methods.

  3. Phenomenology as a potential methodology for subjective knowing in science education research

    OpenAIRE

    Koopman, Oscar

    2015-01-01

    This paper charts the journey that led to the author's discovery of phenomenology as a potential research methodology in the field of science education, and describes the impact on his own thinking and approach of his encounters with the work of Husserl and Heidegger, Merleau-Ponty and Van Manen. Drawing on this theoretical framework, the author argues that, as a methodology for investigating scientific thinking in relation to life experience, learning and curriculum design, phenomenology not...

  4. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers who otherwise depend on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of SOVs from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured according to this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
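
    The safety-factor check at the heart of the verification step can be illustrated with the textbook ideal air-gap attraction force formula; all numbers below are assumptions for illustration, not the paper's design values:

    ```python
    # Sketch of a solenoid design verification check: theoretical attraction
    # force vs. required force, reported as a safety factor. The formula
    # F = (N*I)^2 * mu0 * A / (2 g^2) is the standard ideal air-gap relation;
    # turns, current, gap, area and required force below are invented.
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

    def attraction_force(n_turns, current, gap, core_area):
        """Ideal air-gap attraction force of a solenoid actuator, in newtons."""
        return (n_turns * current) ** 2 * MU0 * core_area / (2.0 * gap ** 2)

    def safety_factor(n_turns, current, gap, core_area, required_force):
        """Ratio of theoretical attraction force to the force the valve needs."""
        return attraction_force(n_turns, current, gap, core_area) / required_force

    # 500 turns, 1 A, 1 mm gap, 1 cm^2 core, 10 N required (all assumed)
    sf = safety_factor(n_turns=500, current=1.0, gap=1e-3,
                       core_area=1e-4, required_force=10.0)
    ```

    A safety factor above 1 indicates the design attraction force exceeds the requirement; the verification methodology in the abstract compares such theoretical values against prototype tests.
    
    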

  5. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
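
    The abstract describes a methodology and architecture rather than a single algorithm; as a hedged illustration of one basic building block behind many multisensor fusion implementations, inverse-variance weighting of two independent measurements can be sketched as follows (the sensor roles and numbers are invented):

    ```python
    # Minimal fusion primitive: combine two independent estimates of the same
    # quantity by inverse-variance weighting. The fused variance is always
    # smaller than either input variance.

    def fuse(z1, var1, z2, var2):
        """Inverse-variance weighted fusion of two scalar estimates."""
        w1, w2 = 1.0 / var1, 1.0 / var2
        z = (w1 * z1 + w2 * z2) / (w1 + w2)
        return z, 1.0 / (w1 + w2)

    # e.g. a noisy camera range (10 m, var 4) and a precise lidar range (12 m, var 1)
    z, var = fuse(10.0, 4.0, 12.0, 1.0)
    ```

    The fused estimate lands closer to the more precise sensor, which is the behaviour a consistent world model needs when sensors of different quality are added or replaced.
    
    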

  6. Headspace mass spectrometry methodology: application to oil spill identification in soils

    Energy Technology Data Exchange (ETDEWEB)

    Perez Pavon, J.L.; Garcia Pinto, C.; Moreno Cordero, B. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Guerrero Pena, A. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Laboratorio de Suelos, Plantas y Aguas, Campus Tabasco, Colegio de Postgraduados, Cardenas, Tabasco (Mexico)

    2008-05-15

    In the present work we report the results obtained with a methodology based on direct coupling of a headspace generator to a mass spectrometer for the identification of different types of petroleum crudes in polluted soils. With no prior treatment, the samples are subjected to the headspace generation process and the volatiles generated are introduced directly into the mass spectrometer, thereby obtaining a fingerprint of volatiles in the sample analysed. The mass spectrum corresponding to the mass/charge ratios (m/z) contains the information related to the composition of the headspace and is used as the analytical signal for the characterization of the samples. The signals obtained for the different samples were treated by chemometric techniques to obtain the desired information. The main advantage of the proposed methodology is that no prior chromatographic separation and no sample manipulation are required. The method is rapid, simple and, in view of the results, highly promising for the implementation of a new approach for oil spill identification in soils. (orig.)

  7. The Location Choice of Foreign Direct Investments

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard; Geisler Asmussen, Christian; Weatherall, Cecilie Dohlmann

    2017-01-01

    The choice of location of foreign direct investments (FDI) by multinational enterprises (MNEs) has been the subject of intense scrutiny for decades and continues to be so. Yet, the vast diversity in methodological approaches, levels of analysis, and empirical evidence precludes a comprehensive...

  8. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
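
    The assertion-based dynamic testing described above can be sketched as follows; the parameter names and limits are invented for illustration and are not from the study:

    ```python
    # Executable assertions monitoring a control-loop state, in the spirit of
    # the dynamic testing experiment. Note how an assertion on one parameter
    # (altitude rate) also collaterally tests the command path feeding it.

    def check_assertions(state, log):
        """Evaluate runtime assertions against a controller state dict."""
        violations = []
        # direct range check on a single parameter (limits are assumed)
        if not (-30.0 <= state["elevator_deg"] <= 30.0):
            violations.append("elevator_deg out of range")
        # plausibility check that indirectly exercises other parameters
        if abs(state["alt_rate_mps"]) > 50.0:
            violations.append("alt_rate_mps implausible")
        for v in violations:
            log.append(v)
        return not violations

    log = []
    ok = check_assertions({"elevator_deg": 45.0, "alt_rate_mps": 3.0}, log)
    ```

    In the study's watchdog-task variant, such checks run in a parallel Ada task rather than inline in the control loop.
    
    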

  9. Reflections on Using Pinhole Photography as a Pedagogical and Methodological Tool with Adolescents in Wild Nature

    Science.gov (United States)

    Socha, Teresa; Potter, Tom; Potter, Stephanie; Jickling, Bob

    2016-01-01

    This paper shares our experiences using pinhole photography with adolescents as both a pedagogical tool to support and deepen adolescent experiences in wild nature, and as a visual methodological tool to elucidate their experiences. Reflecting on a journey that explored the nature-based experiences of two adolescents on a family canoe trip in…

  10. Changes in the reflectance of ex situ leaves: A methodological approach

    Science.gov (United States)

    Ponzoni, Flavio Jorge; Inoe, Mario Takao

    1992-04-01

    The main aspects of the interaction between electromagnetic radiation and detached leaves are presented. An experiment with Eucalipto and Araucaria detached leaves is described, including the description of the methodologies utilized in the collection and storage of the reflectance.

  11. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
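
    The clustering step compares model-error samples with the Kruskal-Wallis test. A minimal from-scratch sketch of that statistic is below; in practice one would use a library implementation such as scipy.stats.kruskal, and the two groups of error samples are invented, standing in for errors observed in two different flow regimes:

    ```python
    # Kruskal-Wallis H statistic, computed from scratch: rank all samples
    # jointly (average ranks for ties), then compare per-group rank sums.
    # A large H suggests the groups' error distributions differ, i.e. the
    # regimes should be clustered separately.

    def kruskal_wallis_h(*groups):
        data = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
        ranks = {}          # group index -> list of ranks
        i, n = 0, len(data)
        while i < n:
            j = i
            while j + 1 < n and data[j + 1][0] == data[i][0]:
                j += 1      # extend over tied values
            avg = (i + j) / 2 + 1  # average 1-based rank of the tied block
            for k in range(i, j + 1):
                ranks.setdefault(data[k][1], []).append(avg)
            i = j + 1
        rank_sums = {gi: sum(r) for gi, r in ranks.items()}
        return 12.0 / (n * (n + 1)) * sum(
            rank_sums[gi] ** 2 / len(groups[gi]) for gi in rank_sums
        ) - 3 * (n + 1)

    low_flow = [0.02, 0.05, -0.01, 0.03]    # invented model errors, regime A
    high_flow = [0.20, 0.25, 0.18, 0.22]    # invented model errors, regime B
    h = kruskal_wallis_h(low_flow, high_flow)
    ```

    Here the groups do not overlap at all, so H takes its maximum value for two groups of four samples, well above the 0.05 critical value of about 3.84.
    
    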

  12. Progress in indirect and direct-drive planar experiments on hydrodynamic instabilities at the ablation front

    Energy Technology Data Exchange (ETDEWEB)

    Casner, A., E-mail: alexis.casner@cea.fr; Masse, L.; Huser, G.; Galmiche, D.; Liberatore, S.; Riazuelo, G. [CEA, DAM, DIF, F-91297 Arpajon (France); Delorme, B. [CEA, DAM, DIF, F-91297 Arpajon (France); CELIA, University of Bordeaux-CNRS-CEA, F-33400 Talence (France); Martinez, D.; Remington, B.; Smalyuk, V. A. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Igumenshchev, I.; Michel, D. T.; Froula, D.; Seka, W.; Goncharov, V. N. [Laboratory of Laser Energetics, Rochester, New York 14623-1299 (United States); Olazabal-Loumé, M.; Nicolaï, Ph.; Breil, J.; Tikhonchuk, V. T. [CELIA, University of Bordeaux-CNRS-CEA, F-33400 Talence (France); Fujioka, S. [Institute of Laser Engineering, Osaka University, Suita, Osaka 565 (Japan); and others

    2014-12-15

    Understanding and mitigating hydrodynamic instabilities and the fuel mix are the key elements for achieving ignition in Inertial Confinement Fusion. Cryogenic indirect-drive implosions on the National Ignition Facility have evidenced that the ablative Rayleigh-Taylor Instability (RTI) is a driver of the hot spot mix. This motivates the switch to a more flexible higher adiabat implosion design [O. A. Hurricane et al., Phys. Plasmas 21, 056313 (2014)]. The shell instability is also the main candidate for performance degradation in low-adiabat direct drive cryogenic implosions [Goncharov et al., Phys. Plasmas 21, 056315 (2014)]. This paper reviews recent results acquired in planar experiments performed on the OMEGA laser facility and devoted to the modeling and mitigation of hydrodynamic instabilities at the ablation front. In application to the indirect-drive scheme, we describe results obtained with a specific ablator composition such as the laminated ablator or a graded-dopant emulator. In application to the direct drive scheme, we discuss experiments devoted to the study of laser imprinted perturbations with special phase plates. The simulations of the Richtmyer-Meshkov phase reversal during the shock transit phase are challenging, and of crucial interest because this phase sets the seed of the RTI growth. Recent works were dedicated to increasing the accuracy of measurements of the phase inversion. We conclude by presenting a novel imprint mitigation mechanism based on the use of underdense foams. The foams induce laser smoothing by parametric instabilities thus reducing the laser imprint on the CH foil.

  13. NPDGamma: A Measurement of the Parity Violating Directional γ-Ray Asymmetry in Polarized Cold Neutron Capture on Hydrogen

    International Nuclear Information System (INIS)

    Fomin, Nadia

    2009-01-01

    The NPDGamma experiment aims to measure the correlation between the neutron spin and the direction of the emitted photon in neutron-proton capture at low momentum transfer. An up-down parity violating asymmetry from this process can be related to the strength of the hadronic weak interaction between nucleons. The first phase of the experiment was completed in 2006 at LANSCE. The methodology will be discussed and preliminary results will be presented. The next run will start in 2009 at the SNS at ORNL with many improvements that will yield a measurement with a projected statistical error of 1×10^-8, 20% of the predicted value for the asymmetry. This will allow the determination of the long-range π contribution in the weak interaction between nucleons.
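
    As a back-of-the-envelope sketch (not from the paper): an up-down counting asymmetry and the 1/sqrt(N) scaling of its statistical error, which shows why on the order of 10^16 detected gammas are needed to reach an error near 1×10^-8:

    ```python
    # Counting asymmetry A = (N_up - N_down) / (N_up + N_down) and its
    # statistical error for small A, sigma_A ~ 1/sqrt(N_total).
    # The counts below are invented for illustration.
    import math

    def asymmetry(n_up, n_down):
        return (n_up - n_down) / (n_up + n_down)

    def stat_error(n_total):
        """Statistical error of a small counting asymmetry."""
        return 1.0 / math.sqrt(n_total)

    a = asymmetry(1_000_050, 999_950)   # a small up-down excess
    n_needed = 1.0 / 1e-8 ** 2          # counts required for sigma_A = 1e-8
    ```
    
    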

  14. A bench-scale biotreatability methodology to evaluate field bioremediation

    International Nuclear Information System (INIS)

    Saberiyan, A.G.; MacPherson, J.R. Jr.; Moore, R.; Pruess, A.J.; Andrilenas, J.S.

    1995-01-01

    A bench-scale biotreatability methodology was designed to assess field bioremediation of petroleum contaminated soil samples. This methodology was performed successfully on soil samples from more than 40 sites. The methodology is composed of two phases, characterization and experimentation. The first phase is physical, chemical, and biological characterization of the contaminated soil sample. This phase determines soil parameters, contaminant type, presence of indigenous contaminant-degrading bacteria, and bacterial population size. The second phase, experimentation, consists of a respirometry test to measure the growth of microbes indirectly (via generation of CO2) and the consumption of their food source directly (via contaminant loss). Based on a Monod kinetic analysis, the half-life of a contaminant can be calculated. Abiotic losses are accounted for based on a control test. The contaminant molecular structure is used to generate a stoichiometric equation. The stoichiometric equation yields a theoretical ratio for mg of contaminant degraded per mg of CO2 produced. Data collected from the respirometry test are compared to theoretical values to evaluate bioremediation feasibility.
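
    The stoichiometric ratio and half-life calculations can be illustrated as follows; benzene and the rate constant are chosen here as examples and are not taken from the paper:

    ```python
    # Theoretical contaminant-to-CO2 mass ratio for complete mineralization,
    # illustrated for benzene: C6H6 + 7.5 O2 -> 6 CO2 + 3 H2O.
    # A first-order half-life stands in for the low-concentration limit of
    # the Monod kinetic analysis described in the abstract.
    import math

    M_BENZENE = 6 * 12.011 + 6 * 1.008   # g/mol
    M_CO2 = 12.011 + 2 * 15.999          # g/mol

    def mg_contaminant_per_mg_co2(mol_co2_per_mol, m_contaminant, m_co2=M_CO2):
        """Mass of contaminant degraded per unit mass of CO2 produced."""
        return m_contaminant / (mol_co2_per_mol * m_co2)

    def half_life(k_per_day):
        """First-order half-life in days for rate constant k (1/day)."""
        return math.log(2) / k_per_day

    ratio = mg_contaminant_per_mg_co2(6, M_BENZENE)  # ~0.30 mg benzene per mg CO2
    t_half = half_life(0.05)                         # assumed k = 0.05 / day
    ```

    Comparing the measured contaminant loss per unit of CO2 evolved against such a theoretical ratio is how the methodology separates mineralization from abiotic losses.
    
    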

  15. Research on the Effects of Process Parameters on Surface Roughness in Wet-Activated Silicon Direct Bonding Base on Orthogonal Experiments

    Directory of Open Access Journals (Sweden)

    Lei NIE

    2015-11-01

    Full Text Available Surface roughness is a very important index in silicon direct bonding and it is affected by processing parameters in the wet-activated process. These parameters include the concentration of the activation solution, holding time and treatment temperature. The effects of these parameters were investigated by means of orthogonal experiments. In order to analyze the wafer roughness more accurately, the bearing ratio of the surface was used as the evaluation index. From the results of the experiments, it could be concluded that the concentration of the activation solution affected the roughness directly: the higher the concentration, the lower the roughness. Holding time did not affect the roughness as acutely as the concentration did, but a reduced activation time decreased the roughness perceptibly. It was also discovered that the treatment temperature had a weak correlation with the surface roughness. Based on these conclusions, the parameters of concentration, temperature and holding time were optimized respectively as NH4OH:H2O2=1:1 (without water, 70 °C and 5 min. The results of bonding experiments proved the validity of the conclusions of the orthogonal experiments. DOI: http://dx.doi.org/10.5755/j01.ms.21.4.9711
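
    The main-effect (range) analysis typically used with orthogonal arrays can be sketched as below; the L4(2^3) design and the roughness values are invented for illustration, not the paper's data:

    ```python
    # Main-effect analysis of an orthogonal array: average the response at
    # each level of one factor; the level-to-level range indicates how
    # strongly that factor drives the response.

    # each run: ((concentration level, time level, temperature level), roughness)
    runs = [
        ((0, 0, 0), 1.8),
        ((0, 1, 1), 1.9),
        ((1, 0, 1), 1.2),
        ((1, 1, 0), 1.4),
    ]

    def main_effects(runs, factor):
        """Mean response at each level of one factor column."""
        sums, counts = {}, {}
        for levels, y in runs:
            lv = levels[factor]
            sums[lv] = sums.get(lv, 0.0) + y
            counts[lv] = counts.get(lv, 0) + 1
        return {lv: sums[lv] / counts[lv] for lv in sums}

    conc = main_effects(runs, 0)                 # concentration effect
    rng = max(conc.values()) - min(conc.values())  # its range
    ```

    A large range for the concentration column, as in this toy data, mirrors the paper's finding that solution concentration dominates the roughness response.
    
    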

  16. Learning Disability: Experience of Diagnosis

    Science.gov (United States)

    Kenyon, Elinor; Beail, Nigel; Jackson, Tom

    2014-01-01

    Studies have focused on the experience of diagnosis from the perspectives of parents of children with learning disabilities, but there has been limited methodologically rigorous investigation into the experience for the person themselves. Eight participants were recruited from a range of different backgrounds. Interviews were analysed using…

  17. Sliding spool design for reducing the actuation forces in direct operated proportional directional valves: Experimental validation

    International Nuclear Information System (INIS)

    Amirante, Riccardo; Distaso, Elia; Tamburrano, Paolo

    2016-01-01

    Highlights: • An innovative procedure to design a commercial proportional directional valve is shown. • Experimental tests are performed to demonstrate the flow force reduction. • The design is improved by means of a previously made optimization procedure. • Great reduction in the flow forces without reducing the flow rate is demonstrated. - Abstract: This paper presents the experimental validation of a new methodology for the design of the spool surfaces of four way three position direct operated proportional directional valves. The proposed methodology is based on the re-design of both the compensation profile (the central conical surface of the spool) and the lateral surfaces of the spool, in order to reduce the flow forces acting on the spool and hence the actuation forces. The aim of this work is to extend the application range of these valves to higher values of pressure and flow rate, thus avoiding the employment of more expensive two stage configurations in the case of high-pressure conditions and/or flow rate. The paper first presents a theoretical approach and a general strategy for the sliding spool design to be applied to any four way three position direct operated proportional directional valve. Then, the proposed approach is experimentally validated on a commercially available valve using a hydraulic circuit capable of measuring the flow rate as well as the actuation force over the entire spool stroke. The experimental results, performed using both the electronic driver provided by the manufacturer and a manual actuation system, show that the novel spool surface requires remarkably lower actuation forces compared to the commercial configuration, while maintaining the same flow rate trend as a function of the spool position.

  18. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well defined RF large voltage swing signals for on wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  19. Learning and Experience

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2017-01-01

    Abstract: This chapter introduces a psycho-societal approach to theorizing learning, combining a materialist theory of socialization with a hermeneutic interpretation methodology. The term "approach" indicates the intrinsic connection between theory, empirical research process and epistemic subject. Learning is theorized as dynamic subjective experience of (socially situated) realities, counting on individual subjectivity as well as subjective aspects of social interaction. This psycho-societal theory of subjective experiences conceptualizes individual psychic development as interactional experience of societal relations, producing an inner psycho-dynamic as a conscious and unconscious individual resource in future life. The symbolization of immediate sensual experiences forms an individual life experience of social integration, language use being the medium of collective, social experience (knowledge

  20. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
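
    UVM itself is a SystemVerilog library; the following Python toy only mirrors the coverage-driven verification loop the abstract describes: random stimuli, a self-checking scoreboard against a golden model, and coverage bins that measure verification progress. The DUT, bins and counts are all invented.

    ```python
    # Toy coverage-driven verification loop: random stimulus generation,
    # scoreboard comparison against a reference model, and coverage bins.
    import random

    def dut_adder(a, b):
        """'Device under test': an 8-bit adder with wrap-around."""
        return (a + b) & 0xFF

    def reference_model(a, b):
        """Golden model the scoreboard checks against."""
        return (a + b) % 256

    random.seed(1)
    coverage = {"overflow": False, "zero_result": False}
    for _ in range(200):
        a, b = random.randrange(256), random.randrange(256)
        got, want = dut_adder(a, b), reference_model(a, b)
        assert got == want, f"scoreboard mismatch for {a}+{b}"
        if a + b > 255:
            coverage["overflow"] = True      # overflow bin exercised
        if got == 0:
            coverage["zero_result"] = True   # zero-result bin exercised

    progress = sum(coverage.values()) / len(coverage)  # coverage metric
    ```

    Unexercised bins point the testbench developer at functionality the random stimuli have not yet reached, which is exactly how CDV differs from directed testing.
    
    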

  1. The Artistic Infant Directed Performance: A Microanalysis of the Adult's Movements and Sounds.

    Science.gov (United States)

    Español, Silvia; Shifres, Favio

    2015-09-01

    Intersubjectivity experiences established between adults and infants are partially determined by the particular ways in which adults are active in front of babies. A substantial body of research focuses on the "musicality" of infant-directed speech (defined melodic contours, tonal and rhythm variations, etc.) and its role in linguistic enculturation. However, researchers have recently suggested that adults also bring a multimodal performance to infants. Accordingly, some scholars seem to find indicators of the genesis of the performing arts (mainly music and dance) in such multimodal stimulation. We analyze the adult performance using analytical categories and methodologies of analysis broadly validated in the fields of music performance and movement analysis in contemporary dance. We present microanalyses of an adult–7-month-old infant interaction scene that evidenced structural aspects of infant-directed multimodal performance compatible with music and dance structures, and suggest functions of adult performance similar to performing arts functions or related to them.

  2. State-of-the-art report on the current status of methodologies for seismic PSA

    International Nuclear Information System (INIS)

    1998-01-01

    unlikely to exist at any other plant, even another similar plant. Sometimes the issue is site-related, and sometimes it is design-related. Also, even in areas where earthquakes are very uncommon phenomena, these types of accident sequences often appear as important contributors to the residual risk, typically because in such areas the attention given to designing nuclear stations against earthquakes is much less than in earthquake-prone areas. Given this background, it is obvious that no full-scope PSA can be considered complete without an examination of earthquakes. This report is a review of the methodology for conducting a seismic-PSA at a nuclear power station. The objective of this review is as follows: To provide an up-to-date review of the state-of-the-art of the various sub-methodologies that comprise the overall seismic- PSA methodology for addressing the safety of nuclear power stations, plus an overview of the whole methodological picture. In preparing this review, the author has had in mind several categories of readers and users: policy-level decision-makers (such as managers of nuclear power stations and regulators of nuclear safety), seismic- PSA practitioners, and PSA practitioners more broadly. The review will concentrate on evaluating the extent to which today's seismic-PSA methodology produces reliable and useful results and insights, at its current state-of-the-art level, for assessing nuclear-power station safety. Also, this review paper will deal exclusively with seismic-PSA for addressing nuclear-power-station safety. Because the author is based in the U.S., it is natural that this review will contain more emphasis on U.S. experience than on experience in other countries. However, significant experience elsewhere is a major part of the basis for this evaluation. 
In summary, this report is an up-to-date review of the state-of-the-art of the methodologies for conducting a seismic- PSA at a nuclear power station, including the six sub-methodologies that

  3. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

    according to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations...... and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate, from...... for setting limit values for SUs or if more detailed information should be gained by taking methodological considerations into account....

  4. Practical Aspects of Research Monitoring: Methodological and Functional Solutions

    Directory of Open Access Journals (Sweden)

    A A Onosov

    2013-12-01

    Full Text Available The article describes the experience of designing, testing and implementing the National system of monitoring the quality of meteorological services in Russia. Within the framework of this project a large-scale research program was carried out aimed to develop the conception, methodology, research tools and design of customer assessment of the Roshydromet services.

  5. Kinematics modeling and simulation of an autonomous omni-directional mobile robot

    Directory of Open Access Journals (Sweden)

    Daniel Garcia Sillas

    2015-05-01

    Full Text Available Although robotics has progressed to the extent that it has become relatively accessible with low-cost projects, there is still a need to create models that accurately represent the physical behavior of a robot. Creating a completely virtual platform allows us to test behavior algorithms such as those implemented using artificial intelligence, and additionally, it enables us to find potential problems in the physical design of the robot. The present work describes a methodology for the construction of a kinematic model and a simulation of the autonomous robot, specifically of an omni-directional wheeled robot. This paper presents the kinematic model development and its implementation using several tools. The result is a model that follows the kinematics of a triangular omni-directional mobile wheeled robot, which is then tested by using a 3D model imported from 3D Studio® and Matlab® for the simulation. The environment used for the experiment is very close to the real environment and reflects the kinematic characteristics of the robot.
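
    A minimal version of the inverse kinematics of a triangular omni-wheel robot can be sketched as follows; the wheel placement angles and the centre distance L are assumed values for illustration, not taken from the paper:

    ```python
    # Inverse kinematics of a three-wheel omni-directional robot: rim speed
    # of each wheel for a desired body twist (vx, vy, omega). Wheels sit at
    # 120-degree intervals around the centre at distance L (assumed 0.15 m).
    import math

    L = 0.15  # m, wheel distance from robot centre (assumed)
    WHEEL_ANGLES = [math.radians(a) for a in (90, 210, 330)]

    def inverse_kinematics(vx, vy, omega):
        """Rim speed of each omni wheel for the body velocity (vx, vy, omega)."""
        return [
            -math.sin(t) * vx + math.cos(t) * vy + L * omega
            for t in WHEEL_ANGLES
        ]

    # Pure rotation: all three wheels run at the same speed L * omega.
    speeds = inverse_kinematics(0.0, 0.0, 2.0)
    ```

    For pure translation the three wheel contributions cancel appropriately (their sum is zero by symmetry), which is a quick sanity check when validating such a model against a simulation.
    
    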

  6. Mid-callosal plane determination using preferred directions from diffusion tensor images

    Science.gov (United States)

    Costa, André L.; Rittner, Letícia; Lotufo, Roberto A.; Appenzeller, Simone

    2015-03-01

    The corpus callosum is the major brain structure responsible for inter-hemispheric communication between neurons. Many studies seek to relate corpus callosum attributes to patient characteristics, cerebral diseases and psychological disorders. Most of those studies rely on 2D analysis of the corpus callosum in the mid-sagittal plane. However, it is common to find conflicting results among studies, since many ignore methodological issues and define the mid-sagittal plane based on precarious or invalid criteria with respect to the corpus callosum. In this work we propose a novel method to determine the mid-callosal plane using the corpus callosum's internal preferred diffusion directions obtained from diffusion tensor images. This plane is analogous to the mid-sagittal plane, but intended to serve exclusively as the corpus callosum's reference. Our method elucidates the great potential that the directional information of the corpus callosum fibers has to indicate its own referential. Results from experiments with five image pairs from distinct subjects, obtained under the same conditions, demonstrate the method's effectiveness in finding the corpus callosum's symmetry axis relative to the axial plane.

  7. Design-based research as a “smart” methodology for studying learning in the context of work

    DEFF Research Database (Denmark)

    Kolbæk, Ditte

    Although Design-based Research (DBR) was developed for investigating classroom training, this paper discusses methodological issues that arise when DBR is employed to investigate learning in the context of work, as an authentic learning environment and a real-world setting for fostering learning...... and creating usable knowledge and knowing. The purpose of this paper is to provide new perspectives on DBR regarding how to conduct DBR for studying learning from experience in the context of work. The research question is: What should be considered to make DBR a smart methodology for exploring learning from experience...

  8. Safety on Judo Children: Methodology and Results

    OpenAIRE

    Sacripanti, Attilio; De Blasis, Tania

    2017-01-01

    Many doctors, although they have no firsthand experience of judo, describe it as a sport unsuitable for children. Theoretically speaking, falls caused by judo throwing techniques could be potentially dangerous, especially for kids, if poorly managed. Much research has focused on trauma or injuries occurring in judo, both during training and competition. The goal of this research is to define and apply a scientific methodology to evaluate the hazard in falls by judo throws for children...

  9. Experience of plastic surgery registrars in a European Working Time Directive compliant rota.

    Science.gov (United States)

    de Blacam, Catherine; Tierney, Sean; Shelley, Odhran

    2017-08-01

    Surgical training requires exposure to clinical decision-making and operative experience in a supervised environment. It is recognised that learning ability is compromised when fatigued. The European Working Time Directive requires a decrease in working hours, but compliance reduces trainees' clinical exposure, which has profound implications for plastic surgery training. The aim of this study was to evaluate plastic surgery registrars' experience of an EWTD-compliant rota, and to examine its impact on patient care, education, and logbook activity. An electronic survey was distributed to plastic surgery registrars in a university teaching hospital. Registrars were asked to rate 31 items on a five-point Likert scale, including statements on patient care, clinical and operative duties, training, and quality-of-life. Interquartile deviations explored consensus among responses. Operative caseload was objectively evaluated using eLogbook data to compare activity at equal time points before and after implementation of the EWTD rota. Highest levels of consensus among respondents were found in positive statements addressing alertness and preparation for theatre, as well as time to read and study for exams. Registrars agreed that EWTD compliance improved their quality-of-life. However, it was felt that continuity of patient care was compromised by work hours restriction. Registrars were concerned about their operative experience. eLogbook data confirmed a fall-off in mean caseload of 31.8% compared to activity prior to EWTD rota implementation. While EWTD compliant rotas promote trainee quality-of-life and satisfaction with training, attention needs to be paid to optimising operative opportunities.

  10. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
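
    The response surface methodology mentioned in this record amounts to fitting a second-order polynomial model to responses collected over a designed experiment. The sketch below fits such a surface by least squares over a central-composite-style design; the design points and the generating surface are illustrative, not data from the paper.

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    # Full second-order model: intercept, linear, interaction, pure quadratic
    return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

# A central-composite-style design in coded units (illustrative values):
# 4 factorial points, 3 centre replicates, 4 axial points
x1 = np.array([-1., -1., 1., 1., 0., 0., 0., -1.414, 1.414, 0., 0.])
x2 = np.array([-1., 1., -1., 1., 0., 0., 0., 0., 0., -1.414, 1.414])

# Responses generated from a known surface, so the fit can be checked exactly
y = 5 + 2*x1 - x2 + 0.5*x1*x2 + 1.5*x1**2 + 0.8*x2**2

# Ordinary least-squares fit of the response surface coefficients
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), y, rcond=None)
```

    With noise-free responses the fit recovers the generating coefficients exactly; with real data, `beta` would carry the usual least-squares uncertainty.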

  11. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
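
    A vulnerability function expressed in terms of PGA, as in HAZUS-style assessment, is commonly modelled as a lognormal fragility curve. The sketch below shows that functional form; the median PGA and dispersion values are illustrative assumptions, not parameters from the paper.

```python
import math

def damage_probability(pga, median_pga, beta):
    """P(damage state reached or exceeded | PGA) from a lognormal fragility
    curve, the functional form used in HAZUS-style vulnerability assessment.

    pga, median_pga : peak ground acceleration (g); beta : log-standard
    deviation. Parameter values passed in are illustrative only."""
    z = (math.log(pga) - math.log(median_pga)) / beta
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    At the median PGA the exceedance probability is exactly 0.5, and the curve increases monotonically with shaking intensity.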

  12. Direct measurement of the cross section of neutron-neutron scattering at the YAGUAR reactor. Substantiation of the experiment technique

    International Nuclear Information System (INIS)

    Chernukhin, Yu.G.; Kandiev, Ya.Z.; Lartsev, V.D.; Levakov, B.G.; Modestov, D.G.; Simonenko, V.A.; Streltsov, S.I.; Khmel'nitskij, D.V.

    2006-01-01

    The main stage of the experiment for direct measurement of the neutron-neutron scattering cross section σ nn at low energies (E nn determination. It was shown that, to achieve the criterion ε ∼ 4%, 40-50 reactor pulses will be necessary.

  13. Evaluation and monitoring of Research and Development projects and programmes: Experience of the task force on the evaluation of TAFTIE; La valutazione e il monitoraggio dei progetti e dei programmi di R&S: esperienza della Task Force sulla valutazione di TAFTIE

    Energy Technology Data Exchange (ETDEWEB)

    Scarpitti, L [ENEA, Rome (Italy). Funzione Centrale Studi

    1996-08-01

    This paper compares the experiences in the evaluation and monitoring fields of 8 different European agencies (including ENEA, the Italian National Agency for New Technologies, Energy and the Environment) directly involved in technology transfer projects and programmes. To compare the different experiences, three levels of analysis are used: evaluation methodologies, performance indicators, and project characterisation and databases.

  14. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on the measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material...... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period....
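
    The core of the MRLS step described above is a linear unmixing: a mixed-air measurement is regressed against the known emission signatures of the candidate materials. The sketch below uses hypothetical signature values, not measured PTR-MS data.

```python
import numpy as np

# Hypothetical emission signatures: one column per material, one row per
# measured compound (arbitrary units). Illustrative values only.
signatures = np.array([
    [0.9, 0.1, 0.0],
    [0.1, 0.8, 0.2],
    [0.0, 0.1, 0.7],
    [0.2, 0.0, 0.3],
])

def unmix(mixed_sample, signatures):
    """Estimate each material's contribution to a mixed-air measurement by
    multiple regression least squares against the known signatures."""
    weights, *_ = np.linalg.lstsq(signatures, mixed_sample, rcond=None)
    return weights
```

    For a mixture synthesized from the signatures themselves, the regression recovers the contribution of each material exactly; real measurements would add noise and residual error.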

  15. Modernising educational programmes in ICT based on the Tuning methodology

    Directory of Open Access Journals (Sweden)

    Alexander Bedny

    2014-07-01

    Full Text Available An analysis is presented of the experience of modernising undergraduate educational programs using the TUNING methodology, based on the example of the area of studies “Fundamental computer science and information technology” (FCSIT implemented at Lobachevsky State University of Nizhni Novgorod (Russia. The algorithm for reforming curricula for the subject area of information technology in accordance with the TUNING methodology is explained. A comparison is drawn between the existing Russian and European standards in the area of ICT education, including the European e-Competence Framework, with the focus on relevant competences. Some guidelines for the preparation of educational programmes are also provided.

  16. A multi-modal approach to soft systems methodology

    OpenAIRE

    Bergvall-Kåreborn, Birgitta

    2002-01-01

    The main aim of my research is to explore ways of enriching Soft Systems Methodology by developing intellectual tools that can help designers to conceptualise, create and evaluate different design alternatives. This directs the focus on the methodology’s modelling phase even though some ideas related to analysis also will be presented. In order to realize this objective the study proposes the following supplements. Firstly, a framework of 15 modalities (knowledge areas) is suggested as a supp...

  17. Transcranial Direct Current Stimulation (tDCS): A Beginner's Guide for Design and Implementation

    Directory of Open Access Journals (Sweden)

    Hayley Thair

    2017-11-01

    Full Text Available Transcranial direct current stimulation (tDCS) is a popular brain stimulation method that is used to modulate cortical excitability, producing facilitatory or inhibitory effects upon a variety of behaviors. There is, however, a current lack of consensus between studies, with many results suggesting that polarity-specific effects are difficult to obtain. This article explores some of these differences and highlights the experimental parameters that may underlie their occurrence. We provide a general, practical snapshot of tDCS methodology, including what it is used for, how to use it, and considerations for designing an effective and safe experiment. Our aim is to equip researchers who are new to tDCS with the essential knowledge so that they can make informed and well-rounded decisions when designing and running successful experiments. By summarizing the varied approaches, stimulation parameters, and outcomes, this article should help inform future tDCS research in a variety of fields.

  18. Transcranial Direct Current Stimulation (tDCS): A Beginner's Guide for Design and Implementation

    Science.gov (United States)

    Thair, Hayley; Holloway, Amy L.; Newport, Roger; Smith, Alastair D.

    2017-01-01

    Transcranial direct current stimulation (tDCS) is a popular brain stimulation method that is used to modulate cortical excitability, producing facilitatory or inhibitory effects upon a variety of behaviors. There is, however, a current lack of consensus between studies, with many results suggesting that polarity-specific effects are difficult to obtain. This article explores some of these differences and highlights the experimental parameters that may underlie their occurrence. We provide a general, practical snapshot of tDCS methodology, including what it is used for, how to use it, and considerations for designing an effective and safe experiment. Our aim is to equip researchers who are new to tDCS with the essential knowledge so that they can make informed and well-rounded decisions when designing and running successful experiments. By summarizing the varied approaches, stimulation parameters, and outcomes, this article should help inform future tDCS research in a variety of fields. PMID:29213226

  19. Getting the astrophysics and particle physics of dark matter out of next-generation direct detection experiments

    International Nuclear Information System (INIS)

    Peter, Annika H. G.

    2010-01-01

    The next decade will bring massive new data sets from experiments of the direct detection of weakly interacting massive particle dark matter. Mapping the data sets to the particle-physics properties of dark matter is complicated not only by the considerable uncertainties in the dark-matter model, but by its poorly constrained local distribution function (the 'astrophysics' of dark matter). I propose a shift in how to think about direct-detection data analysis. I show that by treating the astrophysical and particle-physics uncertainties of dark matter on equal footing, and by incorporating a combination of data sets into the analysis, one may recover both the particle physics and astrophysics of dark matter. Not only does such an approach yield more accurate estimates of dark-matter properties, but it may illuminate how dark matter coevolves with galaxies.

  20. Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.

    Science.gov (United States)

    Lan, Y

    1992-12-01

    This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterizes the techniques in action and the recent advances in computer-based image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can produce more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.

  1. Direct Optimal Control of Duffing Dynamics

    Science.gov (United States)

    Oz, Hayrani; Ramsey, John K.

    2002-01-01

    The "direct control method" is a novel concept that is an attractive alternative and competitor to the differential-equation-based methods. The direct method is equally well applicable to nonlinear, linear, time-varying, and time-invariant systems. For all such systems, the method yields explicit closed-form control laws based on minimization of a quadratic control performance measure. We present an application of the direct method to the dynamics and optimal control of the Duffing system where the control performance measure is not restricted to a quadratic form and hence may include a quartic energy term. The results we present in this report also constitute further generalizations of our earlier work in "direct optimal control methodology." The approach is demonstrated for the optimal control of the Duffing equation with a softening nonlinear stiffness.
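
    As an illustration of the controlled Duffing dynamics this record discusses, the sketch below integrates a softening Duffing oscillator under a simple PD feedback law. The parameter values and the feedback law are assumptions for demonstration; the paper derives explicit closed-form laws from a (possibly quartic) performance measure, which is not reproduced here.

```python
import numpy as np

# Duffing oscillator: x'' + delta*x' + alpha*x + beta*x**3 = u
# beta < 0 gives the softening nonlinear stiffness mentioned in the record.
# All numeric values below are illustrative, not the paper's.
ALPHA, BETA, DELTA = 1.0, -0.2, 0.1

def duffing_step(state, u, dt):
    """One RK4 step of the controlled Duffing dynamics, state = (x, v)."""
    def f(s):
        x, v = s
        return np.array([v, u - DELTA*v - ALPHA*x - BETA*x**3])
    k1 = f(state)
    k2 = f(state + 0.5*dt*k1)
    k3 = f(state + 0.5*dt*k2)
    k4 = f(state + dt*k3)
    return state + dt/6.0 * (k1 + 2*k2 + 2*k3 + k4)

def simulate(x0, v0, gains=(4.0, 3.0), dt=0.01, steps=2000):
    """Drive the state toward the origin with an illustrative PD law
    u = -k1*x - k2*v (not the paper's direct optimal control law)."""
    s = np.array([x0, v0], float)
    for _ in range(steps):
        u = -gains[0]*s[0] - gains[1]*s[1]
        s = duffing_step(s, u, dt)
    return s
```

    Starting from rest at x = 1, the feedback regulates the state to the origin within the simulated 20 seconds.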

  2. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  3. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  4. Small Works, Big Stories. Methodological approaches to photogrammetry through crowd-sourcing experiences

    Directory of Open Access Journals (Sweden)

    Seren Griffiths

    2015-12-01

    Full Text Available A recent digital public archaeology project (HeritageTogether) sought to build a series of 3D digital models using photogrammetry from crowd-sourced images. The project saw over 13,000 digital images donated, and resulted in models of some 78 sites, providing resources for researchers as well as condition surveys. The project demonstrated that digital public archaeology does not stop at the 'trowel's edge', and that collaborative post-excavation analysis and the generation of research processes are as important as time in the field. We emphasise in this contribution that our methodologies, as much as our research outputs, can be fruitfully co-produced in public archaeology projects.

  5. Orientaciones metodológicas de la disciplina anatomía humana en las sedes universitarias municipales Methodological directions on human anatomy discipline in the municipal university venues

    Directory of Open Access Journals (Sweden)

    Iraida Hidalgo Gato Castillo

    2007-06-01

    Full Text Available The teaching and learning process at the municipal university venues is in the charge of comprehensive general practitioners (family doctors), so the Human Anatomy teaching staff drew up methodological directions characterizing the Human Anatomy discipline, explaining the forms of organization of teaching through five systems: objectives, knowledge, skills, classes and evaluation. Basic, complementary, auxiliary and reference bibliography is recommended, as well as independent study, thus guaranteeing the methodological preparation of all the facilitators committed to the teaching and learning process of the current training model.

  6. The interface between film and social roles in docudrama: A case study of directing methodology of Želimir Žilnik

    Directory of Open Access Journals (Sweden)

    Vojnović Miljan

    2015-01-01

    Full Text Available Theoretical observations, as well as descriptions of applied concepts in this paper, represent one segment of broader research by the author in the field of filmology and the creative stage process. The author applies a comparative analysis of dramatic rules through the psychotherapeutic method of psychodrama to the case of the film methodology of the director Želimir Žilnik. The aim of the paper is to consider the correlation of psychodramatic role theory with the characteristics of a specific documentary subgenre - docudrama - by identifying key elements of stage-expression realism and the implied methodological tools in the creative process of the scenic and dramatic treatment of social topics. Our starting point is that the psychodramatic initiation of a spontaneous expression on stage, by recognizing the characteristics of the social roles of an authentic personality - a character in real life - can create a functional methodological discourse for the interpretation of the treated content in film or theatre.

  7. Daycare Staff Emotions and Coping Related to Children of Divorce: A Q Methodological Study

    Science.gov (United States)

    Øverland, Klara; Størksen, Ingunn; Bru, Edvin; Thorsen, Arlene Arstad

    2014-01-01

    This Q methodological study explores emotional experiences and coping of daycare staff when working with children of divorce and their families. Two main coping strategies among daycare staff were identified: 1) Confident copers, and 2) Non-confident copers. Interviews exemplify the two main experiences. Both groups may struggle with coping in…

  8. DIRECTIONS FOR EFFECTIVE USE OF FOREST RESOURCES IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Mariana Svyntukh

    2015-11-01

    Full Text Available Purpose. The aim of the article is the determination and substantiation of directions for the rational use of forest resources in Ukraine. Methodology of research. The theoretical and methodological basis of the research is provided by economic theory, sustainable development, environmental economics and the economics of forest exploitation. The following methodological tools and techniques were used to achieve this goal: methods of analysis and synthesis (to identify problems in the relationship between the use of forest resource potential and the factors influencing its reproduction, and to study the essence of the term “forest resources”); the monographic method (to study the experience of forming rational use of forest resources and wood waste); the systematic approach (in substantiating the use of instruments for regulating forest exploitation); scientific abstraction (in studying the capabilities to ensure rational reproduction of forest resources); and the graphic method (for visual presentation of some analytical observations). Results. A theoretical approach to forest regeneration as a major task in forest management, which includes the integrated use of all available organizational and technological measures to facilitate its natural regeneration, has been formulated. The regularity of ensuring the efficient use of waste wood at harvesting sites has been established, and its forms for future use have been identified and systematized. A methodical approach has been formulated to assess the effect of using wood waste for fuel production and related products during processing on the harmonization of economic and environmental interests in the area of forest exploitation. Practical implications. The obtained results are the basis for solving practical problems of integrated management of forest resources in Ukraine and of waste from forest felling at timber harvesting sites, as well as for developing a system of measures to improve the ecological and economic mechanism of

  9. Estimation of CO2 emissions from China’s cement production: Methodologies and uncertainties

    International Nuclear Information System (INIS)

    Ke, Jing; McNeil, Michael; Price, Lynn; Khanna, Nina Zheng; Zhou, Nan

    2013-01-01

    In 2010, China’s cement output was 1.9 Gt, which accounted for 56% of world cement production. Total carbon dioxide (CO2) emissions from Chinese cement production could therefore exceed 1.2 Gt. The magnitude of emissions from this single industrial sector in one country underscores the need to understand the uncertainty of current estimates of cement emissions in China. This paper compares several methodologies for calculating CO2 emissions from cement production, including the three main components of emissions: direct emissions from the calcination process for clinker production, direct emissions from fossil fuel combustion and indirect emissions from electricity consumption. This paper examines in detail the differences between common methodologies for each emission component, and considers their effect on total emissions. We then evaluate the overall level of uncertainty implied by the differences among methodologies according to recommendations of the Joint Committee for Guides in Metrology. We find a relative uncertainty in China’s cement-related emissions in the range of 10 to 18%. This result highlights the importance of understanding and refining methods of estimating emissions in this important industrial sector. - Highlights: ► CO2 emission estimates are critical given China’s cement production scale. ► Methodological differences for emission components are compared. ► Results show relative uncertainty in China’s cement-related emissions of about 10%. ► IPCC Guidelines and CSI Cement CO2 and Energy Protocol are recommended
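
    The record's three emission components and the JCGM-style uncertainty evaluation can be sketched as a root-sum-square combination of independent component uncertainties. All numbers below are hypothetical placeholders, not the paper's estimates.

```python
import math

# Illustrative component estimates (Mt CO2) paired with relative
# uncertainties; the values are hypothetical, not from the paper.
components = {
    "calcination":  (800.0, 0.10),   # clinker calcination emissions
    "fuel":         (300.0, 0.15),   # fossil fuel combustion
    "electricity":  (100.0, 0.20),   # indirect, from electricity use
}

total = sum(e for e, _ in components.values())

# Combine independent component uncertainties by root-sum-square,
# following the JCGM guidance on propagation of uncertainty
combined_abs = math.sqrt(sum((e * u) ** 2 for e, u in components.values()))
relative = combined_abs / total
```

    Note how the largest component dominates: the combined relative uncertainty lands well below the worst single component's.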

  10. Importance of the lipid peroxidation biomarkers and methodological aspects FOR malondialdehyde quantification

    Directory of Open Access Journals (Sweden)

    Denise Grotto

    2009-01-01

    Full Text Available Free radicals induce lipid peroxidation, playing an important role in pathological processes. The injury mediated by free radicals can be measured by conjugated dienes, malondialdehyde, 4-hydroxynonenal, and others. However, malondialdehyde has been singled out as the main product for evaluating lipid peroxidation. Most assays determine malondialdehyde by its reaction with thiobarbituric acid, which can be measured by indirect (spectrometric) and direct (chromatographic) methodologies. Though there is some controversy among the methodologies, the selective HPLC-based assays provide a more reliable measure of lipid peroxidation. This review describes significant aspects of MDA determination, its importance in pathologies, and the treatment of biological samples.
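
    The spectrometric quantification mentioned above reduces to a Beer-Lambert calculation on the absorbance of the MDA-TBA adduct. The molar absorptivity used below (1.56e5 L mol⁻¹ cm⁻¹ at 532 nm) is the commonly cited value for this adduct; treat it, and the 1 cm path length, as assumptions of this sketch.

```python
# Beer-Lambert estimate of MDA from TBARS absorbance at 532 nm.
# EPSILON is the commonly cited molar absorptivity of the MDA-TBA adduct;
# both constants are assumptions of this illustrative sketch.
EPSILON = 1.56e5   # L mol^-1 cm^-1
PATH_CM = 1.0      # cuvette path length, cm

def mda_concentration_uM(absorbance_532):
    """Return MDA concentration in micromol/L from measured absorbance,
    via c = A / (epsilon * l), converted from mol/L to umol/L."""
    return absorbance_532 / (EPSILON * PATH_CM) * 1e6
```

    For example, an absorbance of 0.156 corresponds to 1 µM MDA under these assumptions.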

  11. Experience-based co-design in an adult psychological therapies service.

    Science.gov (United States)

    Cooper, Kate; Gillmore, Chris; Hogg, Lorna

    2016-01-01

    Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example high levels of support available to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.

  12. An alternative methodology for the mathematical treatment of GPS positioning

    Directory of Open Access Journals (Sweden)

    Aly M. El-naggar

    2011-12-01

    In this paper a simple alternative method is developed to solve the GPS navigation equations directly, without linearization and iteration. A practical study was carried out to evaluate the new model. Performance analysis was conducted using data collected by a Trimble 4000SSE dual-frequency receiver. The results indicated that the alternative methodology is simple, fast, and accurate compared to the Taylor method.
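
    The record does not reproduce the authors' model, but one well-known way to solve the GPS pseudorange equations in closed form, without the Taylor-series linearization and iteration, is Bancroft's method. The sketch below is that classical algorithm, offered as an illustrative point of comparison rather than the paper's technique; the satellite geometry in the usage example is synthetic.

```python
import numpy as np

def lorentz(a, b):
    """Minkowski-style inner product <a,b> = a1b1 + a2b2 + a3b3 - a4b4."""
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] - a[3]*b[3]

def bancroft(sats, rho):
    """Closed-form position/clock-bias fix from >= 4 satellites.

    sats : (n, 3) satellite positions in metres; rho : (n,) pseudoranges in
    metres, modelled as ||r - s_i|| + b. Returns (position, clock_bias)."""
    B = np.column_stack([sats, rho])                 # rows: (x, y, z, rho)
    a = 0.5 * np.array([lorentz(r, r) for r in B])
    Bp = np.linalg.pinv(B)
    u = Bp @ np.ones(len(rho))
    v = Bp @ a
    # Lambda solves <u,u> L^2 + 2(<u,v> - 1) L + <v,v> = 0
    A = lorentz(u, u)
    Bq = 2.0 * (lorentz(u, v) - 1.0)
    C = lorentz(v, v)
    disc = np.sqrt(Bq*Bq - 4.0*A*C)
    M = np.diag([1.0, 1.0, 1.0, -1.0])
    best = None
    for lam in ((-Bq + disc) / (2.0*A), (-Bq - disc) / (2.0*A)):
        y = M @ (v + lam*u)                          # candidate (r, b)
        pos, bias = y[:3], y[3]
        # Keep the root whose pseudorange residuals are smaller
        res = np.linalg.norm(np.linalg.norm(sats - pos, axis=1) + bias - rho)
        if best is None or res < best[0]:
            best = (res, pos, bias)
    return best[1], best[2]
```

    With noise-free synthetic pseudoranges, the closed-form fix recovers the receiver position and clock bias essentially exactly, with no initial guess required.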

  13. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, within the black and white hat communities.
Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  14. Methodological individualism in experimental games: not so easily dismissed.

    Science.gov (United States)

    Krueger, Joachim I

    2008-06-01

    Orthodox game theory and social preference models cannot explain why people cooperate in many experimental games or how they manage to coordinate their choices. The theory of evidential decision making provides a solution, based on the idea that people tend to project their own choices onto others, whatever these choices might be. Evidential decision making preserves methodological individualism, and it works without recourse to social preferences. Rejecting methodological individualism, team reasoning is a thinly disguised resurgence of the group mind fallacy, and the experiments reported by Colman et al. [Colman, A. M., Pulford, B. D., & Rose, J. (this issue). Collective rationality in interactive decisions: Evidence for team reasoning. Acta Psychologica, doi:10.1016/j.actpsy.2007.08.003.] do not offer evidence that uniquely supports team reasoning.

  15. A Qualitative Methodology for Minority Language Media Production Research

    Directory of Open Access Journals (Sweden)

    Enrique Uribe-Jongbloed PhD

    2014-02-01

    Full Text Available This article presents a methodological construction for research on small groups of minority media producers, especially those involved in multilingual settings. The set of qualitative tools is explained and their advantages and disadvantages explored, based on the literature on the subject. The debate is then contrasted with the practical experience of applying these tools with minority language producers: indigenous and ethnic radio broadcasters in Colombia and audiovisual producers in Wales. Reflection upon the results leads to a final discussion that presents the adjustments required to increase the advantages and diminish the disadvantages of the proposed combined three-step methodology: an interview to the double (ITTD), a day of participant observation, and a final lengthier semi-structured interview.

  16. Health effects assessment of chemical exposures: ARIES methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sierra, L; Montero, M.; Rabago, I.; Vidania, R.

    1995-07-01

    In this work, we present an update of ARIES, a system designed to facilitate the assessment of human health effects produced by accidental releases of toxic chemicals. The first version of ARIES was developed in relation to Directive 82/501/EEC on major accidents in the chemical industry; its initial aim was therefore to support the assessment of effects of the chemicals covered by this directive, considering acute exposures at high concentrations. In this report, we present the current methodology, which also considers other types of exposure, such as environmental and occupational. As in previous versions, the methodology comprises two approaches: quantitative and qualitative assessment. The quantitative assessment incorporates the mathematical algorithms used to evaluate the effects produced by the most important routes of exposure (inhalation, ingestion, eye contact and skin absorption) in the short, medium and long term. It includes models that provide an accurate quantification of doses and effects, as well as simpler approaches for cases where the available information is insufficient. The qualitative assessment, designed to complement or replace the quantitative one, is implemented in an informatics system developed in Clipper, which retrieves and displays relevant toxicological information on about 100 chemicals. This information comes from the ECDIN (Environmental Chemicals Data and Information Network) database through a collaboration with the JRC-ISPRA working group. (Author) 24 refs.

  17. Health effects assessment of chemical exposures: ARIES methodology

    International Nuclear Information System (INIS)

    Sierra, L; Montero, M.; Rabago, I.; Vidania, R.

    1995-01-01

    In this work, we present an update of ARIES, a system designed to facilitate the assessment of human health effects produced by accidental releases of toxic chemicals. The first version of ARIES was developed in relation to Directive 82/501/EEC on major accidents in the chemical industry; its initial aim was therefore to support the assessment of effects of the chemicals covered by this directive, considering acute exposures at high concentrations. In this report, we present the current methodology, which also considers other types of exposure, such as environmental and occupational. As in previous versions, the methodology comprises two approaches: quantitative and qualitative assessment. The quantitative assessment incorporates the mathematical algorithms used to evaluate the effects produced by the most important routes of exposure (inhalation, ingestion, eye contact and skin absorption) in the short, medium and long term. It includes models that provide an accurate quantification of doses and effects, as well as simpler approaches for cases where the available information is insufficient. The qualitative assessment, designed to complement or replace the quantitative one, is implemented in an informatics system developed in Clipper, which retrieves and displays relevant toxicological information on about 100 chemicals. This information comes from the ECDIN (Environmental Chemicals Data and Information Network) database through a collaboration with the JRC-ISPRA working group. (Author) 24 refs

  18. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  19. MSSM A-funnel and the galactic center excess: prospects for the LHC and direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Freese, Katherine [Nordita (Nordic Institute for Theoretical Physics),KTH Royal Institute of Technology and Stockholm University,Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden); The Oskar Klein Center for Cosmoparticle Physics, AlbaNova University Center,University of Stockholm,10691 Stockholm (Sweden); Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); López, Alejandro [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); Shah, Nausheen R. [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); Department of Physics and Astronomy, Wayne State University,Detroit, Michigan 48201 (United States); Shakya, Bibhushan [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States)

    2016-04-11

    The pseudoscalar resonance or “A-funnel” in the Minimal Supersymmetric Standard Model (MSSM) is a widely studied framework for explaining dark matter that can yield interesting indirect detection and collider signals. The well-known Galactic Center excess (GCE) at GeV energies in the gamma ray spectrum, consistent with annihilation of a ≲40 GeV dark matter particle, has more recently been shown to be compatible with significantly heavier masses following reanalysis of the background. In this paper, we explore the LHC and direct detection implications of interpreting the GCE in this extended mass window within the MSSM A-funnel framework. We find that compatibility with relic density, signal strength, collider constraints, and Higgs data can be simultaneously achieved with appropriate parameter choices. The compatible regions give very sharp predictions of 200–600 GeV CP-odd/even Higgs bosons at low tan β at the LHC and spin-independent cross sections ≈10⁻¹¹ pb at direct detection experiments. Regardless of consistency with the GCE, this study serves as a useful template of the strong correlations between indirect, direct, and LHC signatures of the MSSM A-funnel region.

  20. DESIGN OF EXPERIMENTS IN TRUCK COMPANY

    Directory of Open Access Journals (Sweden)

    Bibiana Kaselyova

    2015-07-01

    Full Text Available Purpose: Design of experiments (DOE) is a very powerful tool for process improvement, strongly supported by the Six Sigma methodology. The approach is mostly used by large, manufacturing-oriented companies. The presented research focuses on the use of DOE in a trucking company, which is medium-sized and service-oriented. Such a study has several purposes. Firstly, the detailed description of an improvement effort based on DOE can serve as a methodological framework for companies similar to the one researched. Secondly, it provides an example of a successfully implemented low-cost design of experiments practice. Moreover, the performed experiment identifies the key factors that influence the lifetime of truck tyres. Design/methodology: The research in this paper is based on an experiment conducted in a Slovakian trucking company. It provides a detailed case study of the whole improvement effort, including problem formulation, design creation and analysis, and interpretation of the results. The company wants to improve the lifetime of its truck tyres: after fuel consumption, the consumption and replacement of tyres represent, according to the company, one of its most costly processes. The improvement effort was made through the use of the PDCA cycle. It started with an analysis of the current state of tyre consumption, in which the variability of tyre consumption across years and tyre types was investigated. The causes of tyre replacement were then identified and a screening DOE was conducted. After the screening design, a full factorial design of experiments was used to identify the main drivers of tyre deterioration and breakdowns. Based on the results of the DOE, corrective actions were proposed and implemented. Findings: Based on the performed experiment, our research describes the process of tyre use and replacement. It defines the main reasons for tyre breakdown and identifies the main drivers that influence truck tyre lifetime. Moreover, it formulates corrective actions to prolong tyre lifetime. Originality: The study represents full
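    The full factorial step described above can be sketched in a few lines of Python. The factor names, levels, and tyre lifetime figures below are invented for illustration and are not taken from the study:

```python
from itertools import product

# Hypothetical two-level factors that might influence tyre lifetime
# (illustrative names only; the study does not list its factors here).
factors = {
    "axle_position": ["front", "rear"],
    "route_type": ["highway", "regional"],
    "load": ["normal", "heavy"],
}

# Full factorial design: one run for every combination of factor levels.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2**3 = 8 runs

def main_effect(runs, responses, factor, level_hi, level_lo):
    """Average response difference between two levels of one factor."""
    hi = [r for run, r in zip(runs, responses) if run[factor] == level_hi]
    lo = [r for run, r in zip(runs, responses) if run[factor] == level_lo]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Made-up tyre lifetimes (thousand km), one per run, in run order.
responses = [120, 95, 110, 88, 118, 92, 105, 85]
effect = main_effect(runs, responses, "load", "heavy", "normal")  # -23.25
```

    A negative main effect here would mean the hypothetical "heavy" load level shortens tyre life; analogous contrasts in the study identified the drivers of tyre deterioration.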

  1. IMPLEMENTATION OF DIRECTIVE 2013/34/EU IN UKRAINE WITH INTERNATIONAL EXPERIENCE

    Directory of Open Access Journals (Sweden)

    N. Gura

    2015-10-01

    Full Text Available The basic changes contained in Directive 2013/34/EU are grouped in the article. The debatable questions of the Directive identified by European researchers are exposed. The features of the introduction of the Directive in individual EU countries at different levels of accounting regulation are distinguished, and the necessity of taking national conditions into account is substantiated. A comparative analysis of the provisions of the Directive with Ukrainian legislation is carried out, and the many questions of the Directive that are already taken into account in Ukrainian legislation are identified. Desirable changes to Ukrainian legislation are grouped in accordance with the sections of the Directive. The problem questions of the implementation of the Directive in Ukraine are exposed.

  2. Studies for the electro-magnetic calorimeter SplitCal for the SHiP experiment at CERN with shower direction reconstruction capability

    Science.gov (United States)

    Bonivento, Walter M.

    2018-02-01

    This paper describes the basic ideas and the first simulation results of a new electro-magnetic calorimeter concept, named SplitCal, aimed at optimising the measurement of photon direction in a fixed-target experiment configuration, with high photon detection efficiency. This calorimeter was designed for the invariant mass reconstruction of axion-like particles decaying into two photons in the mass range 200 MeV to 1 GeV for the proposed proton beam dump experiment SHiP at CERN. Preliminary results indicate that angular resolutions better than those obtained by past experiments can be achieved with this design. An implementation of this concept with real technologies is under study.

  3. A Methodology for Retrieving Information from Malware Encrypted Output Files: Brazilian Case Studies

    Directory of Open Access Journals (Sweden)

    Nelson Uto

    2013-04-01

    Full Text Available This article presents and explains a methodology based on cryptanalytic and reverse engineering techniques that can be employed to quickly recover information from encrypted files generated by malware. The objective of the methodology is to minimize the effort spent on static and dynamic analysis by using cryptanalysis and related knowledge as much as possible. In order to illustrate how it works, we present three case studies, taken from a big Brazilian company that was victimized by directed attacks focused on stealing information from special-purpose hardware used in its environment.

  4. Qualitative interviewing: methodological challenges in Arab settings.

    Science.gov (United States)

    Hawamdeh, Sana; Raigangar, Veena

    2014-01-01

    To explore some of the main methodological challenges faced by interviewers in Arab settings, particularly during interviews with psychiatric nurses. Interviews are a tool used commonly in qualitative research. However, the cultural norms and practices of interviewees must be considered to ensure that an appropriate interviewing style is used, a good interviewee-interviewer relationship formed and consent for participation obtained sensitively. A study to explore the nature of psychiatric nurses' practices that used unstructured interviews. This is a methodology paper that discusses a personal experience of addressing many challenges that are specific to qualitative interviewing in Arab settings, supported by literature on the topic. Suggestions for improving the interview process to make it more culturally sensitive are provided and recommendations for future research are made. Openness, flexibility and a reflexive approach by the researcher can help manage challenges in Arab settings. Researchers should allow themselves to understand the cultural elements of a population to adapt interviewing methods with the aim of generating high quality qualitative research.

  5. Bayesian methodology for generic seismic fragility evaluation of components in nuclear power plants

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Campbell, R.D.; Ravindra, M.K.

    1991-01-01

    A Bayesian methodology for updating the seismic fragility of components in nuclear power plants is presented. The generic fragility data which have been evaluated based on past SPSAs are combined with seismic experience data. Although the seismic experience is limited to the acceleration range below the median capacity of the components, the evidence has been found to be effective in updating the fragility tail. In other words, the uncertainty of the fragility is reduced, although the median capacity itself is not modified to a great extent. The annual frequency of failure is also reduced as a result of updating the fragility tail. The PDF of the seismic capacity is handled in discrete form, which enables the use of an arbitrary type of prior distribution. Accordingly, a lognormal prior can be used, which is consistent with the widely used fragility model. For evaluating the posterior fragility parameters (A_m and B_U), two methods have been proposed. Furthermore, it has been found that the importance of the evidence used in the Bayesian methodology can be quantified by the entropy of the evidence: only the events with high entropy need to be considered in the Bayesian updating of the fragility. The currently available seismic experience database for typical components can be utilized to develop the fragility tail, which contributes to the seismically-induced failure frequency. The combined use of generic fragility and seismic experience data, with the aid of the Bayesian methodology, provides refined generic fragility curves which are useful for SPSA studies. (author)
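    The discrete-form Bayesian update of a lognormal fragility model can be sketched as follows. The capacity grid, the assumed logarithmic standard deviation, and the single survival observation are illustrative stand-ins, not values from the paper:

```python
import math

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(a, Am, beta):
    """Lognormal fragility model: P(failure | peak ground acceleration a)."""
    return std_normal_cdf(math.log(a / Am) / beta)

# Discrete prior over the median capacity Am (illustrative grid, in g);
# a uniform prior is used here, but any discrete prior works.
grid = [0.5 + 0.1 * i for i in range(20)]   # 0.5 .. 2.4 g
prior = [1.0 / len(grid)] * len(grid)
beta = 0.4                                  # assumed logarithmic standard deviation

# Seismic experience evidence: the component survived shaking at 0.6 g.
a_obs = 0.6
likelihood = [1.0 - fragility(a_obs, Am, beta) for Am in grid]
unnorm = [p * L for p, L in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]

# Survival evidence shifts probability mass toward higher capacities,
# thinning the low-capacity tail of the fragility estimate.
prior_mean = sum(p * Am for p, Am in zip(prior, grid))
post_mean = sum(p * Am for p, Am in zip(posterior, grid))
```

    Because the posterior is kept on a discrete grid, the prior need not be lognormal; any shape can be plugged in, which is the flexibility the abstract highlights.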

  6. Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation

    Directory of Open Access Journals (Sweden)

    Claudio Passalía

    2017-06-01

    Full Text Available An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology is initiated with a mechanistic derived reaction rate. The determination of intrinsic kinetic parameters is associated with the use of a simple geometry laboratory scale reactor, operation under kinetic control and a uniform incident radiation flux, which allows computing the local superficial rate of photon absorption. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by the combination of the mathematical model and the experimental results. The validated intrinsic kinetics obtained may be directly used in the scaling-up of any reactor configuration and size. The bench scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench scale corrugated wall reactor was used to illustrate the scaling-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.

  7. Assessing digital control system dependability using the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Garrett, C.J.; Guarro, S.B.; Apostolakis, G.E.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a methodological approach to modeling and analyzing the behavior of software-driven embedded systems for the purpose of reliability/safety assessment and verification. The methodology has two fundamental goals: (a) to identify how certain postulated events may occur in a system and (b) to identify an appropriate testing strategy based on an analysis of system functional behavior. To achieve these goals, the methodology employs a modeling framework in which system models are developed in terms of causal relationships between physical variables and temporal characteristics of the execution of software modules. These models are then analyzed to determine how a certain state (desirable or undesirable) can be reached. This is done by developing timed fault trees, which take the form of logical combinations of static trees relating system parameters at different points in time. The prime implicants (multistate analog of minimal cut sets) of the fault trees can be used to identify and eliminate system faults resulting from unanticipated combinations of software logic errors, hardware failures, and adverse environmental conditions and to direct testing activity to more efficiently eliminate implementation errors by focusing on the neighborhood of potential failure modes arising from these combinations of system conditions

  8. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
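    The iterative placement loop described above can be sketched as a greedy procedure in Python. The threat scenarios, risk weights, and coverage sets below are invented for illustration and are not from the paper:

```python
# Greedy sketch of the iterative placement idea: each candidate location
# detects (covers) a subset of weighted threat scenarios; at every step the
# sensor with the largest marginal utility is placed, and the scenarios it
# detects are removed from consideration in subsequent iterations.

def place_sensors(risk, coverage, max_sensors, min_marginal_utility):
    total_risk = sum(risk.values())
    remaining = dict(risk)
    placed = []
    for _ in range(max_sensors):
        gain = lambda s: sum(remaining.get(t, 0.0) for t in coverage[s])
        best = max(coverage, key=gain)
        utility = gain(best) / total_risk   # fraction of total risk covered
        if utility < min_marginal_utility:
            break                            # threshold stopping criterion
        placed.append((best, utility))
        for t in coverage[best]:
            remaining.pop(t, None)
    return placed

# Illustrative inputs: population-at-risk per scenario and the scenarios
# each candidate sensor location would detect.
risk = {"t1": 5.0, "t2": 3.0, "t3": 2.0}
coverage = {"A": {"t1"}, "B": {"t2", "t3"}, "C": {"t3"}}
placed = place_sensors(risk, coverage, max_sensors=3, min_marginal_utility=0.05)
print(placed)  # [('A', 0.5), ('B', 0.5)]
```

    The per-sensor utility values returned here correspond to the marginal utility quantification described in the abstract: each entry is the fraction of total risk accounted for by that sensor.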

  9. Feminist approaches to social science: epistemological and methodological tenets.

    Science.gov (United States)

    Campbell, R; Wasco, S M

    2000-12-01

    This paper is a primer for community psychologists on feminist research. Much like the field of community psychology, feminist scholarship is defined by its values and process. Informed by the political ideologies of the 1970s women's movement (liberal, radical, socialist feminism, and womanism), feminist scholars reinterpreted classic concepts in philosophy of science to create feminist epistemologies and methodologies. Feminist epistemologies, such as feminist empiricism, standpoint theory, and postmodernism, recognize women's lived experiences as legitimate sources of knowledge. Feminist methodologies attempt to eradicate sexist bias in research and find ways to capture women's voices that are consistent with feminist ideals. Practically, the process of feminist research is characterized by four primary features: (1) expanding methodologies to include both quantitative and qualitative methods, (2) connecting women for group-level data collection, (3) reducing the hierarchical relationship between researchers and their participants to facilitate trust and disclosure, and (4) recognizing and reflecting upon the emotionality of women's lives. Recommendations for how community psychologists can integrate feminist scholarship into their practice are discussed.

  10. The Societal Nature of Subjectivity: An Interdisciplinary Methodological Challenge

    Directory of Open Access Journals (Sweden)

    Henning Salling Olesen

    2012-09-01

    Full Text Available The thematic issue presents a psycho-societal approach to qualitative empirical research in several areas of everyday social life. It is an approach which integrates a theory of subjectivity with an interpretation methodology that combines hermeneutic experiences from text analysis and psychoanalysis. Its particular focus is on subjectivity—as an aspect of the research object and as an aspect of the research process. The term "approach" indicates the intrinsic connection between the theorizing of an empirical object and the reflection on the research process and the epistemic subject. In terms of methodology it revives the themes originally launched in FQS exactly ten years ago: "Subjectivity and Reflectivity in Qualitative Research" (BREUER, MRUCK & ROTH, 2002; MRUCK & BREUER, 2003). This editorial introduction presents the intellectual background of the psycho-societal methodology, reflects on its relevance and critical perspectives in a contemporary landscape of social science, and comments on the way in which an international and interdisciplinary research group has developed this approach to profane empirical research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120345

  11. Understanding palliative care on the heart failure care team: an innovative research methodology.

    Science.gov (United States)

    Lingard, Lorelei A; McDougall, Allan; Schulz, Valerie; Shadd, Joshua; Marshall, Denise; Strachan, Patricia H; Tait, Glendon R; Arnold, J Malcolm; Kimel, Gil

    2013-05-01

    There is a growing call to integrate palliative care for patients with advanced heart failure (HF). However, the knowledge to inform integration efforts comes largely from interview and survey research with individual patients and providers. This work has been critically important in raising awareness of the need for integration, but it is insufficient to inform solutions that must be enacted not by isolated individuals but by complex care teams. Research methods are urgently required to support systematic exploration of the experiences of patients with HF, family caregivers, and health care providers as they interact as a care team. To design a research methodology that can support systematic exploration of the experiences of patients with HF, caregivers, and health care providers as they interact as a care team. This article describes in detail a methodology that we have piloted and are currently using in a multisite study of HF care teams. We describe three aspects of the methodology: the theoretical framework, an innovative sampling strategy, and an iterative system of data collection and analysis that incorporates four data sources and four analytical steps. We anticipate that this innovative methodology will support groundbreaking research in both HF care and other team settings in which palliative integration efforts are emerging for patients with advanced nonmalignant disease. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  12. Methodology for Thermal Behaviour Assessment of Homogeneous Façades in Heritage Buildings

    Directory of Open Access Journals (Sweden)

    Enrique Gil

    2017-01-01

    Full Text Available It is fundamental to study the thermal behaviour in all architectural constructions throughout their useful life, in order to detect early deterioration ensuring durability, in addition to achieving and maintaining the interior comfort with the minimum energy consumption possible. This research has developed a methodology to assess the thermal behaviour of façades in heritage buildings. This paper presents methodology validation and verification (V & V through a laboratory experiment. Guidelines and conclusions are extracted with the employment of three techniques in this experiment (thermal sensors, thermal imaging camera, and 3D thermal simulation in finite element software. A small portion of a homogeneous façade has been reproduced with indoor and outdoor thermal conditions. A closed chamber was constructed with wood panels and thermal insulation, leaving only one face exposed to the outside conditions, with a heat source inside the chamber that induces a temperature gradient in the wall. With this methodology, it is possible to better understand the thermal behaviour of the façade and to detect possible damage with the calibration and comparison of the results obtained by the experimental and theoretical techniques. This methodology can be extrapolated to the analysis of the thermal behaviour of façades in heritage buildings, usually made up of homogeneous material.

  13. [Methodologic inconsistency in anamnesis education at medical schools].

    Science.gov (United States)

    Zago, M A

    1989-01-01

    Some relevant points of the process of obtaining the medical anamnesis and physical examination, and of formulating diagnostic hypotheses, are analyzed. The main methodological features include: preponderance of qualitative data, absence of preselected hypotheses, direct involvement of the observer (physician) with the data source (patient), and selection of hypotheses and changes in the patient during the process. Thus, diagnostic investigation does not follow the paradigm of the quantitative scientific method, rooted in logical positivism, which dominates medical research and education.

  14. Experiments to investigate direct containment heating phenomena with scaled models of the Calvert Cliffs Nuclear Power Plant

    International Nuclear Information System (INIS)

    Blanchat, T.K.; Pilch, M.M.; Allen, M.D.

    1997-02-01

    The Surtsey Test Facility is used to perform scaled experiments simulating High Pressure Melt Ejection accidents in a nuclear power plant (NPP). The experiments investigate the effects of direct containment heating (DCH) on the containment load. The results from Zion and Surry experiments can be extrapolated to other Westinghouse plants, but predicted containment loads cannot be generalized to all Combustion Engineering (CE) plants. Five CE plants have melt dispersal flow paths which circumvent the main mitigation of containment compartmentalization in most Westinghouse PWRs. Calvert Cliffs-like plant geometries and the impact of codispersed water were addressed as part of the DCH issue resolution. Integral effects tests were performed with a scale model of the Calvert Cliffs NPP inside the Surtsey test vessel. The experiments investigated the effects of codispersal of water, steam, and molten core simulant materials on DCH loads under prototypic accident conditions and plant configurations. The results indicated that large amounts of coejected water reduced the DCH load by a small amount. Large amounts of debris were dispersed from the cavity to the upper dome (via the annular gap). 22 refs., 84 figs., 30 tabs

  15. Effect of Direction Type, Emotional Valence of Words And Gender on Directed Forgetting

    OpenAIRE

    Sayar, Filiz

    2018-01-01

    In the present study, the effects of emotional valence of words and gender on directed forgetting were investigated. The directed forgetting effect was investigated by requiring participants to forget the words that they were to recall and, at the same time, to recall the words that they were to forget. The study was composed of two experiments. In the first experiment, the participants were presented with a list of words consisting of neutral and emotional words once, while the participants w...

  16. Design of experiments approach to engineer cell-secreted matrices for directing osteogenic differentiation.

    Science.gov (United States)

    Decaris, Martin L; Leach, J Kent

    2011-04-01

    The presentation of extracellular matrix (ECM) proteins provides an opportunity to instruct the phenotype and behavior of responsive cells. Decellularized cell-secreted matrix coatings (DM) represent a biomimetic culture surface that retains the complexity of the natural ECM. Microenvironmental culture conditions alter the composition of these matrices and ultimately the ability of DMs to direct cell fate. We employed a design of experiments (DOE) multivariable analysis approach to determine the effects and interactions of four variables (culture duration, cell seeding density, oxygen tension, and media supplementation) on the capacity of DMs to direct the osteogenic differentiation of human mesenchymal stem cells (hMSCs). DOE analysis revealed that matrices created with extended culture duration, ascorbate-2-phosphate supplementation, and in ambient oxygen tension exhibited significant correlations with enhanced hMSC differentiation. We validated the DOE model results using DMs predicted to have superior (DM1) or lesser (DM2) osteogenic potential for naïve hMSCs. Compared to cells on DM2, hMSCs cultured on DM1 expressed 2-fold higher osterix levels and deposited 3-fold more calcium over 3 weeks. Cells on DM1 coatings also exhibited greater proliferation and viability compared to DM2-coated substrates. This study demonstrates that DOE-based analysis is a powerful tool for optimizing engineered systems by identifying significant variables that have the greatest contribution to the target output.

  17. Urban Web Services—Experiences and Future Directions

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Grønbæk, Kaj

    2008-01-01

    This paper discusses experiences from implementing a mobile urban Web system using 2D visual barcodes as physical link anchors in the city and utilizing the users’ own mobile phones as interaction devices. We discuss the techniques and technologies used to create the system and the implemented...

  18. Method for the Direct Solve of the Many-Body Schrödinger Wave Equation

    Science.gov (United States)

    Jerke, Jonathan; Tymczak, C. J.; Poirier, Bill

    We report on theoretical and computational developments towards a computationally efficient direct solve of the many-body Schrödinger wave equation for electronic systems. This methodology relies on two recent developments pioneered by the authors: 1) the development of a Cardinal Sine basis for electronic structure calculations; and 2) the development of a highly efficient and compact representation of multidimensional functions using the canonical tensor rank representation developed by Beylkin et al., which we have adapted to electronic structure problems. We then show several relevant examples of the utility and accuracy of this methodology, its scaling with system size, and relevant convergence issues of the methodology.

  19. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocities and directions. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population
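    Step (iii), the frequency-weighted combination of per-scenario results, reduces to a weighted mean. In this sketch the scenario names, normalised concentrations, and observed hours are illustrative assumptions, not values from the study:

```python
# Frequency-weighted combination of CFD scenario results (illustrative values).
scenario_conc = {          # normalised concentration at one receptor, per scenario
    "SW_low":  1.8,
    "SW_high": 0.9,
    "NE_low":  2.4,
    "NE_high": 1.1,
}
scenario_hours = {         # hours each scenario was observed in the period
    "SW_low":  300,
    "SW_high": 150,
    "NE_low":  100,
    "NE_high": 170,
}

total_hours = sum(scenario_hours.values())
weights = {k: h / total_hours for k, h in scenario_hours.items()}
# Long-term average = sum of scenario concentrations weighted by frequency.
avg_conc = sum(scenario_conc[k] * weights[k] for k in scenario_conc)
```

The same weights, recomputed per month or per year, give the monthly or annually averaged concentrations compared against the observations.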

  20. Design Methodology and Performance Evaluation of New Generation Sounding Rockets

    Directory of Open Access Journals (Sweden)

    Marco Pallone

    2018-01-01

    Full Text Available Sounding rockets are currently deployed for the purpose of providing experimental data of the upper atmosphere, as well as for microgravity experiments. This work provides a methodology to design, model, and evaluate the performance of new sounding rockets. A general configuration composed of a rocket with four canards and four tail wings is sized and optimized, assuming different payload masses and microgravity durations. The aerodynamic forces are modeled with high fidelity using the interpolation of available data. Three different guidance algorithms are used for the trajectory integration: constant attitude, near radial, and sun-pointing. The sun-pointing guidance is used to obtain the best microgravity performance while maintaining a specified attitude with respect to the sun, allowing for temperature-sensitive experiments. Near radial guidance instead has the main purpose of reaching high altitudes, thus maximizing the microgravity duration. The results prove that the methodology at hand is straightforward to implement and capable of providing satisfactory performance in terms of microgravity duration.
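    The link between the burn profile and the microgravity duration can be illustrated with a toy vertical point-mass trajectory: a constant-thrust burn followed by a coast, with the microgravity window approximated as the coast time spent above a threshold altitude where drag is negligible. All numbers here are illustrative assumptions, not the paper's model:

```python
G0 = 9.81  # gravitational acceleration, m/s^2 (constant-gravity toy model)

def microgravity_duration(burn_time=40.0, thrust_accel=50.0,
                          h_micro=100e3, dt=0.1):
    """Euler-integrate a vertical point-mass flight; return seconds of coast
    spent above h_micro (a crude stand-in for the microgravity window)."""
    t = h = v = micro = 0.0
    while h >= 0.0:
        a = (thrust_accel if t < burn_time else 0.0) - G0
        v += a * dt
        h += v * dt
        if t >= burn_time and h > h_micro:
            micro += dt
        t += dt
    return micro

duration = microgravity_duration()   # roughly 230 s for these toy numbers
```

A real performance evaluation replaces the vertical kinematics with the guided 3-D equations of motion and the interpolated aerodynamic forces.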

  1. Culture of science: strange history of the methodological thinking in psychology.

    Science.gov (United States)

    Toomela, Aaro

    2007-03-01

    In pre-World-War-II psychology, two directions in methodological thought, the German-Austrian and the North American, could be differentiated. After the war, the German-Austrian methodological orientation was largely abandoned. Compared to pre-WWII German-Austrian psychology, modern mainstream psychology is more concerned with the accumulation of facts than with general theory. Furthermore, the focus on qualitative data, in addition to quantitative data, is rarely visible. Only external (physical or statistical) rather than psychological controls are taken into account in empirical studies. Fragments, rather than wholes, and relationships are studied, and single cases that contradict group data are not analyzed. Instead of complex psychological types, simple trait differences are studied, and prediction is not followed by thorough analysis of the whole situation. Last (but not least), data are not systematically related to complex theory. These limits have hindered the growth of knowledge in the behavioral sciences. A return to an updated version of the German-Austrian methodological trajectory is suggested.

  2. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor under the force-current analogy. It has the property that the force across the terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially that caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, where two pressure gauges are used to accurately measure the pressure drop across the external tube. The theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of remaining discrepancies are further analysed.
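    The terminal equation, together with the often-quoted hydraulic inertance approximation b ≈ rho * L * Ap**2 / At (piston area Ap, external-tube area At), can be sketched as follows; treat both the formula and the numbers as illustrative assumptions rather than the identified model of the prototype:

```python
def inerter_force(b, a1, a2):
    """Ideal inerter: force across the terminals is proportional to
    their relative acceleration, F = b * (a1 - a2)."""
    return b * (a1 - a2)

def fluid_inertance(rho, tube_length, piston_area, tube_area):
    """Common approximation for the inertance of the fluid column in the
    external tube of a fluid inerter (an assumption, not the paper's model)."""
    return rho * tube_length * piston_area**2 / tube_area

# Water-filled device with a 2 m external tube (illustrative numbers).
b = fluid_inertance(rho=1000.0, tube_length=2.0,
                    piston_area=1e-3, tube_area=1e-5)   # -> 200 kg
force = inerter_force(b, a1=0.5, a2=-0.5)               # 1 m/s^2 relative accel.
```

The identification task in the paper is then to augment this ideal element with the friction, stiffness and damping terms that the experimental sequence separates out.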

  3. An automated methodology development. [software design for combat simulation

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and a one-to-one correlation between object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem database directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  4. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  5. An alternative approach in operator allocation labor intensive manufacturing system: A three-phase methodology framework

    Science.gov (United States)

    Mat Rani, Ruzanita; Ismail, Wan Rosmanira

    2013-04-01

    Operator allocation is one of the most important decisions affecting productivity in a labor-intensive manufacturing system. An improper operator allocation decision will reduce the company's throughput and increase waste. Therefore, all factors, such as operators' performance and operational constraints, need to be considered in making the best operator allocation decision. Most previous studies used a two-phase methodology, in which decisions were based on operational constraints alone and all operators were treated as having the same level of performance. Therefore, in this paper a three-phase methodology is proposed to determine the optimal operator allocation. This methodology advances the existing approach by combining operators' performance with operational constraints. The methodology starts with evaluating the operators' performance, is followed by determining inputs and outputs for the operator allocation alternatives, and ends with determining the optimal operator allocation. This paper will give ideas and directions to the management of a manufacturing company in determining the optimal operator allocation decision.

  6. Tools and methodologies applied to eLearning

    OpenAIRE

    Seoane Pardo, Antonio M.; García-Peñalvo, Francisco José

    2006-01-01

    The aim of this paper is to show how eLearning technologies and methodologies should be useful for teaching and researching Logic. Firstly, a definition and explanation of eLearning and its main modalities will be given. Then, the most important elements and tools of eLearning activities will be shown. Finally, we will give three suggestions to improve the learning experience with eLearning applied to Logic.

  7. Methodology for definition of bending radius and pullback force in HDD (Horizontal Directional Drilling) operations

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Danilo Machado L. da; Rodrigues, Marcos V. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Venaas, Asle [Det Norske Veritas (DNV), Oslo (Norway); Medeiros, Antonio Roberto de [Subsea 7 (Brazil)

    2009-12-19

    Bending is a primary loading experienced by pipelines during installation and operation. Significant bending in the presence of tension is experienced during installation by the S-lay method, as the pipe conforms to the curvature of the stinger and beyond in the overbend region. Bending in the presence of external pressure is experienced in the sag bend of all major installation methods (e.g., reeling, J-lay, S-lay) as well as in free spans on the sea floor. Bending is also experienced by pipelines during installation by horizontal directional drilling. HDD procedures are increasingly being utilized around the world, not only for crossings of rivers and other obstacles but also for the shore approach of offshore pipelines. During installation the pipeline experiences a combination of tensile, bending, and compressive stresses. The magnitude of these stresses is a function of the approach angle, bending radius, pipe diameter, length of the borehole, and the soil properties at the site. The objective of this paper is to present an overview of some aspects related to bending of the product pipe during HDD operations, which is closely related to the borehole path as the pipeline conforms to the curvature of the hole. An overview of the aspects related to tensile forces is also presented. The combined effect of bending and tensile forces during the pullback operation is discussed. (author)

  8. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated as the development of a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme: chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
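    The QFD-style numerical rating can be sketched as a weighted scoring matrix: each process is rated against weighted concerns, and the weighted sum gives its replacement priority. The concerns, weights, and ratings below are invented for illustration:

```python
# Invented concerns (with importance weights) and process ratings (1..9).
concerns = {"regulatory deadline": 5, "worker exposure": 4,
            "substitute availability": 3, "process criticality": 2}

ratings = {
    "CFC-113 degreasing":   {"regulatory deadline": 9, "worker exposure": 3,
                             "substitute availability": 3, "process criticality": 9},
    "TCA surface cleaning": {"regulatory deadline": 9, "worker exposure": 9,
                             "substitute availability": 1, "process criticality": 3},
}

def priority(process):
    """Weighted sum of concern ratings: higher means replace sooner."""
    return sum(weight * ratings[process][c] for c, weight in concerns.items())

ranked = sorted(ratings, key=priority, reverse=True)
```

The point of the numerical rating is exactly this sortability: once every process has a score, research resources can be requested against a defensible ranking.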

  9. Let Me Put It Another Way: Methodological Considerations on the Use of Participatory Photography Based on an Experiment with Teenagers in Secondary Schools

    Directory of Open Access Journals (Sweden)

    Jose M. Coronel

    2013-06-01

    Full Text Available This article reflects on the use of participant photography as a methodological component of a qualitative research study into student intercultural relations in four secondary schools in Spain. Forty boys and girls took part and we selected over 400 photographs they had taken. The article draws attention to the importance of student ‘voices’ to show the interaction processes and the value of participatory photography as an approach that encourages their participation beyond the traditional interviews and field observations. The results acknowledge the value of photography to reflect the relationships among adolescents. However, while the experiment was positively rated by the participants, the study recognises the risks taken and the achievements, constraints, dilemmas and difficulties encountered by the investigators carrying out the research.

  10. Home/Work: Engaging the Methodological Dilemmas and Possibilities of Intimate Inquiry

    Science.gov (United States)

    Laura, Crystal T.

    2010-01-01

    The paucity of solutions to the persistent problem of youth entanglement with the school-to-prison pipeline demands that educational researchers experiment with research differently. In this methodological article, I briefly sketch the beginnings of an "intimate" approach to educational inquiry that researchers can use to connect with…

  11. Application of Response Surface Methodology in Optimizing a Three Echelon Inventory System

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Razavi Hajiagha

    2014-01-01

    Full Text Available Inventory control is an important subject in supply chain management. In this paper, a three-echelon production-distribution-inventory system composed of one producer, two wholesalers and a set of retailers has been considered. Customers' demands follow a compound Poisson process and the inventory policy is of the continuous review (R, Q) type. Regarding the standard cost structure in an inventory model, the cost function of the system has been approximated using Response Surface Methodology as a combination of designed experiments, simulation, regression analysis and optimization. The proposed methodology can be applied as a novel method for optimizing the inventory policy of supply chains. The joint optimization of inventory parameters, including the reorder point and batch order size, is another advantage of the proposed methodology.
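    The response-surface step amounts to fitting a quadratic model to simulated cost observations and minimising it analytically. This sketch uses a synthetic cost function with a known minimum at R = 30, Q = 60; the variable names (reorder point R, batch order size Q) follow the abstract, everything else is assumed:

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.uniform(10, 50, 40)      # reorder points tried in the simulated experiments
Q = rng.uniform(20, 100, 40)     # batch order sizes tried
cost = 500 + (R - 30)**2 + 0.5 * (Q - 60)**2   # synthetic cost, minimum at (30, 60)

# Fit cost ~ b0 + b1*R + b2*Q + b3*R^2 + b4*Q^2 + b5*R*Q by least squares.
X = np.column_stack([np.ones_like(R), R, Q, R**2, Q**2, R * Q])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

# Stationary point of the fitted quadratic: solve grad(cost) = 0.
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
g = -np.array([beta[1], beta[2]])
R_opt, Q_opt = np.linalg.solve(H, g)   # recovers (30, 60) on this noiseless data
```

In the actual methodology the synthetic cost line is replaced by runs of the supply chain simulation, and the fitted surface is re-estimated around the current best (R, Q) until it stabilises.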

  12. The ECOUTER methodology for stakeholder engagement in translational research

    OpenAIRE

    Murtagh, Madeleine J.; Minion, Joel T.; Turner, Andrew; Wilson, Rebecca C.; Blell, Mwenza; Ochieng, Cynthia; Murtagh, Barnaby; Roberts, Stephanie; Butters, Oliver W.; Burton, Paul R

    2017-01-01

    Abstract Background Because no single person or group holds knowledge about all aspects of research, mechanisms are needed to support knowledge exchange and engagement. Expertise in the research setting necessarily includes scientific and methodological expertise, but also expertise gained through the experience of participating in research and/or being a recipient of research outcomes (as a patient or member of the public). Engagement i...

  13. The development of a methodology to assess population doses from multiple sources and exposure pathways of radioactivity

    International Nuclear Information System (INIS)

    Hancox, J.; Stansby, S.; Thorne, M.

    2002-01-01

    The Environment Agency (EA) has new duties in accordance with the Basic Safety Standards Directive under which it is required to ensure that doses to individuals received from exposure to anthropogenic sources of radioactivity are within defined limits. In order to assess compliance with these requirements, the EA needs to assess the doses to members of the most highly exposed population groups ('critical' groups) from all relevant potential sources of anthropogenic radioactivity and all relevant potential exposure pathways to such radioactivity. The EA has identified a need to develop a methodology for the retrospective assessment of effective doses from multiple sources of radioactive materials and exposure pathways associated with those sources. Under contract to the EA, AEA Technology has undertaken the development of a suitable methodology as part of EA R and D Project P3-070. The methodology developed under this research project has been designed to support the EA in meeting its obligations under the Euratom Basic Safety Standards Directive and is consistent with UK and international approaches to radiation dosimetry and radiological protection. The development and trial application of the methodology is described in this report

  14. Development of the evaluation methodology for the material relocation behavior in the core disruptive accident of sodium cooled fast reactors

    International Nuclear Information System (INIS)

    Tobita, Yoshiharu; Kamiyama, Kenji; Tagami, Hirotaka; Matsuba, Ken-ichi; Suzuki, Tohru; Isozaki, Mikio; Yamano, Hidemasa; Morita, Koji; Guo, Liancheng; Zhang, Bin

    2014-01-01

    The in-vessel retention (IVR) of core disruptive accident (CDA) is of prime importance in enhancing safety characteristics of sodium-cooled fast reactors (SFRs). In the CDA of SFRs, molten core material relocates to the lower plenum of reactor vessel and may impose significant thermal load on the structures, resulting in the melt through of the reactor vessel. In order to enable the assessment of this relocation process and prove that IVR of core material is the most probable consequence of the CDA in SFRs, a research program to develop the evaluation methodology for the material relocation behavior in the CDA of SFRs has been conducted. This program consists of three developmental studies, namely the development of the analysis method of molten material discharge from the core region, the development of evaluation methodology of molten material penetration into sodium pool, and the development of the simulation tool of debris bed behavior. The analysis method of molten material discharge was developed based on the computer code SIMMER-III since this code is designed to simulate the multi-phase, multi-component fluid dynamics with phase changes involved in the discharge process. Several experiments simulating the molten material discharge through duct using simulant materials were utilized as the basis of validation study of the physical models in this code. It was shown that SIMMER-III with improved physical models could simulate the molten material discharge behavior including the momentum exchange with duct wall and thermal interaction with coolant. In order to develop evaluation methodology of molten material penetration into sodium pool, a series of experiments simulating jet penetration behavior into sodium pool in SFR thermal condition were performed. These experiments revealed that the molten jet was fragmented in significantly shorter penetration length than the prediction by existing correlation for light water reactor conditions, due to the direct

  15. Development of the evaluation methodology for the material relocation behavior in the core disruptive accident of sodium-cooled fast reactors

    International Nuclear Information System (INIS)

    Tobita, Yoshiharu; Kamiyama, Kenji; Tagami, Hirotaka; Matsuba, Ken-ichi; Suzuki, Tohru; Isozaki, Mikio; Yamano, Hidemasa; Morita, Koji; Guo, LianCheng; Zhang, Bin

    2016-01-01

    The in-vessel retention (IVR) of core disruptive accident (CDA) is of prime importance in enhancing safety characteristics of sodium-cooled fast reactors (SFRs). In the CDA of SFRs, molten core material relocates to the lower plenum of reactor vessel and may impose significant thermal load on the structures, resulting in the melt-through of the reactor vessel. In order to enable the assessment of this relocation process and prove that IVR of core material is the most probable consequence of the CDA in SFRs, a research program to develop the evaluation methodology for the material relocation behavior in the CDA of SFRs has been conducted. This program consists of three developmental studies, namely the development of the analysis method of molten material discharge from the core region, the development of evaluation methodology of molten material penetration into sodium pool, and the development of the simulation tool of debris bed behavior. The analysis method of molten material discharge was developed based on the computer code SIMMER-III since this code is designed to simulate the multi-phase, multi-component fluid dynamics with phase changes involved in the discharge process. Several experiments simulating the molten material discharge through duct using simulant materials were utilized as the basis of validation study of the physical models in this code. It was shown that SIMMER-III with improved physical models could simulate the molten material discharge behavior, including the momentum exchange with duct wall and thermal interaction with coolant. In order to develop an evaluation methodology of molten material penetration into sodium pool, a series of experiments simulating jet penetration behavior into sodium pool in SFR thermal condition were performed. These experiments revealed that the molten jet was fragmented in significantly shorter penetration length than the prediction by existing correlation for light water reactor conditions, due to the direct

  16. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    Directory of Open Access Journals (Sweden)

    Diana Guzys

    2015-05-01

    Full Text Available In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  17. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    Science.gov (United States)

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
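    Although the article's focus is Delphi's methodological grounding, the between-round mechanics are simple to state. A common quantitative step, shown in this hypothetical sketch, flags an item as having reached consensus when the interquartile range of panel ratings falls below a cut-off; the items, ratings, and cut-off are invented:

```python
from statistics import quantiles

def consensus_reached(ratings, iqr_cutoff=1.5):
    """Flag consensus when the interquartile range of ratings is small."""
    q1, _, q3 = quantiles(ratings, n=4)   # default 'exclusive' quartiles
    return (q3 - q1) <= iqr_cutoff

# Panel ratings (1-9 scale) for two survey items after one Delphi round.
round_one = {
    "item A": [7, 8, 7, 9, 8, 8],   # tight spread: feed back as agreed
    "item B": [2, 9, 5, 8, 3, 7],   # wide spread: re-circulate next round
}
reached = {item: consensus_reached(r) for item, r in round_one.items()}
```

Items that fail the cut-off are returned to the panel with the group statistics, which is the "structured anonymous conversation" the abstract describes.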

  18. Social Experiments and Participatory Research as Method

    DEFF Research Database (Denmark)

    Dirckinck-Holmfeld, Lone

    2007-01-01

    Interdisciplinary research with stakeholders and users challenges the research methodologies to be used. These have to provide a shared language for all the participants, to build up trust, and to offer insights into the diverse perspectives of the participants. Furthermore, it challenges ways to discuss and validate contributions from each other, across different criteria for each discipline and across different agendas for stakeholders, politicians, practitioners and researchers. Participatory research and social experiments are methodologies which have been developed to cope with these challenges: practice-based methods where "social experiments with technology" and "dialogue research" are the key words.

  19. Connecting experience and economy - aspects of disguised positioning

    DEFF Research Database (Denmark)

    Christensen, Bo Allesøe

    2013-01-01

    The focus of this article is the use made of experience within the literature of the "new" economic discipline of experience economy. By combining a methodological individualism with a causal and dehumanising picture of the process of experience, this discipline conceives economic interactions ...

  20. Twelve Years' Experience with Direct-to-Consumer Advertising of Prescription Drugs in Canada: A Cautionary Tale

    Science.gov (United States)

    Mintzes, Barbara; Morgan, Steve; Wright, James M.

    2009-01-01

    Background Direct-to-consumer advertising (DTCA) of prescription drugs is illegal in Canada as a health protection measure, but is permitted in the United States. However, in 2000, Canadian policy was changed to allow ‘reminder’ advertising of prescription drugs. This is a form of advertising that states the brand name without health claims. ‘Reminder’ advertising is prohibited in the US for drugs that have ‘black box’ warnings of serious risks. This study examines spending on DTCA in Canada from 1995 to 2006, 12 years spanning this policy shift. We ask how annual per capita spending compares to that in the US, and whether drugs with Canadian or US regulatory safety warnings are advertised to the Canadian public in reminder advertising. Methodology/Principal Findings Prescription drug advertising spending data were extracted from a data set on health sector spending in Canada obtained from a market research company, TNS Media Inc. Spending was adjusted for inflation and compared with US spending. Inflation-adjusted spending on branded DTCA in Canada grew from under CAD$2 million per year before 1999 to over $22 million in 2006. The major growth was in broadcast advertising, accounting for 83% of spending in 2006. US annual per capita spending was on average 24 times Canadian levels. Celebrex (celecoxib), which has a US black box and was subject to three safety advisories in Canada, was the most heavily advertised drug on Canadian television in 2005 and 2006. Of 8 brands with >$500,000 spending, which together accounted for 59% of branded DTCA in all media, 6 were subject to Canadian safety advisories, and 4 had US black box warnings. Conclusions/Significance Branded ‘reminder’ advertising has grown rapidly in Canada since 2000, mainly due to a growth in television advertising. Although DTCA spending per capita is much lower in Canada than in the US, there is no evidence of safer content or product choice; many heavily-advertised drugs in Canada have