WorldWideScience

Sample records for matter supportive methodology

  1. The Evaluation Methodology of Information Support

    Directory of Open Access Journals (Sweden)

    Lubos Necesal

    2016-01-01

    Full Text Available Knowledge, information and people are the motive force in today's organizations. Successful organizations need to find the right employees and provide them with the right, high-quality information. This is a complex problem. In a world where information plays an ever more important role, employees have to be skilled both at information activities (searching, processing, saving, etc.) and at working with the information system(s) (IS) of the organization. Organizations have to cover both these areas. Therefore, an effective instrument is needed that can be used to evaluate new employees during hiring or current employees at regular intervals, to evaluate whether the information system is an appropriate tool for fulfilling the employees' tasks within the organization, and to evaluate how well the organization covers the foregoing areas. Such an instrument is the “Evaluation methodology of information support in organization”. This paper defines the term “information support“ and its role in the organization. The body of the paper proposes the “Evaluation methodology of information support in organization”. The conclusion discusses the contributions of information support evaluation.

  2. Quark Matter 2017: Young Scientist Support

    Energy Technology Data Exchange (ETDEWEB)

    Evdokimov, Olga [University of Illinois at Chicago

    2017-07-31

    The Quark Matter conference series is among the major scientific events for the Relativistic Heavy Ion community. With a history of over 30 years, the meetings are held about every 1½ years to showcase the progress made in theoretical and experimental studies of nuclear matter under extreme conditions. The 26th International Conference on Ultra-relativistic Nucleus-Nucleus Collisions (Quark Matter 2017) was held at the Hyatt Regency Hotel in downtown Chicago from Sunday, February 5th through Saturday, February 11th, 2017. The conference featured about 180 plenary and parallel presentations of the most significant recent results in the field, a poster session for additional presentations, and an evening public lecture. Following the tradition of previous Quark Matter meetings, the first day of the conference was dedicated entirely to a special program for young scientists (graduate students and postdoctoral researchers). This grant provided financial support for 235 young physicists, facilitating their attendance at the conference.

  3. Fire Support Requirements Methodology Study, Phase 2 Proceedings of the Fire Support Methodology Workshop

    Science.gov (United States)

    1975-12-18

    It was not immediately clear that the approach would succeed in overcoming the deficiencies of present fire support methodologies, which demand an ... support require analysis up to Level 6. They also felt that deficiencies in technique were most serious at Levels 3, 4 and 5. It was accepted that ... defined as a recursion updating T_{k,t} from T_{k,t-1} in terms of M_{k,t}, where M_{k,t} refers to the number of type k targets killed in time t.

  4. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
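
    As an illustration of the low-level view described in this record, deriving efficiency metrics from raw hardware counters, the following minimal Python sketch computes two common derived metrics. The counter names are standard PAPI presets, the totals are hypothetical, and the code is not part of the OpenUH environment described in the abstract.

```python
# Illustrative sketch only: computes common derived metrics from raw hardware
# counter totals. The counter names and sample values are hypothetical and do
# not come from the OpenUH environment described in the abstract.

def derived_metrics(counters):
    """Return efficiency metrics derived from raw hardware-counter counts."""
    cycles = counters["PAPI_TOT_CYC"]
    instructions = counters["PAPI_TOT_INS"]
    l2_misses = counters["PAPI_L2_TCM"]
    l2_accesses = counters["PAPI_L2_TCA"]
    return {
        "instructions_per_cycle": instructions / cycles,
        "l2_miss_ratio": l2_misses / l2_accesses,
    }

if __name__ == "__main__":
    sample = {  # hypothetical totals for one code region
        "PAPI_TOT_CYC": 4.2e9,
        "PAPI_TOT_INS": 6.3e9,
        "PAPI_L2_TCM": 1.1e7,
        "PAPI_L2_TCA": 9.0e7,
    }
    for name, value in derived_metrics(sample).items():
        print(f"{name}: {value:.3f}")
```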

  5. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The authors address the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information.

  6. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    computing environment, the Visual Basic for Applications (VBA) programming language presents itself as our programming language of choice. We propose... background, or access to other computational programming environments, to build topic models from free-text datasets using a familiar Excel-based... environment that restricts access to other software-based text analytic tools. Opportunities to deploy developmental versions of the methodology and
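
    As a rough, language-neutral analogue of the k-means topic-model step this record describes (the record's own implementation is Excel/VBA based), the sketch below clusters TF-IDF document vectors with k-means and reports the heaviest terms of each cluster centroid. The toy documents and the scikit-learn dependency are assumptions for illustration only.

```python
# Simplified analogue of a k-means "topic model": cluster TF-IDF document
# vectors and report the highest-weight terms of each cluster centroid.
# This is not the Excel/VBA implementation described in the record.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [  # toy free-text dataset
    "engine maintenance schedule and spare parts",
    "spare parts inventory for engine repair",
    "crew training schedule and certification",
    "certification requirements for new crew training",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()

for k, centroid in enumerate(km.cluster_centers_):
    top = centroid.argsort()[::-1][:3]          # three heaviest terms
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```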

  7. Methodology and Supporting Toolset Advancing Embedded Systems Quality

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Brewka, Lukasz Jerzy

    2013-01-01

    Software quality is of primary importance in the development of embedded systems that are often used in safety-critical applications. Moreover, as the life cycles of embedded products become increasingly short, productivity and quality are simultaneously required and closely interrelated towards...... delivering competitive products. In this context, the MODUS (Methodology and supporting toolset advancing embedded systems quality) project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. This paper...... will describe the MODUS project with a focus on the technical methodologies that will be developed to advance embedded system quality....

  8. Post-Sale Customer Support Methodology in the TQM System

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Elizabeta Mitreva

    2014-06-01

    Full Text Available In this paper a survey of the activities in the post-sale period of the product is made and, based on the analysis of the results, a methodology is created that managers could use to design and implement a total quality management system. The implementation of this methodology is carried out in a simplified way and in less time, without having to study and deepen new knowledge of internal standardization, statistical process control, cost analysis and optimization of business processes. The purpose of this paper is to lay a good foundation for Macedonian companies in their post-sale activities, to help them understand the philosophy of TQM (Total Quality Management) and the benefits achieved by implementing the system, and to set strategic directions for success. These activities begin by identifying the wishes and needs of customers/users, reengineering business processes for sales support, and ensuring the satisfaction of employees and all stakeholders. As a result of the implementation of this methodology in practice, improved competitiveness, increased efficiency, reduced quality costs and increased productivity are noted. The methodology proposed in this paper brings together all the activities in the spiral of quality in a company that deals with post-sale support. Due to the necessity of a flow of information about quality across the entire enterprise, an information system is designed according to the QC-CEPyramid model in several steps.

  9. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.
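
    For readers unfamiliar with how INAA turns gamma-ray peak counts into element concentrations, the following minimal sketch illustrates the widely used relative (comparator) method, in which the sample is compared with a co-irradiated standard of known element content. It is a simplified illustration with hypothetical numbers, not a procedure prescribed by this IAEA document; corrections for decay during irradiation and counting are omitted.

```python
import math

# Minimal sketch of the INAA relative (comparator) method: the element content
# of the sample follows from the ratio of decay-corrected peak count rates of
# the sample and a co-irradiated standard of known element mass.
# All numeric values below are hypothetical.

def decay_corrected_rate(net_counts, live_time_s, cooling_time_s, half_life_s):
    """Count rate corrected back to the end of irradiation."""
    decay_constant = math.log(2.0) / half_life_s
    return (net_counts / live_time_s) * math.exp(decay_constant * cooling_time_s)

# hypothetical gamma-peak data for one element
sample_rate = decay_corrected_rate(net_counts=15400, live_time_s=3600,
                                   cooling_time_s=7200, half_life_s=4.5e4)
standard_rate = decay_corrected_rate(net_counts=52300, live_time_s=3600,
                                     cooling_time_s=5400, half_life_s=4.5e4)

standard_element_mass_ug = 2.0            # known content of the standard
element_mass_ug = standard_element_mass_ug * sample_rate / standard_rate

air_volume_m3 = 1440.0                    # hypothetical sampled air volume
print(f"element on filter: {element_mass_ug:.3f} ug")
print(f"air concentration: {1000.0 * element_mass_ug / air_volume_m3:.3f} ng/m3")
```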

  10. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described

  11. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    Science.gov (United States)

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research oriented data along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specifications of software requirements was adopted. An electronic health record (EHR) standard, EN13606 was used, and clinical modelling was done through archetypes and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data including the assessment scales provides a rich source of objective data for audits and research and to establish study feasibility and identify potential participants for the clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  12. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    Science.gov (United States)

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecasted to triple by 2025, while at the same time, the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which has led to the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 was a change to include volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of the PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. The Committee for Aviation Environmental Protection of the International Civil

  13. Mindset Matters: Supporting Student Persistence Through The Developmental Mathematics Pipeline

    OpenAIRE

    Kiser, Tracey Nicole

    2016-01-01

    Abstract of the Dissertation: Mindset Matters: Supporting Student Persistence Through The Developmental Mathematics Pipeline, by Tracey Nicole Kiser, Doctor of Education in Teaching and Learning, University of California, San Diego, 2016. Christopher P. Halter, Chair. Developmental mathematics is one of the most challenging leaks in the mathematics K-20 pipeline. Few students enter two-year colleges prepared to successfully engage in college-level mathematics classes. Many of the students who place into devel...

  14. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
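
    A minimal sketch of the kind of dependency-propagation reasoning described in this record: each system's operability is its own internal status degraded by the weighted operability of the systems it depends on. The systems, the weights, and the simple multiplicative propagation rule are hypothetical illustrations, not the actual Systems Operational Dependency Analysis formulation.

```python
# Simplified sketch of propagating operability through a system dependency
# graph. The real Systems Operational Dependency Analysis model uses its own
# parameters; here each dependency is just a weight in [0, 1], and a system's
# operability is its own internal status scaled by the weighted status of the
# systems it depends on. All systems and weights are hypothetical.

internal_status = {          # stand-alone health of each system, in [0, 1]
    "power": 0.6,            # degraded power system (injected failure)
    "comms": 1.0,
    "navigation": 1.0,
}
dependencies = {             # system -> {supplier: dependency weight}
    "comms": {"power": 0.8},
    "navigation": {"power": 0.5, "comms": 0.3},
}

def operability(system, memo=None):
    """Operability of a system including upstream dependency effects."""
    if memo is None:
        memo = {}
    if system in memo:
        return memo[system]
    value = internal_status[system]
    for supplier, weight in dependencies.get(system, {}).items():
        # a weight of 1.0 means full dependence on the supplier's operability
        value *= (1.0 - weight) + weight * operability(supplier, memo)
    memo[system] = value
    return value

for name in internal_status:
    print(f"{name}: {operability(name):.2f}")
```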

  15. A Methodology to Support Decision Making in Flood Plan Mitigation

    Science.gov (United States)

    Biscarini, C.; di Francesco, S.; Manciola, P.

    2009-04-01

    In the present paper we propose a novel methodology for supporting priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contributions and management aspects are merged to create a simplified method for basin plan implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for flood-damage reduction plan selection. Among the different criteria that are relevant to the decision process, safety and quality of human life, economic damage, expenses related to the chosen measures and environmental issues should play a fundamental role in the decisions made by the authorities. Several numerical indices, taking into account administrative, technical, economic and risk aspects, are defined and combined in a mathematical formula that defines a Priority Index (PI). In particular, the priority index establishes a ranking of priority interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, containing measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. A great effort is also devoted to making this index easy to formulate and effective, to allow a clear and transparent comparison between the alternatives. Summarizing, this document describes the major steps for incorporating risk analysis into the decision-making process: framing of the problem in terms of risk analysis, application of appropriate tools and techniques to obtain quantified results, and use of the quantified results in the choice of structural and non-structural measures. In order to prove the reliability of the proposed methodology and to show how risk-based information can be
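
    A minimal sketch of how such a Priority Index could rank interventions, assuming a weighted sum of normalized administrative, technical, economic and risk sub-indices. The interventions, weights, and the weighted-sum form are illustrative assumptions, not the formula defined in the paper.

```python
# Hypothetical illustration of ranking flood-mitigation interventions with a
# Priority Index built from normalized sub-indices. The weights and the
# weighted-sum form are assumptions; the paper defines its own formula.

interventions = {
    # name: (administrative, technical, economic, risk) sub-indices in [0, 1]
    "levee reinforcement":   (0.6, 0.8, 0.5, 0.9),
    "retention basin":       (0.7, 0.6, 0.7, 0.6),
    "early warning system":  (0.9, 0.5, 0.9, 0.4),
}
weights = (0.15, 0.25, 0.25, 0.35)   # assumed relative importance, sums to 1

def priority_index(indices):
    return sum(w * x for w, x in zip(weights, indices))

ranking = sorted(interventions.items(),
                 key=lambda item: priority_index(item[1]), reverse=True)
for name, indices in ranking:
    print(f"PI = {priority_index(indices):.2f}  {name}")
```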

  16. Effective management of changes: methodological and instrumental support

    Directory of Open Access Journals (Sweden)

    G. S. Merzlikina

    2017-01-01

    Full Text Available Small and medium-sized enterprises are characterized by maneuverability, readiness for change, and focus on innovation. But the growing instability of the external and internal environment requires the company to develop increasingly complex control systems. There were several models of management: by objectives, by processes and by changes. Achieving the goals involves the development and implementation of strategy and tactics. In business, strategy comes from the goal set by the owner before the organization. Process management describes and defines the main elements and categories of the process, observing the balance of responsibility and authority by creating a team to improve each business process. Management of changes is a special mechanism for the adoption and implementation of adequate management decisions. The article compares these management models, examines the criteria, indicators and factors for assessing the effectiveness of management of the organization. Comparative analysis showed that management of changes is more preferable for small and medium-sized businesses. Management of changes involves obtaining a certain idea of future trends in the development of the organization and the active use of entrepreneurial structure of modern management methods. This will ensure the economic stability and stability of the organization. Evaluation of the effectiveness of the enterprise can be carried out in accordance with performance indicators. The article suggests a matrix of selection of such indicators taking into account the sphere of influence. Recommendations are given on the choice of indicators of the effectiveness of achieving the goals. Also, the values under which the enterprise acquires stability of such key factors of management effectiveness as efficiency, capacity and sustainability of the organization are indicated. The theoretical and practical significance of this research is the development of methodological and

  17. Tools and methodologies to support more sustainable biofuel feedstock production.

    Science.gov (United States)

    Dragisic, Christine; Ashkenazi, Erica; Bede, Lucio; Honzák, Miroslav; Killeen, Tim; Paglia, Adriano; Semroc, Bambi; Savy, Conrad

    2011-02-01

    Increasingly, government regulations, voluntary standards, and company guidelines require that biofuel production complies with sustainability criteria. For some stakeholders, however, compliance with these criteria may seem complex, costly, or unfeasible. What existing tools, then, might facilitate compliance with a variety of biofuel-related sustainability criteria? This paper presents four existing tools and methodologies that can help stakeholders assess (and mitigate) potential risks associated with feedstock production, and can thus facilitate compliance with requirements under different requirement systems. These include the Integrated Biodiversity Assessment Tool (IBAT), the ARtificial Intelligence for Ecosystem Services (ARIES) tool, the Responsible Cultivation Areas (RCA) methodology, and the related Biofuels + Forest Carbon (Biofuel + FC) methodology.

  18. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Science.gov (United States)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both the methodologies and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
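
    As a rough analogue of the receptor-modeling step described in this record, the sketch below factors a synthetic samples-by-elements concentration matrix into non-negative source contributions and profiles using scikit-learn's NMF. True positive matrix factorization additionally weights each data point by its measurement uncertainty, which this simplified sketch omits; all data are synthetic.

```python
# Rough analogue of PMF source apportionment: factor a (samples x elements)
# concentration matrix X into non-negative contributions G and profiles F so
# that X ~= G @ F. Real PMF also weights residuals by measurement uncertainty,
# which this sketch omits. The data below are synthetic.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
true_profiles = np.array([[5.0, 1.0, 0.2, 0.0],    # e.g. a dust-like source
                          [0.1, 0.5, 4.0, 2.0]])   # e.g. a combustion-like source
true_contrib = rng.random((50, 2))                 # 50 samples, 2 sources
X = true_contrib @ true_profiles + 0.05 * rng.random((50, 4))

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)        # source contributions per sample
F = model.components_             # source profiles (element signatures)

print("recovered profiles (rows = sources, columns = elements):")
print(np.round(F, 2))
```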

  19. Pedagogical support of competence formation: methodological bases and experimental context

    OpenAIRE

    NABIEV VALERY SHARIFYANOVICH

    2016-01-01

    The article considers the problem of the methodological basis of the competence approach. It discusses topical issues of organizing a holistic educational process. The article presents the original solutions created by the author and the results of experimental verification of the specified conditions of pedagogical support of educational and training activities.

  20. Methodological Challenges in Studies Comparing Prehospital Advanced Life Support with Basic Life Support.

    Science.gov (United States)

    Li, Timmy; Jones, Courtney M C; Shah, Manish N; Cushman, Jeremy T; Jusko, Todd A

    2017-08-01

    Determining the most appropriate level of care for patients in the prehospital setting during medical emergencies is essential. A large body of literature suggests that, compared with Basic Life Support (BLS) care, Advanced Life Support (ALS) care is not associated with increased patient survival or decreased mortality. The purpose of this special report is to synthesize the literature to identify common study design and analytic challenges in research studies that examine the effect of ALS, compared to BLS, on patient outcomes. The challenges discussed in this report include: (1) choice of outcome measure; (2) logistic regression modeling of common outcomes; (3) baseline differences between study groups (confounding); (4) inappropriate statistical adjustment; and (5) inclusion of patients who are no longer at risk for the outcome. These challenges may affect the results of studies, and thus, conclusions of studies regarding the effect of level of prehospital care on patient outcomes should require cautious interpretation. Specific alternatives for avoiding these challenges are presented. Li T, Jones CMC, Shah MN, Cushman JT, Jusko TA. Methodological challenges in studies comparing prehospital Advanced Life Support with Basic Life Support. Prehosp Disaster Med. 2017;32(4):444-450.
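
    A small numeric illustration of challenge (2), logistic regression modeling of common outcomes: when the outcome is frequent, the odds ratio diverges from the risk ratio. The 2x2 counts below are invented for illustration only.

```python
# Hypothetical 2x2 example of why odds ratios mislead for common outcomes:
# when the outcome (e.g. survival) is frequent, the odds ratio from logistic
# regression is far from the risk ratio. Counts below are invented.

als_survived, als_total = 60, 100   # ALS group
bls_survived, bls_total = 50, 100   # BLS group

risk_als = als_survived / als_total
risk_bls = bls_survived / bls_total
risk_ratio = risk_als / risk_bls

odds_als = risk_als / (1 - risk_als)
odds_bls = risk_bls / (1 - risk_bls)
odds_ratio = odds_als / odds_bls

print(f"risk ratio: {risk_ratio:.2f}")   # 1.20
print(f"odds ratio: {odds_ratio:.2f}")   # 1.50, overstates the association
```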

  1. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

    This article presents the development of a Methodology of Decision Support for Work Allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in terms of their conceptual basis. The study of Times and Motion is one of these methodologies, but its applicability is restricted in cases of more complex production processes. The method presented here was developed as a result of...

  2. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1993-01-01

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of errors affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion on several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs
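
    As a generic illustration of the peak-fitting step of spectrum evaluation (not one of the dedicated ED-XRF programs this document refers to), the sketch below fits a single Gaussian peak on a linear background to a synthetic spectrum with SciPy; the detector data are simulated.

```python
# Generic sketch of one step of ED-XRF spectrum evaluation: fitting a single
# Gaussian peak on a linear background to estimate the net peak area.
# The synthetic "spectrum" below stands in for real detector data.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(x, area, centroid, sigma, bg0, bg1):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
    return gauss + bg0 + bg1 * x

channels = np.arange(200, 260, dtype=float)
rng = np.random.default_rng(1)
truth = peak_model(channels, area=5000, centroid=230, sigma=3.0, bg0=40, bg1=0.1)
counts = rng.poisson(truth).astype(float)

p0 = [counts.sum(), channels[np.argmax(counts)], 3.0, counts.min(), 0.0]
popt, pcov = curve_fit(peak_model, channels, counts, p0=p0,
                       sigma=np.sqrt(np.maximum(counts, 1)))
area, centroid = popt[0], popt[1]
print(f"net peak area: {area:.0f} counts at channel {centroid:.1f}")
```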

  3. Revisiting the dose calculation methodologies in European decision support systems

    DEFF Research Database (Denmark)

    Andersson, Kasper Grann; Roos, Per; Hou, Xiaolin

    2012-01-01

    The paper presents examples of current needs for improvement and extended applicability of the European decision support systems. The systems were originally created for prediction of the radiological consequences of accidents at nuclear installations. They could however also be of great value in...... for, to introduce new knowledge and thereby improve prognoses....

  4. A Cybernetic Design Methodology for 'Intelligent' Online Learning Support

    Science.gov (United States)

    Quinton, Stephen R.

    The World Wide Web (WWW) provides learners and knowledge workers convenient access to vast stores of information, so much so that present methods for refinement of a query or search result are inadequate - there is far too much potentially useful material. The problem often encountered is that users usually do not recognise what may be useful until they have progressed some way through the discovery, learning, and knowledge acquisition process. Additional support is needed to structure and identify potentially relevant information, and to provide constructive feedback. In short, support for learning is needed. The learning envisioned here is not simply the capacity to recall facts or to recognise objects. The focus is on learning that results in the construction of knowledge. Although most online learning platforms are efficient at delivering information, most do not provide tools that support learning as envisaged in this chapter. It is conceivable that Web-based learning environments can incorporate software systems that assist learners to form new associations between concepts and synthesise information to create new knowledge. This chapter details the rationale and theory behind a research study that aims to evolve Web-based learning environments into 'intelligent thinking' systems that respond to natural language human input. Rather than functioning simply as a means of delivering information, it is argued that online learning solutions will one day interact directly with students to support their conceptual thinking and cognitive development.

  5. A multicriteria decision support methodology for evaluating airport expansion plans

    NARCIS (Netherlands)

    Vreeker, R.; Nijkamp, P.; ter Welle, C.

    2001-01-01

    Rational decision-making requires an assessment of advantages and disadvantages of choice possibilities, including non-market effects (such as externalities). This also applies to strategic decision-making in the transport sector (including aviation). In the past decades various decision support and

  6. Organization of the Master Tutor in Higher Education: Methodological Support

    OpenAIRE

    Asya Suchanu

    2013-01-01

    The article reveals the uniqueness of tutor support in the preparation of future teachers in the humanities within the master's programme, and the ways and means of professional development of tomorrow's specialists. It substantiates the importance and meaning of tutor assistance to first-year students, which manifests itself in optimizing individual learning trajectories, leading to efficient fulfillment and positive socialization of students.

  7. Particulate Matter Filtration Design Considerations for Crewed Spacecraft Life Support Systems

    Science.gov (United States)

    Agui, Juan H.; Vijayakumar, R.; Perry, Jay L.

    2016-01-01

    Particulate matter filtration is a key component of crewed spacecraft cabin ventilation and life support system (LSS) architectures. The basic particulate matter filtration functional requirements as they relate to an exploration vehicle LSS architecture are presented. Particulate matter filtration concepts are reviewed and design considerations are discussed. A concept for a particulate matter filtration architecture suitable for exploration missions is presented. The conceptual architecture considers the results from developmental work and incorporates best practice design considerations.

  8. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Directory of Open Access Journals (Sweden)

    C. Phillips-Smith

    2017-08-01

    Full Text Available The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010–November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both the methodologies and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow

  9. Methodology development to support NPR strategic planning. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This report covers the work performed in support of the Office of New Production Reactors during the 9 month period from January through September 1990. Because of the rapid pace of program activities during this time period, the emphasis on work performed shifted from the strategic planning emphasis toward supporting initiatives requiring a more immediate consideration and response. Consequently, the work performed has concentrated on researching and helping identify and resolve those issues considered to be of most immediate concern. Even though they are strongly interrelated, they can be separated into two broad categories as follows: The first category encompasses program internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team building efforts necessary to assure the development of an effective operating organization, ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter/intra organization communications and working relationships. The second category encompasses program execution concerns. These include those efforts required in development of realistic execution plans and implementation of appropriate control mechanisms which provide for effective forecasting, planning, managing, and controlling of on-going (or soon to be) program substantive activities according to the master integrated schedule and budget

  10. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    Science.gov (United States)

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  11. Chemical plant innovative safety investments decision-support methodology.

    Science.gov (United States)

    Reniers, G L L; Audenaert, A

    2009-01-01

    This article examines the extent to which investing in safety during the creation of a new chemical installation proves profitable. The authors propose a management-supporting cost-benefit model that identifies and evaluates investments in safety within a chemical company. This innovative model differentiates between serious accidents and less serious accidents, thus providing an authentic image of prevention-related costs and benefits. In classic cost-benefit analyses, which do not make such differentiations, only a rudimentary image of the potential profitability resulting from investments in safety is obtained. The resulting management conclusions that can be drawn from such classical analyses are of a very limited nature. The proposed model, however, is applied to a real case study, and the proposed investments in safety at a specific chemical installation are weighed against the estimated hypothetical benefits resulting from the preventive measures to be installed. In the case study carried out, it would appear that the proposed prevention investments are justified. Such an economic exercise may be very important to chemical corporations trying to (further) improve their safety investments.
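
    A hypothetical sketch of the kind of differentiated cost-benefit comparison the article describes, treating serious and less serious accidents separately. All frequencies, consequences and costs are invented, and the undiscounted expected-value model here is far simpler than the authors' model.

```python
# Hypothetical sketch of a safety-investment cost-benefit check that, like the
# model described, treats serious and less serious accidents separately.
# All frequencies, consequences and costs are invented; no discounting.

horizon_years = 20
investment_cost = 2.0e6            # up-front cost of the prevention measures
annual_maintenance = 5.0e4

scenarios = {
    # type: (annual frequency without measure, frequency with measure, cost per accident)
    "serious":      (0.010, 0.002, 5.0e7),
    "less_serious": (0.500, 0.200, 2.0e5),
}

annual_benefit = sum((f0 - f1) * consequence
                     for f0, f1, consequence in scenarios.values())
total_benefit = annual_benefit * horizon_years
total_cost = investment_cost + annual_maintenance * horizon_years

print(f"expected avoided losses over {horizon_years} y: {total_benefit:,.0f}")
print(f"total prevention cost: {total_cost:,.0f}")
print("investment justified" if total_benefit > total_cost else "investment not justified")
```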

  12. Supporting the Future Total Force: A Methodology for Evaluating Potential Air National Guard Mission Assignments

    National Research Council Canada - National Science Library

    Lynch, Kristin F; Drew, John G; Sleeper, Sally; Williams, William A; Masters, James M; Luangkesorn, Louis; Tripp, Robert S; Lichter, Dahlia S; Roll, Charles R

    2007-01-01

    ... trained, highly experienced personnel with no aircraft to operate and support. The authors develop a methodology to evaluate missions that could be transferred from the active component to the ANG without significant cost to the total force...

  13. A Performance-Based Technology Assessment Methodology to Support DoD Acquisition

    National Research Council Canada - National Science Library

    Mahafza, Sherry; Componation, Paul; Tippett, Donald

    2005-01-01

    .... This methodology is referred to as Technology Performance Risk Index (TPRI). The TPRI can track technology readiness through a life cycle, or it can be used at a specific time to support a particular system milestone decision...

  14. EVALUATION OF TRAINING AND METHODOLOGICAL SUPPORT OF UNIVERSITY COURSES (in Russian)

    Directory of Open Access Journals (Sweden)

    Natalia BELKINA

    2012-04-01

    Full Text Available Quality of teaching at a Higher Education Institution certainly depends on the integrity and quality of its training and methodological support. However, in order to improve this quality it is necessary to have a sound methodology for the evaluation of such support. This article contains a list of recommended university teaching course materials, criteria for evaluating their separate components, and an approach to calculating the quality levels of separate components and of teaching course materials as a whole.

  15. Defluoridation of water using activated alumina in presence of natural organic matter via response surface methodology.

    Science.gov (United States)

    Samarghandi, Mohammad Reza; Khiadani, Mehdi; Foroughi, Maryam; Zolghadr Nasab, Hasan

    2016-01-01

    Adsorption by activated alumina is considered to be one of the most practiced methods for defluoridation of freshwater. This study was conducted, therefore, to investigate the effect of natural organic matter (NOM) on the removal of fluoride by activated alumina using response surface methodology. To the authors' knowledge, this has not been previously investigated. Physico-chemical characterization of the alumina was determined by scanning electron microscope (SEM), Brunauer-Emmett-Teller (BET), Fourier transform infrared spectroscopy (FTIR), X-ray fluorescence (XRF), and X-ray diffractometer (XRD). Response surface methodology (RSM) was applied to evaluate the single and combined effects of the independent variables, namely the initial concentration of fluoride, NOM, and pH, on the process. The results revealed that while the presence of NOM and an increase of pH enhance fluoride adsorption on the activated alumina, the initial concentration of fluoride has an adverse effect on the efficiency. The experimental data were analyzed and found to be accurately and reliably fitted to a second-order polynomial model. Under the optimum removal condition (fluoride concentration 20 mg/L, NOM concentration 20 mg/L, and pH 7) with a desirability value of 0.93 and a fluoride removal efficiency of 80.6%, no significant difference was noticed with the previously reported sequence of co-existing ion affinity to activated alumina for fluoride removal. Moreover, aluminum residual was found to be below the value recommended by the guideline for drinking water. Also, the increase of fluoride adsorption on the activated alumina as NOM concentrations increase could be due to complexation between fluoride and adsorbed NOM.
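
    A generic sketch of the response-surface fitting step: a second-order polynomial in the three factors (fluoride concentration, NOM concentration, pH), with interaction and squared terms, fitted by ordinary least squares. The design points and responses below are synthetic, not the paper's measurements.

```python
# Generic sketch of the response-surface step: fit a second-order polynomial
# (linear, interaction and squared terms) of three factors to removal
# efficiency by least squares. The data points below are synthetic, not the
# paper's measurements.
import numpy as np

def design_row(fluoride, nom, ph):
    x1, x2, x3 = fluoride, nom, ph
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

# synthetic experiments: (fluoride mg/L, NOM mg/L, pH, removal %)
experiments = [(5, 0, 5, 88), (5, 20, 9, 84), (20, 0, 9, 72),
               (20, 20, 5, 78), (12.5, 10, 7, 81), (5, 10, 7, 86),
               (20, 10, 7, 75), (12.5, 0, 5, 80), (12.5, 20, 9, 79),
               (12.5, 10, 5, 77), (12.5, 10, 9, 80), (5, 0, 9, 87)]

X = np.array([design_row(f, n, p) for f, n, p, _ in experiments], dtype=float)
y = np.array([r for *_, r in experiments], dtype=float)

coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
predicted = X @ coeffs
print("fitted coefficients:", np.round(coeffs, 3))
print("max absolute residual:", np.abs(y - predicted).max().round(2))
```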

  16. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    Full Text Available The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on the theoretical framework the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity (A&C) course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application supported by ICT using learning objects. The aforementioned course was built on the Blackboard platform, evaluating the operation of the methodologies. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by students, which verifies that the methodologies are functional.

  17. Thin-shell wormholes supported by total normal matter

    Energy Technology Data Exchange (ETDEWEB)

    Mazharimousavi, S.H.; Halilsoy, M. [Eastern Mediterranean University, Department of Physics, Gazimagusa (Turkey)

    2014-09-15

    The Zipoy-Voorhees-Weyl (ZVW) spacetime characterized by mass (M) and oblateness (δ) is proposed in the construction of viable thin-shell wormholes (TSWs). A departure from spherical/cylindrical symmetry yields a positive total energy in spite of the fact that the local energy density may take negative values. We show that oblateness of the bumpy sources/black holes can be incorporated as a new degree of freedom that may play a role in the resolution of the exotic matter problem in TSWs. A small velocity perturbation reveals, however, that the resulting TSW is unstable. (orig.)

  18. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  19. Definition of supportive care: does the semantic matter?

    Science.gov (United States)

    Hui, David

    2014-07-01

    'Supportive care' is a commonly used term in oncology; however, no consensus definition exists. This represents a barrier to communication in both the clinical and research settings. In this review, we propose a unifying conceptual framework for supportive care and discuss the proper use of this term in the clinical and research settings. A recent systematic review revealed several themes for supportive care: a focus on symptom management and improvement of quality of life, and care for patients on treatments and those with advanced stage disease. These findings are consistent with a broad definition for supportive care: 'the provision of the necessary services for those living with or affected by cancer to meet their informational, emotional, spiritual, social, or physical needs during their diagnostic, treatment, or follow-up phases encompassing issues of health promotion and prevention, survivorship, palliation, and bereavement.' Supportive care can be classified as primary, secondary, and tertiary based on the level of specialization. For example, palliative care teams provide secondary supportive care for patients with advanced cancer. Until a consensus definition is available for supportive care, this term should be clearly defined or cited whenever it is used.

  20. Methodology development for estimating support behavior of spacer grid spring in core

    International Nuclear Information System (INIS)

    Yoon, Kyung Ho; Kang, Heung Seok; Kim, Hyung Kyu; Song, Kee Nam

    1998-04-01

    The fuel rod (FR) support behavior changes during operation as a result of effects such as clad creep-down, spring force relaxation due to irradiation, and irradiation growth of spacer straps with time or increasing burnup. The FR support behavior is closely associated with FR damage due to fretting; therefore, analysis of the FR support behavior is normally required to minimize the damage. The characteristics of the parameters which affect the FR support behavior, and the methodology developed for estimating the FR support behavior in the reactor core, are described in this work. The FR support condition of the KOFA (KOrean Fuel Assembly) fuel has been analyzed by this method, and the results of the analysis show that fuel failure due to fuel rod fretting wear is closely related to the support behavior of the FR in the core. Therefore, the present methodology for estimating the FR support condition seems to be useful for estimating the actual FR support condition. In addition, optimization seems to be a reliable tool for establishing the optimal support condition on the basis of these results. (author). 15 refs., 3 tabs., 26 figs

  1. Evaluating electronic performance support systems: A methodology focused on future use-in-practice

    NARCIS (Netherlands)

    Collis, Betty; Verwijs, C.A.

    1995-01-01

    Electronic performance support systems, as an emerging type of software environment, present many new challenges in relation to effective evaluation. In this paper, a global approach to a 'usage-orientated' evaluation methodology for software products is presented, followed by a specific example of

  2. Learning to Support Learning Together: An Experience with the Soft Systems Methodology

    Science.gov (United States)

    Sanchez, Adolfo; Mejia, Andres

    2008-01-01

    An action research approach called soft systems methodology (SSM) was used to foster organisational learning in a school regarding the role of the learning support department within the school and its relation with the normal teaching-learning activities. From an initial situation of lack of coordination as well as mutual misunderstanding and…

  3. Search for supporting methodologies - Or how to support SEI for 35 years

    Science.gov (United States)

    Handley, Thomas H., Jr.; Masline, Richard C.

    1991-01-01

    Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied, including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements of partitioning functionality into separate areas are determined, with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is key to the distributed implementation of the program, and this environment is concluded to be the next step in developing an evolvable, interoperable, and securable support network.

  4. (N+1)-dimensional Lorentzian evolving wormholes supported by polytropic matter

    Energy Technology Data Exchange (ETDEWEB)

    Cataldo, Mauricio [Universidad del Bio-Bio, Departamento de Fisica, Facultad de Ciencias, Concepcion (Chile); Arostica, Fernanda; Bahamonde, Sebastian [Universidad de Concepcion, Departamento de Fisica, Concepcion (Chile)

    2013-08-15

    In this paper we study (N+1)-dimensional evolving wormholes supported by energy satisfying a polytropic equation of state. The considered evolving wormhole models are described by a constant redshift function and generalizes the standard flat Friedmann-Robertson-Walker spacetime. The polytropic equation of state allows us to consider in (3+1)-dimensions generalizations of the phantom energy and the generalized Chaplygin gas sources. (orig.)
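
    For reference, the polytropic equation of state referred to in this record is conventionally written as follows; the paper's exact parametrization may differ.

```latex
% Standard polytropic equation of state: p is the pressure, \rho the
% (energy) density, K the polytropic constant and n the polytropic index.
% In the limit n \to \infty it reduces to the linear barotropic case p = K\rho.
p = K \rho^{\,1 + \frac{1}{n}}
```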

  5. Development of methodologies for coupled water-hammer analysis of piping systems and supports

    International Nuclear Information System (INIS)

    Kamil, H.; Gantayat, A.; Attia, A.; Goulding, H.

    1983-01-01

    The paper presents the results of an investigation on the development of methodologies for coupled water-hammer analyses. The study was conducted because the present analytical methods for calculation of loads on piping systems and supports resulting from water-hammer phenomena are overly conservative. This is mainly because the methods do not usually include interaction between the fluid and the piping and thus predict high loads on piping systems and supports. The objective of the investigation presented in this paper was to develop methodologies for coupled water-hammer analyses, including fluid-structure interaction effects, to be able to obtain realistic loads on piping systems and supports, resulting in production of more economical designs. (orig./RW)
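
    For a sense of the magnitude of the uncoupled loads discussed in this record, the classical Joukowsky relation estimates the pressure surge from a rapid velocity change; the sketch below uses hypothetical pipe parameters. The paper's point is that coupled fluid-structure analysis yields more realistic, less conservative loads than such uncoupled estimates.

```python
# Classical Joukowsky estimate of the pressure surge from a rapid valve
# closure, the kind of uncoupled upper-bound figure the paper argues can be
# reduced by coupled fluid-structure analysis. Numbers are hypothetical.

rho = 1000.0         # water density, kg/m^3
wave_speed = 1200.0  # pressure-wave speed in the pipe, m/s
delta_v = 3.0        # flow velocity change, m/s

delta_p = rho * wave_speed * delta_v            # Pa
pipe_area = 3.14159 * (0.15 ** 2)               # m^2, for a 0.3 m bore pipe
force_on_support = delta_p * pipe_area          # N, order-of-magnitude load

print(f"Joukowsky surge: {delta_p / 1e5:.1f} bar")
print(f"unbalanced force scale on an elbow/support: {force_on_support / 1e3:.0f} kN")
```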

  6. Design Methodology of a Sensor Network Architecture Supporting Urgent Information and Its Evaluation

    Science.gov (United States)

    Kawai, Tetsuya; Wakamiya, Naoki; Murata, Masayuki

    Wireless sensor networks are expected to become an important social infrastructure which helps make our lives safe, secure, and comfortable. In this paper, we propose a design methodology for an architecture for fast and reliable transmission of urgent information in wireless sensor networks. In this methodology, instead of establishing a single complicated monolithic mechanism, several simple and fully distributed control mechanisms, which function at different spatial and temporal levels, are incorporated in each node. These mechanisms work autonomously and independently, responding to the surrounding situation. We also show an example of a network architecture designed following the methodology. We evaluated the performance of the architecture by extensive simulation and practical experiments, and our claim was supported by the results of these experiments.

  7. Improving life cycle assessment methodology for the application of decision support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg

    for the application of decision support and evaluation of uncertainty in LCA. From a decision maker’s (DM’s) point of view there are at least three main “illness” factors influencing the quality of the information that the DM uses for making decisions. The factors are not independent of each other, but it seems … refrain from making a decision based on an LCA and thus support a decision on other parameters than the LCA environmental parameters. Conversely, it may in some decision support contexts be acceptable to base a decision on highly uncertain information. This all depends on the specific decision support … the different steps. A deterioration of the quality in each step is likely to accumulate through the statistical value chain in terms of increased uncertainty and bias. Ultimately this can make final decision support problematic. The "Law of large numbers" (LLN) is the methodological tool/probability theory…

  8. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Directory of Open Access Journals (Sweden)

    Alexandr Ivanovich Tatarkin

    2015-03-01

    Full Text Available In the article, the problem of methodological support for the regional tax benefit process is reviewed. The method of tax benefit assessment adopted in Perm Region was chosen as the object of analysis because the relatively long period over which the benefits have been applied has made it possible to build a sufficient statistical base. In the article, the reliability of the budgetary, economic, investment, and social effectiveness assessments of applying the benefits, based on the Method, is investigated. Suggestions for its improvement are formulated.

  9. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More than Qualitative Methods

    Science.gov (United States)

    Bowleg, Lisa

    2017-01-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with…

  10. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    Science.gov (United States)

    Zhu; Dale

    2000-10-01

    Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  11. Methodology for the economic optimisation of energy storage systems for frequency support in wind power plants

    International Nuclear Information System (INIS)

    Johnston, Lewis; Díaz-González, Francisco; Gomis-Bellmunt, Oriol; Corchero-García, Cristina; Cruz-Zambrano, Miguel

    2015-01-01

    Highlights: • Optimisation of energy storage system with wind power plant for frequency response. • Energy storage option considered could be economically viable. • For a 50 MW wind farm, an energy storage system of 5.3 MW and 3 MW h was found. - Abstract: This paper proposes a methodology for the economic optimisation of the sizing of Energy Storage Systems (ESSs) whilst enhancing the participation of Wind Power Plants (WPP) in network primary frequency control support. The methodology was designed flexibly, so it can be applied to different energy markets and to include different ESS technologies. The methodology includes the formulation and solving of a Linear Programming (LP) problem. The methodology was applied to the particular case of a 50 MW WPP, equipped with a Vanadium Redox Flow battery (VRB), in the UK energy market. Analysis is performed considering real data on the UK regular energy market and the UK frequency response market. Data for wind power generation and energy storage costs are estimated from the literature. Results suggest that, under certain assumptions, ESSs can be profitable for the operator of a WPP that is providing frequency response. The ESS provides power reserves such that the WPP can generate close to the maximum energy available. The solution of the optimisation problem establishes that an ESS with a power rating of 5.3 MW and an energy capacity of about 3 MW h would be enough to provide such a service whilst maximising the income for the WPP operator, considering the regular and frequency regulation UK markets.
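
    The sizing problem described in this record lends itself to a compact linear-programming illustration. The sketch below is only a toy formulation with assumed annualized costs, an assumed availability payment and an assumed duration constraint; it does not reproduce the paper's UK market formulation or its 5.3 MW / 3 MW h result, where the objective and constraints are built from market price and wind time series rather than single annualized figures.

```python
# Toy ESS sizing as a linear programme (illustrative assumptions only).
from scipy.optimize import linprog

# Decision variables: x = [P_mw, E_mwh]  (power rating, energy capacity)
cost_power_per_mw_year = 25_000.0    # assumed annualized cost per MW of rating
cost_energy_per_mwh_year = 60_000.0  # assumed annualized cost per MWh of capacity
revenue_per_mw_year = 80_000.0       # assumed frequency-response payment per MW offered

# Objective: minimise (costs - revenue); linprog minimises c @ x
c = [cost_power_per_mw_year - revenue_per_mw_year, cost_energy_per_mwh_year]

# Constraints (illustrative):
#  - sustain full power for at least 0.5 h:  E >= 0.5 * P  ->  0.5*P - E <= 0
#  - power rating capped at 10% of a 50 MW wind farm:       P <= 5
A_ub = [[0.5, -1.0], [1.0, 0.0]]
b_ub = [0.0, 5.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
P_opt, E_opt = res.x
print(f"ESS sizing sketch: {P_opt:.1f} MW power rating, {E_opt:.1f} MWh capacity")
```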

  12. Methodological Aspects of In Vitro Assessment of Bio-accessible Risk Element Pool in Urban Particulate Matter

    Czech Academy of Sciences Publication Activity Database

    Sysalová, J.; Száková, J.; Tremlová, J.; Kašparovská, Kateřina; Kotlík, B.; Tlustoš, P.; Svoboda, Petr

    2014-01-01

    Roč. 161, č. 2 (2014), s. 216-222 ISSN 0163-4984 Grant - others:GA ČR(CZ) GA521/09/1150; GA ČR(CZ) GAP503/12/0682 Program:GA; GA Institutional support: RVO:67985823 Keywords : risk elements * urban particulate matter * in vitro tests * bio-accessibility Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 1.748, year: 2014

  13. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More Than Qualitative Methods.

    Science.gov (United States)

    Bowleg, Lisa

    2017-10-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.

  14. CONTENTS OF THE METHODOLOGICAL AND TECHNOLOGICAL SUPPORT OF THE EDUCATION QUALITY MANAGEMENT INFORMATION SYSTEM FOR FUTURE ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Kostiantyn S. Khoruzhyi

    2014-12-01

    Full Text Available In the article, the content and nature of the organizational activities involved in the methodological and technological support of the education quality management information system (EQMIS) for future economists are described. The content of the organizational activities for the implementation of methodological and technological support of EQMIS for future economists includes four stages (preparatory, instructional/adaptational, methodological/basic, and experimental/evaluational) and contains a set of methodological and technological measures for each of the stages of the EQMIS implementation. A study of the pedagogical impact of the proposed methodology of using EQMIS in the formation of professional competence of economics students was also conducted. The main stages, methods and sequence of implementation arrangements for the methodological and technological support of EQMIS are defined.

  15. 3 + 1-dimensional thin shell wormhole with deformed throat can be supported by normal matter

    Energy Technology Data Exchange (ETDEWEB)

    Mazharimousavi, S.H.; Halilsoy, M. [Eastern Mediterranean University, Department of Physics, Gazimagusa (Turkey)

    2015-06-15

    From the physics standpoint the exotic matter problem is a major difficulty in thin shell wormholes (TSWs) with spherical/cylindrical throat topologies. We aim to circumvent this handicap by considering angle dependent throats in 3 + 1 dimensions. By considering the throat of the TSW to be deformed spherical, i.e., a function of θ and φ, we present general conditions which are to be satisfied by the shape of the throat in order to have the wormhole supported by matter with positive density in the static reference frame. We provide particular solutions/examples to the constraint conditions. (orig.)

  16. Variability of insulin degludec and glargine U300: A matter of methodology or just marketing?

    Science.gov (United States)

    Heise, Tim; Heckermann, Sascha; DeVries, J Hans

    2018-05-17

    The variability in the time-action profiles of insulin preparations, in particular basal insulins, has been a matter of debate ever since the publication of a glucose clamp study comparing the day-to-day variability of three different basal insulins (glargine U100, detemir and NPH) in 2004 [1]. While critics did not contest the findings of a lower variability of some basal insulins in this and a later [2] glucose clamp study, they did question the relevance of a lower pharmacokinetic (PK) and pharmacodynamic (PD) variability for clinical endpoints [3, 4]. Nevertheless, this has not stopped marketers from widely using the results of glucose clamp studies to promote insulins for higher predictability or a suggested flat PK/PD profile fully covering 24 hours [5]. This article is protected by copyright. All rights reserved.

  17. Size Matters: The Link between Staff Size and Perceived Organizational Support in Early Childhood Education

    Science.gov (United States)

    Ho, Dora; Lee, Moosung; Teng, Yue

    2016-01-01

    Purpose: The purpose of this paper is to examine the relationship between staff size and perceived organizational support (POS) in early childhood education (ECE) organizations. Design/methodology/approach: A territory-wide questionnaire survey was designed to investigate the perceptions of preschool teachers in Hong Kong on four dimensions of…

  18. Methodology supporting production control in a foundry applying modern DISAMATIC molding line

    Directory of Open Access Journals (Sweden)

    Sika Robert

    2017-01-01

    Full Text Available The paper presents a methodology for production control using statistical methods in foundry conditions with an automatic DISAMATIC molding line. The authors were inspired by many years of experience in implementing IT tools for foundries. They noticed that there is a lack of basic IT tools dedicated to specific casting processes that would greatly facilitate their oversight and thus improve the quality of manufactured products. More and more systems are installed in the ERP or CAx area, but they integrate processes only partially, mainly in the areas of technology design and business management, from finance and control. Monitoring of foundry processes can generate a large amount of process-related data. This is particularly noticeable in automated processes. An example is the modern DISAMATIC molding line, which integrates several casting processes, such as mold preparation, assembly, pouring and shake-out. The authors propose a methodology that supports the control of the above-mentioned foundry processes using statistical methods. Such an approach can be successfully used, for example, during periodic external audits. The methodology was implemented in the innovative DISAM-ProdC computer tool.
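
    As a minimal illustration of the kind of statistical process control this record refers to, the sketch below builds an X-bar chart for a hypothetical DISAMATIC process variable and flags out-of-control subgroups. The monitored quantity, the data and the subgroup size are assumptions for illustration, not DISAM-ProdC functionality.

```python
# X-bar control chart sketch over synthetic subgroup data.
import numpy as np

rng = np.random.default_rng(4)
subgroups = rng.normal(40.0, 1.2, size=(25, 5))   # 25 subgroups of 5 readings (assumed)
subgroups[18] += 3.5                              # injected process shift to detect

xbar = subgroups.mean(axis=1)
rbar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()
A2 = 0.577                                        # standard control-chart constant for n = 5
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar

for i, value in enumerate(xbar):
    if value > ucl or value < lcl:
        print(f"subgroup {i}: mean {value:.2f} outside [{lcl:.2f}, {ucl:.2f}] -> investigate")
```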

  19. Catalytic combustion of particulate matter. Catalysts of alkaline nitrates supported on hydrous zirconium

    International Nuclear Information System (INIS)

    Galdeano, N.F.; Carrascull, A.L.; Ponzi, M.I.; Lick, I.D.; Ponzi, E.N.

    2004-01-01

    In order to explore a method to remove particulate matter, catalysts of different alkaline nitrates (Li, K and Cs) supported on hydrous zirconium were prepared by the incipient wetness method and tested as catalysts for particulate matter combustion. The catalytic activity was determined using the temperature-programmed oxidation (TPO) technique in two set-ups: a thermogravimetric reactor and a fixed-bed reactor. In the first case the particulate matter/catalyst mixture was milled carefully in a mortar (tight contact), while in the second case more realistic operating conditions were used and the particulate matter/catalyst mixture was prepared with a spatula (loose contact). All prepared catalysts showed good activity for particulate matter combustion. The cesium catalyst presented the highest activity, decreasing the combustion temperature by 200 to 250 deg. C with respect to combustion without a catalyst. The catalyst with lithium nitrate became active at a temperature higher than its melting point, and the same occurred with the potassium catalyst. This was not the case for the catalyst containing cesium nitrate, which melts at 407 deg. C and became active from 350 deg. C.

  20. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    Science.gov (United States)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    Critical water resources face the effects of historic drought, increased demand, and potential contamination, and the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range between simple data display and complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study highlighting multi-criteria spatial decision making and uncertainty comparisons.
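
    One simple way to picture the corpus-linking step described above is a TF-IDF similarity score between policy text snippets and textual descriptions of model solution sets. The snippets below are invented placeholders, and the scikit-learn pipeline is only an illustrative stand-in for Watermark's actual topical analysis.

```python
# Score topical similarity between policy snippets and model-scenario descriptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

policy_texts = [
    "groundwater conservation district rules on pumping limits and drought stages",
    "surface water rights permitting and environmental flow standards",
]
model_outputs = [
    "scenario with reduced aquifer drawdown under capped well pumping",
    "reservoir operations scenario maintaining instream flow targets",
]

vec = TfidfVectorizer(stop_words="english")
matrix = vec.fit_transform(policy_texts + model_outputs)
sims = cosine_similarity(matrix[: len(policy_texts)], matrix[len(policy_texts):])
for i, row in enumerate(sims):
    for j, score in enumerate(row):
        print(f"policy {i} <-> model scenario {j}: similarity {score:.2f}")
```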

  1. UPC Scaling-up methodology for Deterministic Safety Assessment and Support to Plant Operation

    Energy Technology Data Exchange (ETDEWEB)

    Martínez-Quiroga, V.; Reventós, F.; Batet, Il.

    2015-07-01

    Best Estimate codes, along with the necessary nodalizations, are widely used tools in nuclear engineering for both Deterministic Safety Assessment (DSA) and Support to Plant Operation and Control. In this framework, the application of quality assurance procedures to both codes and nodalizations becomes an essential step prior to any significant study. Along these lines the present paper introduces the UPC SCUP, a systematic methodology based on the extrapolation of Integral Test Facility (ITF) post-test simulations by means of scaling analyses. In that sense, SCUP fills a gap in current nodalization qualification procedures, namely the validation of NPP nodalizations for Design Basis Accident conditions. Three are the pillars that support SCUP: judicious selection of the experimental transients, full confidence in the quality of the ITF simulations, and simplicity in justifying discrepancies that appear between ITF and NPP counterpart transients. The techniques that are presented include the so-called Kv scaled calculations as well as the use of two new approaches, "Hybrid nodalizations" and "Scaled-up nodalizations". These last two methods have revealed themselves to be very helpful in producing the required qualification and in promoting further improvements in nodalization. The study of both LSTF and PKL counterpart tests has allowed the methodology to be qualified through comparison with experimental data. Post-test simulations at different sizes allowed defining which phenomena could be well reproduced by system codes and which could not, in this way also establishing the basis for the extrapolation to an NPP scaled calculation. Furthermore, the application of the UPC SCUP methodology demonstrated that selected phenomena can be scaled-up and explained between counterpart simulations by carefully considering the differences in scale and design. (Author)

  2. A Multi-Criteria Methodology to Support Public Administration Decision Making Concerning Sustainable Energy Action Plans

    Directory of Open Access Journals (Sweden)

    Chiara Novello

    2013-08-01

    Full Text Available For municipalities that have joined the Covenant of Mayors promoted by the European Commission, the Sustainable Energy Action Plan (SEAP) represents a strategic tool for achieving the greenhouse gas reductions required by 2020. As far as the energy retrofit actions in their residential building stock are concerned, which in small-to-medium municipalities are responsible for more than 60% of CO2 emissions, the scenarios for intervening are normally decided on the basis of an economic (cost/performance) analysis. This type of analysis, however, does not take into account aspects that are important for small and medium-sized communities, such as social aspects, environmental impacts, local economic development and employment. A more comprehensive and effective tool to support the choices of public administrators is multi-criteria analysis. This study proposes a methodology that integrates multi-criteria analysis in order to support Public Administration/Local Authorities in programming Sustainable Energy Action Plans with a more targeted approach to sustainability. The methodology, based on the ELECTRE III method, was applied to a medium-size municipality in the Lombardy region of Italy. The results obtained with this approach are discussed in this paper.

  3. Methodological Reflections on the Contribution of Qualitative Research to the Evaluation of Clinical Ethics Support Services.

    Science.gov (United States)

    Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan

    2017-05-01

    This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention with the goal of building up a context-sensitive structure of minimal clinical ethics support in an oncology department without a prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services (CESS) for different reasons over several decades. However, while a lot has been said about the concepts and methodological challenges of evaluating CESS up to the present time, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. © 2017 John Wiley & Sons Ltd.

  4. Update in the methodology of the chronic stress paradigm: internal control matters

    Directory of Open Access Journals (Sweden)

    Boyks Marco

    2011-04-01

    Full Text Available Abstract To date, the reliability of induction of a depressive-like state using chronic stress models is confronted by many methodological limitations. We believe that the modifications to the stress paradigm in mice proposed herein allow some of these limitations to be overcome. Here, we discuss a variant of the standard stress paradigm, which results in anhedonia. This anhedonic state was defined by a decrease in sucrose preference that was not exhibited by all animals. As such, we propose the use of non-anhedonic, stressed mice as an internal control in experimental mouse models of depression. The application of an internal control for the effects of stress, along with optimized behavioural testing, can enable the analysis of biological correlates of stress-induced anhedonia versus the consequences of stress alone in a chronic-stress depression model. This is illustrated, for instance, by distinct physiological and molecular profiles in anhedonic and non-anhedonic groups subjected to stress. These results argue for the use of a subgroup of individuals who are negative for the induction of a depressive phenotype during experimental paradigms of depression as an internal control, for more refined modeling of this disorder in animals.

  5. Update in the methodology of the chronic stress paradigm: internal control matters.

    Science.gov (United States)

    Strekalova, Tatyana; Couch, Yvonne; Kholod, Natalia; Boyks, Marco; Malin, Dmitry; Leprince, Pierre; Steinbusch, Harry Mw

    2011-04-27

    To date, the reliability of induction of a depressive-like state using chronic stress models is confronted by many methodological limitations. We believe that the modifications to the stress paradigm in mice proposed herein allow some of these limitations to be overcome. Here, we discuss a variant of the standard stress paradigm, which results in anhedonia. This anhedonic state was defined by a decrease in sucrose preference that was not exhibited by all animals. As such, we propose the use of non-anhedonic, stressed mice as an internal control in experimental mouse models of depression. The application of an internal control for the effects of stress, along with optimized behavioural testing, can enable the analysis of biological correlates of stress-induced anhedonia versus the consequences of stress alone in a chronic-stress depression model. This is illustrated, for instance, by distinct physiological and molecular profiles in anhedonic and non-anhedonic groups subjected to stress. These results argue for the use of a subgroup of individuals who are negative for the induction of a depressive phenotype during experimental paradigms of depression as an internal control, for more refined modeling of this disorder in animals.

  6. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
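
    To make the parametric flavour of such an LCC methodology concrete, the sketch below rolls up development, deployment and operations costs from a toy cost estimating relationship (CER). The CER form, coefficients and dollar figures are assumptions for illustration only, not NASA or FAA estimates.

```python
# Toy parametric life-cycle cost roll-up for a decision support tool.
def parametric_cer(size_ksloc, complexity):
    """Development cost in $M from software size (KSLOC) and a complexity multiplier (assumed CER)."""
    return 0.8 * size_ksloc ** 1.1 * complexity

def life_cycle_cost(size_ksloc, complexity, sites, years, om_per_site_year=0.15):
    development = parametric_cer(size_ksloc, complexity)
    deployment = 0.4 * sites                      # assumed $M per site installed
    operations = om_per_site_year * sites * years # assumed annual O&M per site
    return {"development": round(development, 1), "deployment": round(deployment, 1),
            "operations": round(operations, 1),
            "total": round(development + deployment + operations, 1)}

print(life_cycle_cost(size_ksloc=120, complexity=1.3, sites=20, years=10))
```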

  7. Assessment of bioenergy potential in Sicily: A GIS-based support methodology

    International Nuclear Information System (INIS)

    Beccali, Marco; D'Alberti, Vincenzo; Franzitta, Vincenzo; Columba, Pietro

    2009-01-01

    A Geographical Information System (GIS) supported methodology has been developed in order to assess the technical and economic potential of biomass exploitation for energy production in Sicily. The methodology was based on the use of agricultural, economic, climatic, and infrastructural data in a GIS. Data about land use, transportation facilities, urban cartography, regional territorial planning, terrain digital model, lithology, climatic types, and civil and industrial users have been stored in the GIS to define potential areas for gathering the residues coming from the pruning of olive groves, vineyards, and other agricultural crops, and to assess the biomass available for energy cultivation. Further, it was possible to assess the potential of biodiesel production, assuming the cultivation of rapeseed in arable crop areas. For the biomass used for direct combustion purposes, the economic availability has been assessed by assuming a price for the biomass and comparing it with other fuels. This assessment has shown the strong competitiveness of firewood in comparison with traditional fossil fuels when the collection system is implemented in an efficient way. Moreover, the economic potential of biodiesel was assessed considering the ongoing financial regime for fuel. At the same time, the study has shown a significant competitiveness of the finished biomass (pellets), and good potential for a long-term development of this market. An important result was the determination of the biofuel production potential in Sicily. A further outcome of the study was to show the opportunities stemming from the harmonisation of Energy Policy with the Waste Management System and Rural Development Plan. (author)

  8. A conceptual methodology to design a decision support system to leak detection programs in water networks

    International Nuclear Information System (INIS)

    Di Federico, V.; Bottarelli, M.; Di Federico, I.

    2005-01-01

    The paper outlines a conceptual methodology to develop a decision support system to assist technicians managing water networks in selecting the appropriate leak detection method(s). First, the necessary knowledge about the network is recapitulated: the location and characteristics of its physical components, but also water demand, breaks in pipes, and water quality data. Second, the water balance in a typical Italian Agency is discussed, suggesting methods and procedures to evaluate and/or estimate each term in the mass balance equation. Then the available methods for leak detection are described in detail, from those useful in the pre-localization phase to those commonly adopted to pinpoint pipe failures and allow a rapid repair. Criteria to estimate the costs associated with each of these methods are provided. Finally, the proposed structure of the DSS is described.
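
    The water-balance step mentioned in this record can be illustrated with the standard top-down balance used for distribution networks (system input minus authorised consumption, split into apparent and real losses). The annual volumes below are invented, and the breakdown follows the common IWA-style balance rather than the paper's specific procedure.

```python
# Toy top-down water balance for a distribution network (annual volumes in m^3).
system_input = 12_500_000
billed_authorised = 9_100_000
unbilled_authorised = 250_000          # e.g. firefighting, mains flushing (assumed)
apparent_losses = 400_000              # metering errors, unauthorised use (assumed)

water_losses = system_input - billed_authorised - unbilled_authorised
real_losses = water_losses - apparent_losses          # volume attributable to leakage
nrw_share = (system_input - billed_authorised) / system_input
print(f"Real losses: {real_losses:,} m^3/year; non-revenue water: {nrw_share:.1%}")
```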

  9. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    incorporates relevant information about each stage of the production process. The paper will describe the research object, the model object and discuss a part of the methodology in developing a Product State Model. The project is primarily technological, however, organisational and human aspects … This paper addresses the preliminary considerations in a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS) which is a part of the activities in CALS Center Denmark. The CALS concept is presented focusing on the Product State Model (PSM). The PSM … will be developed upon, will be discussed. Also, the parameters for evaluating the PSM will be considered. In establishing the theoretical body of knowledge with respect to CALS, an identification of schools and paradigms within the research area of applying information technology in a manufacturing environment…

  10. Adaptation of the Methodological Support to the Specifics of Management of “Smart” Environment

    Directory of Open Access Journals (Sweden)

    Lazebnyk Iuliia O.

    2017-12-01

    Full Text Available The aim of the article is to justify the analytic base of the methodological support for adopting the new technology solutions necessary to improve management of the “smart” environment system. The article identifies the range of major problems facing modern large cities in the context of growing urbanization and substantiates the need to introduce the concept of a “smart” environment for solving these problems. The main approaches to the definition of the concept of “smart” environment are considered. The main components of the “smart” environment are identified and analyzed. The best world practices of leading cities, such as Dubai and Hong Kong, regarding the introduction of “smart” technologies are considered.

  11. Methodology of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing.

    Science.gov (United States)

    Hafekost, Jennifer; Lawrence, David; Boterhoven de Haan, Katrina; Johnson, Sarah E; Saw, Suzy; Buckingham, William J; Sawyer, Michael G; Ainley, John; Zubrick, Stephen R

    2016-09-01

    To describe the study design of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing. The aims of the study, sample design, development of survey content, field procedures and final questionnaires are detailed. During 2013-2014, a national household survey of the mental health and wellbeing of young people was conducted involving a sample of 6310 families selected at random from across Australia. The survey included a face-to-face diagnostic interview with parents/carers of 4- to 17-year-olds and a self-report questionnaire completed by young people aged 11-17 years. The overall response rate to the survey was 55% with 6310 parents/carers of eligible households participating in the survey. In addition, 2967 or 89% of young people aged 11-17 years in these participating households completed a questionnaire. The survey sample was found to be broadly representative of the Australian population on major demographic characteristics when compared with data from the Census of Population and Housing. However, adjustments were made for an over-representation of younger children aged 4 to 7 years and also families with more than one eligible child in the household. Young Minds Matter provides updated national prevalence estimates of common child and adolescent mental disorders, describes patterns of service use and will help to guide future decisions in the development of policy and provision of mental health services for children and adolescents. Advancements in interviewing methodology, addition of a data linkage component and informed content development contributed to improved breadth and quality of the data collected. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  12. A new methodology to derive settleable particulate matter guidelines to assist policy-makers on reducing public nuisance

    Science.gov (United States)

    Machado, Milena; Santos, Jane Meri; Reisen, Valdério Anselmo; Reis, Neyval Costa; Mavroidis, Ilias; Lima, Ana T.

    2018-06-01

    Air quality standards for settleable particulate matter (SPM) are found in many countries around the world. As is well known, annoyance caused by SPM can be considered a community problem even if only a small proportion of the population is bothered on rather infrequent occasions. Many authors have shown that SPM causes soiling in residential and urban environments and degradation of materials (e.g., objects and painted surfaces) that can impair the use and enjoyment of property and alter the normal activities of society. In this context, the main contribution of this paper is to propose guidance for establishing air quality standards for annoyance caused by SPM in metropolitan industrial areas. To attain this objective, a new methodology is proposed, based on the nonlinear correlation between perceived annoyance (a qualitative variable) and the particle deposition rate (a quantitative variable). Since the response variable is binary (annoyed and not annoyed), a logistic regression model is used to estimate the probability of people being annoyed at different levels of particle deposition rate and to compute the odds ratio function, which gives, for a specific level of particle deposition rate, the estimated expected value of the perceived annoyance in the population. The proposed methodology is verified on a data set measured in the metropolitan area of Great Vitória, Espirito Santo, Brazil. As a general conclusion, the estimated probability function of perceived annoyance as a function of SPM has shown that 17% of inhabitants report annoyance at a very low particle deposition level of 5 g/(m2•30 days). In addition, for an increase of 1 g/(m2•30 days) in SPM, the smallest estimated odds ratio of perceived annoyance is a factor of 1.5, implying that the probability of occurrence is almost 2 times as large as the probability of no occurrence of annoyance.
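
    The statistical core of the record, a logistic regression of a binary annoyance response on the particle deposition rate with an odds ratio read off the slope, can be sketched as follows. The data are synthetic placeholders, so the fitted numbers stand in for, rather than reproduce, the Great Vitória survey results.

```python
# Logistic regression of annoyance on SPM deposition rate (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
spm = rng.uniform(1, 30, size=400)                     # g/(m2*30 days), assumed range
true_logit = -2.0 + 0.3 * spm                          # assumed underlying relationship
annoyed = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = sm.add_constant(spm)                               # intercept + SPM
model = sm.Logit(annoyed, X).fit(disp=False)
beta_spm = model.params[1]

print(f"Odds ratio per 1 g/(m2*30 days) of SPM: {np.exp(beta_spm):.2f}")
print(f"P(annoyed) at 5 g/(m2*30 days): {model.predict([[1.0, 5.0]])[0]:.2f}")
```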

  13. Autonomy support physical education: history, design, methodology and analysis regarding motivation in teenage students

    Directory of Open Access Journals (Sweden)

    Marina Martínez-Molina

    2013-07-01

    Full Text Available In any area of education it is recognized how important it is that students are motivated. But this requires teachers who motivate and actions that produce this state in students. Autonomy support may be the key to improving the motivation of learners, as well as an indicator for seeking other improvements in the teaching-learning process. The aim of this study was to analyze the potential importance of supporting autonomy in students (both in learning and in the acquisition of habits) and to exemplify the design, methodology and analysis that make it possible to achieve these objectives. The study will draw on a sample of 758 high school students (347 men, 45.8%; 411 women, 54.2%) of the Region of Murcia, aged between 12 and 18 (M = 15.22, SD = 1.27). The instrument to be used is a questionnaire consisting of the following scales: Learning Climate Questionnaire (LCQ), Sport Motivation Scale (SMS), Intention to partake in leisure-time physical activity (Intention-PFTL), Sport Satisfaction Instrument to Physical Education (SSI-EF) and the scale of Importance and Usefulness of Physical Education (IEF). The possible results may improve on and add to much of the existing work and provide further guidance for teachers to improve their teaching performance.

  14. Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Water Security

    Science.gov (United States)

    Hakimdavar, Raha; Wood, Danielle; Eylander, John; Peters-Lidard, Christa; Smith, Jane; Doorn, Brad; Green, David; Hummel, Corey; Moore, Thomas C.

    2018-01-01

    River basins for which transboundary coordination and governance is a factor are of concern to US national security, yet there is often a lack of sufficient data-driven information available at the needed time horizons to inform transboundary water decision-making for the intelligence, defense, and foreign policy communities. To address this need, a two-day workshop entitled Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Global Water Security was held in August 2017 in Maryland. The committee that organized and convened the workshop (the Organizing Committee) included representatives from the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers Engineer Research and Development Center (ERDC), and the US Air Force. The primary goal of the workshop was to advance knowledge on the current US Government and partners' technical information needs and gaps to support national security interests in relation to transboundary water. The workshop also aimed to identify avenues for greater communication and collaboration among the scientific, intelligence, defense, and foreign policy communities. The discussion around transboundary water was considered in the context of the greater global water challenges facing US national security.

  15. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  16. Impact of renewables on electricity markets – Do support schemes matter?

    International Nuclear Information System (INIS)

    Winkler, Jenny; Gaio, Alberto; Pfluger, Benjamin; Ragwitz, Mario

    2016-01-01

    Rising renewable shares influence electricity markets in several ways: among others, average market prices are reduced and price volatility increases. Therefore, the “missing money problem” in energy-only electricity markets is more likely to occur in systems with high renewable shares. Nevertheless, renewables are supported in many countries due to their expected benefits. The kind of support instrument can, however, affect the degree to which renewables influence the market. While fixed feed-in tariffs lead to higher market impacts, more market-oriented support schemes such as market premiums, quota systems and capacity-based payments decrease the extent to which markets are affected. This paper analyzes the market impacts of different support schemes. For this purpose, a new module is added to an existing bottom-up simulation model of the electricity market. In addition, different degrees of flexibility in the electricity system are considered. A case study for Germany is used to derive policy recommendations regarding the choice of support scheme. - Highlights: •Renewable support schemes matter regarding the impact on electricity markets. •Market-oriented support schemes reduce the impact on electricity markets. •More flexible electricity systems reduce the need for market participation. •Sliding premiums combine market integration with a productive risk allocation.

  17. Investigation of optimal seismic design methodology for piping systems supported by elasto-plastic dampers. Part 1. Evaluation functions

    International Nuclear Information System (INIS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    2009-01-01

    In this study, an optimal seismic design methodology that can consider the structural integrity of not only the piping systems but also the elasto-plastic supporting devices is developed. This methodology employs a genetic algorithm and can search for the optimal conditions, such as the location, capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Four types of evaluation functions are considered. It is found that the proposed optimal seismic design methodology is very effective and can be applied to the actual seismic design of piping systems supported by elasto-plastic dampers. The effectiveness of the evaluation functions is also clarified. (author)
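
    A bare-bones genetic algorithm of the kind the record describes can be sketched as below. The fitness function is a toy surrogate standing in for a piping response analysis, and the number of candidate support locations, the capacity bounds and the GA settings are all assumptions.

```python
# Minimal genetic-algorithm search over damper capacities at candidate locations.
import numpy as np

rng = np.random.default_rng(3)
N_LOC = 6                 # candidate support locations (assumed)
POP, GEN = 40, 60

def fitness(caps):
    # toy evaluation: more damping lowers "response", oversized dampers are penalised
    response = 1.0 / (1.0 + caps.sum())           # stand-in for piping response
    device_demand = np.square(caps).sum() * 0.01  # stand-in for damper demand / cost
    return response + device_demand               # lower is better

pop = rng.uniform(0.0, 10.0, size=(POP, N_LOC))   # damper capacities in kN (assumed)
for _ in range(GEN):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[: POP // 2]]   # selection: keep the best half
    # crossover: average random parent pairs, then apply Gaussian mutation
    parents = elite[rng.integers(0, len(elite), size=(POP - len(elite), 2))]
    children = parents.mean(axis=1) + rng.normal(0, 0.5, size=(POP - len(elite), N_LOC))
    pop = np.vstack([elite, np.clip(children, 0.0, 10.0)])

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("Best damper capacities per location (kN):", np.round(best, 2))
```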

  18. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support.

    Science.gov (United States)

    Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret

    2015-06-11

    Little is known about how well or under what conditions health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time. The later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand the concept mapping quantitative results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with a shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue later-stage translation research questions required for population impact. Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence…

  19. FlooDSuM - a decision support methodology for assisting local authorities in flood situations

    Science.gov (United States)

    Schwanbeck, Jan; Weingartner, Rolf

    2014-05-01

    Decision making in flood situations is a difficult task, especially in small to medium-sized mountain catchments (30 - 500 km2), which are usually characterized by complex topography, high drainage density and quick runoff response to rainfall events. Operating hydrological models driven by numerical weather prediction systems, which have a lead-time of several hours up to even a few days, would be beneficial in this case, as time for prevention could be gained. However, the spatial and quantitative accuracy of such meteorological forecasts usually decreases with increasing lead-time. In addition, the sensitivity of rainfall-runoff models to inaccuracies in estimations of areal rainfall increases with decreasing catchment size. Accordingly, decisions on flood alerts should ideally be based on areal rainfall from high-resolution and short-term numerical weather prediction, nowcasts or even real-time measurements, which is transformed into runoff by a hydrological model. In order to benefit from the best possible rainfall data while retaining enough time for alerting and for prevention, the hydrological model should be fast and easily applicable by decision makers within local authorities themselves. The proposed decision support methodology FlooDSuM (Flood Decision Support Methodology) aims to meet those requirements. Applying FlooDSuM, a few successive binary decisions of increasing complexity have to be processed following a flow-chart-like structure. Prepared data and straightforwardly applicable tools are provided for each of these decisions. Maps showing the current flood disposition are used for the first step. As long as the danger of flooding cannot be excluded, more and more complex and time-consuming methods are applied. For the final decision, a set of scatter plots relating areal precipitation to peak flow is provided. These plots also take further decisive parameters into account, such as storm duration and the distribution of rainfall intensity in time, as well as the…

  20. A methodology for system-of-systems design in support of the engineering team

    Science.gov (United States)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts reasoning about the major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases, where most of the costs are locked in. With the objective of supporting the engineering team and the decision makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that, using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research…
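
    The fractional factorial flavour of the mixed-hypercube approach can be illustrated with a two-level 2^(4-1) design and a main-effects screen. The response function below is a toy surrogate (not the authors' system-of-systems model), and with the generator D = ABC the main effect of D is aliased with the ABC interaction, which the toy model does not contain.

```python
# Two-level fractional factorial (2^(4-1), generator D = A*B*C) with a main-effects screen.
import itertools
import numpy as np

levels = np.array(list(itertools.product([-1, 1], repeat=3)))  # factors A, B, C
design = np.column_stack([levels, levels.prod(axis=1)])        # column D = A*B*C

def response(run):
    a, b, c, d = run
    # toy surrogate, e.g. logistics mass driver for a lunar outpost scenario (assumed)
    return 100 + 8*a - 3*b + 0.5*c + 5*d

y = np.array([response(run) for run in design])
main_effects = {name: float(y[design[:, i] == 1].mean() - y[design[:, i] == -1].mean())
                for i, name in enumerate("ABCD")}
for name, eff in sorted(main_effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"factor {name}: main effect = {eff:+.2f}")
```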

  1. Estimation of the laser cutting operating cost by support vector regression methodology

    Science.gov (United States)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position, as well as the workpiece material. In this article, the process factors investigated were laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. The intelligent soft computing scheme support vector regression (SVR) was implemented. The performance of the proposed estimator was confirmed with the simulation results. The SVR results are then compared with an artificial neural network and genetic programming. According to the results, a greater improvement in estimation accuracy can be achieved through SVR compared to other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global optimization and multiobjective optimization rather than choosing a starting point by trial and error and combining multiple criteria into a single criterion.
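
    A minimal version of the SVR cost estimator described in the record can be put together with scikit-learn as below. The training data are synthetic, the parameter ranges are assumed, and the hyperparameters are not tuned as in the paper.

```python
# SVR mapping laser cutting parameters to an operating-cost stand-in (synthetic data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 200
power = rng.uniform(1.0, 4.0, n)        # kW, assumed range
speed = rng.uniform(0.5, 3.0, n)        # m/min, assumed range
pressure = rng.uniform(6.0, 14.0, n)    # bar, assumed range
focus = rng.uniform(-2.0, 2.0, n)       # mm, assumed range
X = np.column_stack([power, speed, pressure, focus])

# Synthetic, nonlinear "operating cost" stand-in (cost units per hour), not real data
cost = 20 + 8 * power - 4 * speed + 0.3 * pressure**1.5 + 2 * focus**2
cost += rng.normal(0, 0.5, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, cost)
print("Predicted cost at P=3 kW, v=2 m/min, p=10 bar, f=0 mm:",
      round(float(model.predict([[3.0, 2.0, 10.0, 0.0]])[0]), 2))
```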

  2. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored.

  3. [Extraction Optimization of Rhizome of Curcuma longa by Response Surface Methodology and Support Vector Regression].

    Science.gov (United States)

    Zhou, Pei-pei; Shan, Jin-feng; Jiang, Jian-lan

    2015-12-01

    To optimize the microwave-assisted extraction method for curcuminoids from Curcuma longa. On the basis of a single-factor experiment, the ethanol concentration, the ratio of liquid to solid and the microwave time were selected for further optimization. Support Vector Regression (SVR) and Central Composite Design-Response Surface Methodology (CCD) algorithms were utilized to design and establish models respectively, while Particle Swarm Optimization (PSO) was introduced to optimize the parameters of the SVR models and to search for the optimal points of the models. The sum of curcumin, demethoxycurcumin and bisdemethoxycurcumin determined by HPLC was used as the evaluation indicator. The optimal parameters of the microwave-assisted extraction were as follows: ethanol concentration of 69%, ratio of liquid to solid of 21 : 1, and microwave time of 55 s. Under those conditions, the sum of the three curcuminoids was 28.97 mg/g (per gram of rhizome powder). Both the CCD model and the SVR model were credible, as they predicted similar process conditions and the deviation in yield was less than 1.2%.
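
    The response-surface half of this record can be sketched by fitting the usual second-order polynomial to yield data and searching it for a maximum. The data-generating function below is an invented placeholder constructed to peak near the reported optimum, and no PSO or SVR step is included.

```python
# Quadratic response-surface fit and grid search over three extraction factors.
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
etoh = rng.uniform(50, 90, 60)      # % ethanol
ratio = rng.uniform(10, 30, 60)     # liquid-to-solid ratio
time_s = rng.uniform(30, 90, 60)    # microwave time, s
# synthetic yield (mg/g) peaking near (69, 21, 55) by construction
y = 29 - 0.01*(etoh-69)**2 - 0.03*(ratio-21)**2 - 0.002*(time_s-55)**2
y += rng.normal(0, 0.2, 60)

def design_matrix(a, b, c):
    # full second-order model: intercept, linear, interaction and square terms
    return np.column_stack([np.ones_like(a), a, b, c, a*b, a*c, b*c, a*a, b*b, c*c])

coef, *_ = np.linalg.lstsq(design_matrix(etoh, ratio, time_s), y, rcond=None)

grid = np.array(list(product(np.linspace(50, 90, 41),
                             np.linspace(10, 30, 21),
                             np.linspace(30, 90, 31))))
pred = design_matrix(grid[:, 0], grid[:, 1], grid[:, 2]) @ coef
best = grid[np.argmax(pred)]
print("Predicted optimum (ethanol %, ratio, time s):", best, "yield:", round(pred.max(), 2))
```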

  4. A Methodological Approach to Support Collaborative Media Creation in an E-Learning Higher Education Context

    Science.gov (United States)

    Ornellas, Adriana; Muñoz Carril, Pablo César

    2014-01-01

    This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…

  5. [Exploration of a quantitative methodology to characterize the retention of PM2.5 and other atmospheric particulate matter by plant leaves: taking Populus tomentosa as an example].

    Science.gov (United States)

    Zhang, Zhi-Dan; Xi, Ben-Ye; Cao, Zhi-Guo; Jia, Li-Ming

    2014-08-01

    Taking Populus tomentosa as an example, a methodology called elution-weighing-particle size analysis (EWPA) was proposed to quantitatively evaluate the ability of plant leaves to retain fine particulate matter (PM2.5, diameter d ≤ 2.5 μm) and other atmospheric particulate matter, using a laser particle size analyzer and a balance. This method achieves a direct, accurate measurement, with superior operability, of the mass and particle size distribution of atmospheric particulate matter retained by plant leaves. First, a pre-experiment was carried out to test the stability of the method. After cleaning, centrifugation and drying, the particulate matter was collected and weighed, and then its particle size distribution was analyzed by a laser particle size analyzer. Finally, the mass of particulate matter retained per unit leaf area and by the stand was derived from the leaf area and the leaf area index. This method was applied to a P. tomentosa stand in Beijing Olympic Forest Park which had not experienced rain for 27 days. The results showed that the average particle size of the atmospheric particulate matter retained by P. tomentosa was 17.8 μm, and the volume percentages of the retained PM2.5, inhalable particulate matter (PM10, d ≤ 10 μm) and total suspended particles (TSP, d ≤ 100 μm) were 13.7%, 47.2%, and 99.9%, respectively. The masses of PM2.5, PM10, TSP and total particulate matter were 8.88 × 10^-6, 30.6 × 10^-6, 64.7 × 10^-6 and 64.8 × 10^-6 g·cm^-2, respectively. The retention of PM2.5, PM10, TSP and total particulate matter by the P. tomentosa stand was 0.963, 3.32, 7.01 and 7.02 kg·hm^-2, respectively.
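
    The scale-up arithmetic behind EWPA is short enough to show directly. The sketch below reuses the reported per-area mass and PM2.5 fraction, but the leaf area index is an assumed value chosen only to make the worked example consistent with the reported stand totals, and it assumes volume fractions approximate mass fractions.

```python
# Worked EWPA scale-up: per-leaf-area mass -> per-hectare stand retention.
total_mass_g = 64.8e-6          # g of particulate matter per cm^2 of leaf (reported)
pm25_fraction = 0.137           # fraction with d <= 2.5 um (reported; taken as mass fraction)
lai = 1.083                     # assumed leaf area index, back-calculated for consistency

leaf_area_cm2_per_hm2 = lai * 1e8          # 1 hm^2 (hectare) = 1e8 cm^2 of ground
stand_total_kg = total_mass_g * leaf_area_cm2_per_hm2 / 1000.0
stand_pm25_kg = stand_total_kg * pm25_fraction
print(f"Stand retention: total {stand_total_kg:.2f} kg/hm^2, PM2.5 {stand_pm25_kg:.3f} kg/hm^2")
```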

  6. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, where value tradeoffs are postponed until the very last stage of the decision process. Use of efficient frontiers is made to exclude all technically inferior solutions and present the decision maker with all nondominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed in which the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application to the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of European Communities through CEPN.
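
    The efficient-frontier idea in this record reduces to filtering out dominated alternatives before any value trade-off is made. The sketch below does exactly that for a handful of hypothetical (cost, benefit) options; the option names echo the countermeasures listed above, but the numbers are illustrative, not the study's results.

```python
# Keep only non-dominated (cost to minimise, benefit to maximise) alternatives.
def non_dominated(alternatives):
    """Return options for which no other option is at least as cheap and at least as beneficial."""
    frontier = []
    for name, cost, benefit in alternatives:
        dominated = any(c <= cost and b >= benefit and (c, b) != (cost, benefit)
                        for _, c, b in alternatives)
        if not dominated:
            frontier.append((name, cost, benefit))
    return frontier

options = [
    ("relocation",            950.0, 120.0),  # (name, assumed cost, assumed averted dose)
    ("improved living cond.", 300.0,  70.0),
    ("no countermeasures",      0.0,   0.0),
    ("dominated mix",         400.0,  60.0),  # costlier and less beneficial than one option above
]
for name, cost, benefit in non_dominated(options):
    print(f"{name:22s} cost={cost:7.1f}  averted dose={benefit:6.1f}")
```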

  7. Status of Activities to Implement a Sustainable System of MC and A Equipment and Methodological Support at Rosatom Facilities

    International Nuclear Information System (INIS)

    Sanders, J.D.

    2010-01-01

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC and A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC and A measurement system. These efforts have resulted in the development of a MC and A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC and A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  8. Status of Activities to Implement a Sustainable System of MC&A Equipment and Methodological Support at Rosatom Facilities

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Sanders

    2010-07-01

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of a MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  9. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    Science.gov (United States)

    Joglekar, Prafulla N.

    1994-01-01

    Given a limited budget, prioritization among Engineering Support Requests (ESRs) of varied sizes, shapes, and colors is a difficult problem. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction, as it attempts to combine the traditional criterion of technical merit with the newer concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for fiscal year 1995 (FY95). This research identifies a number of design issues that should help in developing better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to support the assessment of all of them. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It is also clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with an adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  10. NASA’s Universe of Learning: Engaging Subject Matter Experts to Support Museum Alliance Science Briefings

    Science.gov (United States)

    Marcucci, Emma; Slivinski, Carolyn; Lawton, Brandon L.; Smith, Denise A.; Squires, Gordon K.; Biferno, Anya A.; Lestition, Kathleen; Cominsky, Lynn R.; Lee, Janice C.; Rivera, Thalia; Walker, Allyson; Spisak, Marilyn

    2018-06-01

    NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University, and is part of the NASA SMD Science Activation Collective. NASA's Universe of Learning projects draw on the expertise of subject matter experts (scientists and engineers) from across the broad range of NASA Astrophysics themes and missions. One such project, which draws strongly on the expertise of the community, is the NASA's Universe of Learning Science Briefings, run in collaboration with the NASA Museum Alliance. This collaboration presents a monthly hour-long discussion of relevant NASA astrophysics topics or events to an audience composed largely of educators from informal learning environments. These professional learning opportunities use experts and resources within the astronomical community to support increased interest and engagement of the informal learning community in NASA Astrophysics-related concepts and events. Briefings are designed to create a foundation for this audience using (1) broad science themes, (2) special events, or (3) breaking science news. The NASA's Universe of Learning team engages subject matter experts to be speakers and present their science at these briefings, providing a direct connection to NASA Astrophysics science and giving the audience an opportunity to interact directly with scientists and engineers involved in NASA missions. To maximize the usefulness of the Museum Alliance Science Briefings, each briefing highlights resources related to the science theme to support informal educators in incorporating science content into their venues and/or interactions with the public. During this

  11. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specification-related setpoints, and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Typical setpoint methodologies are described for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties, based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.
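
    The treatment of uncertainties mentioned above can be illustrated with a generic sketch: independent uncertainty terms are combined by square-root-sum-of-squares (SRSS) and subtracted from the analytical limit to obtain a trip setpoint. This is a common textbook form, not the specific methodology of any of the three vendors; all numerical values are hypothetical.

```python
import math

def trip_setpoint(analytical_limit: float, uncertainties: list[float]) -> float:
    """Setpoint = analytical limit minus the SRSS combination of independent uncertainties."""
    total_uncertainty = math.sqrt(sum(u * u for u in uncertainties))
    return analytical_limit - total_uncertainty

if __name__ == "__main__":
    # hypothetical terms (% rated power): measurement, calibration, process effects
    print(f"{trip_setpoint(118.0, [2.0, 1.5, 3.0]):.1f} % power")   # ~114.1
```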

  12. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) waste form/waste package degradation; (2) waste package isotopic inventory; (3) criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) probability of criticality (for each potential critical configuration as well as the total event); and (5) criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information on the model report content and validation. The model reports and their revisions are being generated as a result of: (1) commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) open items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  13. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but such data are not provided by this document.
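
    For illustration only, a sketch of the widely used four-factor form that generic data of this kind typically feed into: F = Σ N · P · f(x, y) · A over aircraft categories, where N is the number of operations per year, P the crash probability per operation, f(x, y) the conditional crash-location probability per unit area near the site, and A the effective facility target area. The form is stated here as a common convention, and all numerical values are placeholders, not data from this document.

```python
def crash_frequency(categories):
    """Sum N * P * f_xy * A over aircraft categories; returns expected crashes per year."""
    return sum(N * P * f_xy * A for (N, P, f_xy, A) in categories)

if __name__ == "__main__":
    categories = [
        # (operations/yr, crash prob. per operation, location prob. per mi^2, area in mi^2)
        (50_000, 4e-8, 3e-3, 1e-2),
        (10_000, 1e-7, 2e-3, 1e-2),
    ]
    print(f"{crash_frequency(categories):.1e} crashes/yr")
```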

  14. Integrated cost estimation methodology to support high-performance building design

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, Prasad; Greden, Lara; Eijadi, David; McDougall, Tom [The Weidt Group, Minnetonka (United States); Cole, Ray [Axiom Engineers, Monterey (United States)

    2007-07-01

    Design teams evaluating the performance of energy conservation measures (ECMs) calculate energy savings rigorously with established modelling protocols, accounting for the interaction between various measures. However, incremental cost calculations do not have a similar rigor. Often there is no recognition of cost reductions with integrated design, nor is there assessment of cost interactions amongst measures. This lack of rigor feeds the notion that high-performance buildings cost more, creating a barrier for design teams pursuing aggressive high-performance outcomes. This study proposes an alternative integrated methodology to arrive at a lower perceived incremental cost for improved energy performance. The methodology is based on the use of energy simulations as a means towards integrated design and cost estimation. Various points along the spectrum of integration are identified and characterized by the amount of design effort invested, the scheduling of effort, and relative energy performance of the resultant design. It includes a study of the interactions between building system parameters as they relate to capital costs. Several cost interactions amongst energy measures are found to be significant. The value of this approach is demonstrated with alternatives in a case study that shows the differences between perceived costs for energy measures along various points on the integration spectrum. These alternatives show design tradeoffs and identify how decisions would have been different with a standard costing approach. Areas of further research to make the methodology more robust are identified. Policy measures to encourage the integrated approach and reduce the barriers towards improved energy performance are discussed.

  15. Influence of natural organic matter (NOM) coatings on nanoparticle adsorption onto supported lipid bilayers.

    Science.gov (United States)

    Bo, Zhang; Avsar, Saziye Yorulmaz; Corliss, Michael K; Chung, Minsub; Cho, Nam-Joon

    2017-10-05

    As the worldwide usage of nanoparticles in commercial products continues to increase, there is growing concern about the environmental risks that nanoparticles pose to biological systems, including potential damage to cellular membranes. A detailed understanding of how different types of nanoparticles behave in environmentally relevant conditions is imperative for predicting and mitigating potential membrane-associated toxicities. Herein, we investigated the adsorption of two popular nanoparticles (silver and buckminsterfullerene) onto biomimetic supported lipid bilayers of varying membrane charge (positive and negative). The quartz crystal microbalance-dissipation (QCM-D) measurement technique was employed to track the adsorption kinetics. Particular attention was focused on understanding how natural organic matter (NOM) coatings affect nanoparticle-bilayer interactions. Both types of nanoparticles preferentially adsorbed onto the positively charged bilayers, although NOM coatings on the nanoparticle and lipid bilayer surfaces could either inhibit or promote adsorption in certain electrolyte conditions. While past findings showed that NOM coatings inhibit membrane adhesion, our findings demonstrate that the effects of NOM coatings are more nuanced depending on the type of nanoparticle and electrolyte condition. Taken together, the results demonstrate that NOM coatings can modulate the lipid membrane interactions of various nanoparticles, suggesting a possible way to improve the environmental safety of nanoparticles. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Views from inside the "Black Box": A Q-Methodology Study of Mentoring Support for Entrepreneurs

    Science.gov (United States)

    Stanigar, Jennifer Jill

    2016-01-01

    Aspiring entrepreneurs give and receive support in growth-fostering interactions with seasoned entrepreneurs, mentors, peers, and others. This dissertation investigates viewpoints held by entrepreneurs about their experiences of effective mentoring support. Little is known about how an entrepreneur learns through interacting with different…

  17. An ultrasonic methodology for muscle cross section measurement to support space flight

    Science.gov (United States)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short- and long-term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated with studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sample size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal
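
    A simplified sketch of how boundary reflection points from a 360° scan could be converted to a cross-sectional area: with radii r_i sampled at equally spaced rotation angles, the enclosed area is approximated by the polar form A ≈ 0.5 · Σ r_i² · Δθ. This is a generic reconstruction step for illustration, not the authors' exact processing chain; the radii below are synthetic.

```python
import math

def cross_section_area(radii_cm):
    """Approximate enclosed area (cm^2) from equally spaced boundary radii over 360 degrees."""
    dtheta = 2.0 * math.pi / len(radii_cm)
    return 0.5 * sum(r * r for r in radii_cm) * dtheta

if __name__ == "__main__":
    # 144 rotation stations around an ellipse-like limb cross section (synthetic radii)
    radii = [5.0 + 1.0 * math.cos(2.0 * math.pi * i / 144) for i in range(144)]
    print(f"{cross_section_area(radii):.1f} cm^2")   # ~25.5 * pi ~ 80.1 cm^2
```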

  18. Building an integrated methodology of learning that can optimally support improvements in healthcare.

    Science.gov (United States)

    Lynn, Joanne

    2011-04-01

    The methods for healthcare reform are strikingly underdeveloped, with much reliance on political power. A methodology that combined methods from sources such as clinical trials, experience-based wisdom, and improvement science could be among the aims of the upcoming work in the USA on comparative effectiveness and on the agenda of the Center for Medicare and Medicaid Innovation in the Centers for Medicare and Medicaid Services. Those working in quality improvement have an unusual opportunity to generate substantial input into these processes through professional organisations such as the Academy for Healthcare Improvement and dominant leadership organisations such as the Institute for Healthcare Improvement.

  19. Directed Graph Methodology for Acquisition Path Analysis: a possible tool to support the state-level approach

    International Nuclear Information System (INIS)

    Vincze, Arpad; Nemeth, Andras

    2013-01-01

    According to a recent statement, the IAEA seeks to develop a more effective safeguards system to achieve greater deterrence, because deterrence of proliferation is much more effective than detection. To achieve this goal, a less predictable safeguards system is being developed based on the advanced state-level approach, which is driven by all available safeguards-relevant information. 'Directed graph analysis' is recommended as a possible methodology for implementing acquisition path analysis by the IAEA to support the State evaluation process. The basic methodology is simple, well established and powerful, and its adaptation to the modelling of the nuclear profile of a State requires minimal software development. Based on this methodology, a material flow network model has been developed under the Hungarian Support Programme to the IAEA, which is described in detail. In the proposed model, materials in different chemical and physical forms can flow through pipes representing declared processes, material transports, diversions or undeclared processes. The nodes of the network are the material types, while the edges of the network are the pipes. A state parameter (p) is assigned to each node and edge, representing the probability of their existence in the State. The possible application of this model in the State-level analytical approach will be discussed and an outlook for further work will be given. The paper is followed by the slides of the presentation.
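
    A minimal sketch of the directed-graph idea described above: nodes are material types, directed edges are "pipes" (declared or undeclared processes and transports), and each edge carries a state parameter p for the probability that it exists in the State. Here a path's plausibility is taken as the product of its edge probabilities; the node names, edges and p values are illustrative, not taken from the Hungarian model.

```python
def acquisition_paths(graph, source, target, path=None):
    """Yield (path, probability) for every acyclic route from source to target.

    graph maps a material type to a list of (next material type, edge probability p).
    """
    path = path or [source]
    if source == target:
        yield path, 1.0
        return
    for nxt, p in graph.get(source, []):
        if nxt in path:                      # skip cycles
            continue
        for route, route_p in acquisition_paths(graph, nxt, target, path + [nxt]):
            yield route, p * route_p

if __name__ == "__main__":
    graph = {
        "natural U": [("UF6", 0.9)],
        "UF6": [("LEU", 0.8), ("HEU", 0.05)],   # declared enrichment vs. undeclared route
        "LEU": [("HEU", 0.1)],
    }
    for route, prob in acquisition_paths(graph, "natural U", "HEU"):
        print(" -> ".join(route), f"p = {prob:.3f}")
```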

  20. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: analyze existing margins of safety provided by the heavy-water reactor (HWR) design when challenged by postulated severe accidents; compare measures of risk to the general public and onsite workers against guideline values, as well as against those posed by commercial reactor operation; and develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: a variable baseline plant configuration and power level, and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

  1. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    Directory of Open Access Journals (Sweden)

    Andi Saptono

    2009-09-01

    Full Text Available The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. Keywords: Telerehabilitation, Information Management, Infrastructure Development Methodology, Videoconferencing, Online Portal, Database

  2. Equity, land register and government of the territory. A methodological proposal to support the Public Administration

    Directory of Open Access Journals (Sweden)

    Rocco Curto

    2013-12-01

    Full Text Available The paper presents a research project submitted to the MIUR (Ministry of Education, University and Research) in response to the Notice of PRIN (Scientific Research Programmes of Relevant National Interest) for the year 2012. The "Fairness, Land register and territory government" project addresses the core issue of fair property taxation and, in particular, how such fairness can be guaranteed in Italy only through a revision of the land register estimates. In Italy, in fact, land register values have been completely disconnected from the real market values of the assets and, therefore, from their characteristics and quality. The project aims to define the most appropriate methodology for the revision of the land register values of the entire national heritage, one that satisfies the requirements of scientific rigour while remaining applicable in practice. It considers the timing of the estimation of the values as a specific step in a broader methodology, which also addresses the issue of technological infrastructures and databases. It conceives the land register, with its databases, as the heart of a LIS (Land Information System), and considers the process of revising the estimates from the perspective of providing the basis for a more modern property taxation, able to recognise and delimit territorially the dynamics of values and, in particular, those that manifest themselves in the form of exogenous monetary factors produced by public interventions, whether large projects and/or infrastructure developments.

  3. Supporting the Future Total Force: A Methodology for Evaluating Potential Air National Guard Mission Assignments

    National Research Council Canada - National Science Library

    Lynch, Kristin F; Drew, John G; Sleeper, Sally; Williams, William A; Masters, James M; Luangkesorn, Louis; Tripp, Robert S; Lichter, Dahlia S; Roll, Charles R

    2007-01-01

    Manpower end-strength reductions of active duty personnel in the U.S. Air Force are making it more difficult to support the air and space expeditionary force construct using current force employment practices...

  4. Microbial chlorination of organic matter in forest soil: investigation using 36Cl-chloride and its methodology.

    Science.gov (United States)

    Rohlenová, J; Gryndler, M; Forczek, S T; Fuksová, K; Handova, V; Matucha, M

    2009-05-15

    Chloride, which enters the forest ecosystem largely from the sea as aerosol (and has in the past been assumed to be inert), causes chlorination of soil organic matter. Studies of this chlorination showed that the content of organically bound chlorine in temperate forest soils is higher than that of chloride, and that various chlorinated compounds are produced. Our study of the chlorination of organic matter in the fermentation horizon of forest soil, using the radioisotope 36Cl and tracer techniques, shows that microbial chlorination clearly prevails over abiotic chlorination, with the chlorination of soil organic matter being enzymatically mediated and proportional to chloride content and time. Long-term (>100 days) chlorination leads to more stable chlorinated substances in the organic layer of forest soil (over time, chlorine is bound progressively more firmly in humic acids), and volatile organochlorines are formed. Penetration of chloride into microorganisms can be documented by the freezing/thawing technique. Chloride absorption by microorganisms in soil and in litter residues in the fermentation horizon complicates the analysis of 36Cl-chlorinated soil. The results show that the analytical procedure used should be tested for every soil type under study.

  5. Business Planning Methodology to Support the Development of Strategic Academic Programs

    Science.gov (United States)

    Philbin, Simon P.; Mallo, Charles A.

    2016-01-01

    Higher education institutions are often required to design and deliver a range of strategic academic programs in order to remain competitive, support growth and ensure operations are financially sustainable. Such programs may include the creation of new research centers and institutes as well as the installation of major new research facilities.…

  6. A Methodology for Building Faculty Support for the United Nations Principles for Responsible Management Education

    Science.gov (United States)

    Maloni, Michael J.; Smith, Shane D.; Napshin, Stuart

    2012-01-01

    Evidence from extant literature indicates that faculty support is a critical driver for implementing the United Nations Principles for Responsible Management Education (PRME), particularly for schools pursuing an advanced, cross-disciplinary level of sustainability integration. However, there is limited existing research offering insight into how…

  7. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk make this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
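
    An illustrative sketch (not the dissertation's actual model) of the game-theoretic screening idea: two firms each choose a project type, a payoff matrix captures the market outcome of each strategy pair, and pure-strategy Nash equilibria are the cells where neither firm gains by unilaterally switching. All payoffs are placeholder values.

```python
def pure_nash(payoffs):
    """Return the (row, column) cells that are pure-strategy Nash equilibria.

    payoffs[i][j] = (payoff to firm A, payoff to firm B) when A plays i and B plays j.
    """
    n_rows, n_cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for i in range(n_rows):
        for j in range(n_cols):
            a, b = payoffs[i][j]
            a_is_best = all(payoffs[k][j][0] <= a for k in range(n_rows))
            b_is_best = all(payoffs[i][k][1] <= b for k in range(n_cols))
            if a_is_best and b_is_best:
                equilibria.append((i, j))
    return equilibria

if __name__ == "__main__":
    # strategy 0 = develop a new core architecture, 1 = offer a derivative engine
    payoffs = [[(3, 3), (6, 2)],
               [(2, 6), (4, 4)]]
    print(pure_nash(payoffs))   # -> [(0, 0)]: both firms develop new cores
```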

  8. Assessing Confidence in Performance Assessments Using an Evidence Support Logic Methodology: An Application of Tesla

    International Nuclear Information System (INIS)

    Egan, M.; Paulley, A.; Lehman, L.; Lowe, J.; Rochette, E.; Baker, St.

    2009-01-01

    The assessment of uncertainties and their implications is a key requirement when undertaking performance assessment (PA) of radioactive waste facilities. Decisions based on the outcome of such assessments become translated into judgments about confidence in the information they provide. This confidence, in turn, depends on uncertainties in the underlying evidence. Even if there is a large amount of information supporting an assessment, it may be only partially relevant, incomplete or less than completely reliable. In order to develop a measure of confidence in the outcome, sources of uncertainty need to be identified and adequately addressed in the development of the PA, or in any overarching strategic decision-making processes. This paper describes a trial application of the technique of Evidence Support Logic (ESL), which has been designed for application in support of 'high stakes' decisions where important aspects of system performance are subject to uncertainty. The aims of ESL are to identify the amount of uncertainty or conflict associated with evidence relating to a particular decision, and to guide understanding of how evidence combines to support confidence in judgments. Elicitation techniques are used to enable participants in the process to develop a logical hypothesis model that best represents the relationships of the different sources of evidence to the proposition under examination. The aim is to identify key areas of subjectivity and other sources of potential bias in the use of evidence (whether for or against the proposition) to support judgments of confidence. Propagation algorithms are used to investigate the overall implications of the logic according to the strength of the underlying evidence and associated uncertainties. (authors)
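
    A deliberately simplified sketch of the kind of propagation the record refers to: "evidence for" and "evidence against" values are rolled up a hypothesis tree through weighted children, with anything not assigned to either side treated as uncommitted. The actual ESL/TESLA algorithms are more sophisticated; the tree, weights and values here are invented for illustration.

```python
def propagate(node):
    """Return (evidence_for, evidence_against) in [0, 1] for a node.

    A node is either a leaf {"for": f, "against": a} or an internal node
    {"children": [(weight, child), ...]} whose weights sum to 1.
    """
    if "children" not in node:
        return node["for"], node["against"]
    ev_for = ev_against = 0.0
    for weight, child in node["children"]:
        cf, ca = propagate(child)
        ev_for += weight * cf
        ev_against += weight * ca
    return ev_for, ev_against

if __name__ == "__main__":
    hypothesis = {"children": [
        (0.5, {"for": 0.8, "against": 0.1}),   # strong supporting line of evidence
        (0.3, {"for": 0.4, "against": 0.2}),
        (0.2, {"for": 0.2, "against": 0.5}),   # a conflicting line of evidence
    ]}
    f, a = propagate(hypothesis)
    print(f"for = {f:.2f}, against = {a:.2f}, uncommitted = {1 - f - a:.2f}")
```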

  9. Digital System Categorization Methodology to Support Integration of Digital Instrumentation and Control Models into PRAs

    International Nuclear Information System (INIS)

    Arndt, Steven A.

    2011-01-01

    It has been suggested that by categorizing the various digital systems used in safety critical applications in nuclear power plants, it would be possible to determine which systems should be modeled in the analysis of the larger plant-wide PRA, at what level of detail the digital system should be modeled, and using which methods. The research reported in this paper develops a categorization method using system attributes to permit a modeler to more effectively model the systems that will likely have the most critical contributions to overall plant safety and to more effectively model system interactions for those digital systems where the interactions are most important to the overall accuracy and completeness of the plant PRA. The proposed methodology will categorize digital systems based on certain attributes of the systems themselves and how they will be used in the specific application. This will help determine which digital systems need to be modeled and at what level of detail, and can be used to guide PRA analysis and regulatory reviews. The three-attribute categorization strategy that was proposed by Arndt is used as the basis for the categorization methodology developed here. The first attribute, digital system complexity, is based on Type II interactions defined by Aldemir and an overall digital system size and complexity index. The size and complexity indices used are previously defined software complexity metrics. Potential sub-attributes of digital system complexity include design complexity, software complexity, hardware complexity, system function complexity and testability. The second attribute, digital system interactions/inter-conductivity, is a combination of Rushby's coupling and Aldemir's Type I interactions. Digital systems that are loosely coupled and/or have very few Type I interactions would not interact dynamically with the overall system and would have a low interactions/inter-conductivity score. Potential sub-attributes of digital system
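
    Purely as an illustration of the categorization idea (the thresholds and the mapping are assumptions, not the paper's rules), a toy scoring function over the two attributes named above, each scored on a 0-1 scale, returning a suggested level of PRA modelling detail:

```python
def modelling_detail(complexity: float, interconnectivity: float) -> str:
    """Map two attribute scores in [0, 1] to a suggested modelling level (illustrative)."""
    score = 0.5 * complexity + 0.5 * interconnectivity
    if score >= 0.7:
        return "detailed dynamic model (interactions explicitly modelled)"
    if score >= 0.4:
        return "intermediate model (key interactions only)"
    return "simple static model (component-level failure data)"

if __name__ == "__main__":
    print(modelling_detail(0.9, 0.8))   # complex, tightly coupled system
    print(modelling_detail(0.3, 0.1))   # simple, loosely coupled system
```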

  10. Digital System Categorization Methodology to Support Integration of Digital Instrumentation and Control Models into PRAs

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, Steven A. [U.S. Nuclear Regulatory Commission, Washington D.C. (United States)

    2011-08-15

    It has been suggested that by categorizing the various digital systems used in safety critical applications in nuclear power plants, it would be possible to determine which systems should be modeled in the analysis of the larger plant-wide PRA, at what level of detail the digital system should be modeled, and using which methods. The research reported in this paper develops a categorization method using system attributes to permit a modeler to more effectively model the systems that will likely have the most critical contributions to overall plant safety and to more effectively model system interactions for those digital systems where the interactions are most important to the overall accuracy and completeness of the plant PRA. The proposed methodology will categorize digital systems based on certain attributes of the systems themselves and how they will be used in the specific application. This will help determine which digital systems need to be modeled and at what level of detail, and can be used to guide PRA analysis and regulatory reviews. The three-attribute categorization strategy that was proposed by Arndt is used as the basis for the categorization methodology developed here. The first attribute, digital system complexity, is based on Type II interactions defined by Aldemir and an overall digital system size and complexity index. The size and complexity indices used are previously defined software complexity metrics. Potential sub-attributes of digital system complexity include design complexity, software complexity, hardware complexity, system function complexity and testability. The second attribute, digital system interactions/inter-conductivity, is a combination of Rushby's coupling and Aldemir's Type I interactions. Digital systems that are loosely coupled and/or have very few Type I interactions would not interact dynamically with the overall system and would have a low interactions/inter-conductivity score. Potential sub-attributes of

  11. When Family-Supportive Supervision Matters: Relations between Multiple Sources of Support and Work-Family Balance

    Science.gov (United States)

    Greenhaus, Jeffrey H.; Ziegert, Jonathan C.; Allen, Tammy D.

    2012-01-01

    This study examines the mechanisms by which family-supportive supervision is related to employee work-family balance. Based on a sample of 170 business professionals, we found that the positive relation between family-supportive supervision and balance was fully mediated by work interference with family (WIF) and partially mediated by family…

  12. Integrating cost information with health management support system: an enhanced methodology to assess health care quality drivers.

    Science.gov (United States)

    Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H

    1999-08-01

    Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.

  13. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    Science.gov (United States)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that with comparisons of kWh or
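
    A minimal sketch of the inventory-to-energy-potential step such a tool performs: each feedstock has an annual quantity and an energy content, and the technical energy potential is aggregated across feedstock types. The feedstock names, quantities and conversion factors below are illustrative placeholders, not New Jersey data.

```python
FEEDSTOCKS = {
    # feedstock: (dry tonnes per year, net energy content in GJ per dry tonne) -- placeholders
    "food-processing residues": (120_000, 15.0),
    "landfill-diverted wood": (80_000, 18.0),
    "dedicated energy crops": (40_000, 17.0),
}

def energy_potential_gwh(feedstocks):
    """Aggregate annual technical energy potential in GWh (1 GWh = 3600 GJ)."""
    return sum(tonnes * gj_per_tonne for tonnes, gj_per_tonne in feedstocks.values()) / 3600.0

if __name__ == "__main__":
    print(f"{energy_potential_gwh(FEEDSTOCKS):.0f} GWh/yr")
```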

  14. A Methodology for Decision Support for Implementation of Cloud Computing IT Services

    Directory of Open Access Journals (Sweden)

    Adela Tušanová

    2014-07-01

    Full Text Available The paper deals with the decision of small and medium-sized software companies on the transition to a SaaS model. The goal of the research is to design a comprehensive methodology to support decision making based on the company's own actual data. Based on a careful analysis, a taxonomy of costs, revenue streams and decision-making criteria is proposed in the paper. On the basis of multi-criteria decision-making methods, each alternative is evaluated and the alternative with the highest score is identified as the most appropriate. The proposed methodology is implemented as a web application and verified through case studies.
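
    A minimal sketch of the multi-criteria scoring step described above, using a simple weighted-sum model: each alternative is scored per criterion on a normalised 0-1 scale (higher is better), criterion weights sum to 1, and the alternative with the highest total is selected. The criteria, weights and scores are illustrative, not the paper's data.

```python
CRITERIA_WEIGHTS = {"cost": 0.4, "time to market": 0.3, "scalability": 0.3}

ALTERNATIVES = {
    "stay on-premise":      {"cost": 0.6, "time to market": 0.4, "scalability": 0.3},
    "partial SaaS":         {"cost": 0.7, "time to market": 0.6, "scalability": 0.6},
    "full SaaS transition": {"cost": 0.5, "time to market": 0.8, "scalability": 0.9},
}

def best_alternative(alternatives, weights):
    """Return (name, score) of the alternative with the highest weighted-sum score."""
    scores = {
        name: sum(weights[criterion] * score for criterion, score in criteria.items())
        for name, criteria in alternatives.items()
    }
    return max(scores.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    print(best_alternative(ALTERNATIVES, CRITERIA_WEIGHTS))
```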

  15. Architecture-Level Exploration of Alternative Interconnection Schemes Targeting 3D FPGAs: A Software-Supported Methodology

    Directory of Open Access Journals (Sweden)

    Kostas Siozios

    2008-01-01

    Full Text Available In current reconfigurable architectures, the interconnection structures increasingly contribute more to the delay and power consumption. The demand for increased clock frequencies and logic density (smaller area footprint) makes the problem even more important. Three-dimensional (3D) architectures are able to alleviate this problem by accommodating a number of functional layers, each of which might be fabricated in a different technology. However, the benefits of such integration technology have not been sufficiently explored yet. In this paper, we propose a software-supported methodology for exploring and evaluating alternative interconnection schemes for 3D FPGAs. In order to support the proposed methodology, three new CAD tools were developed (part of the 3D MEANDER Design Framework). During our exploration, we study the impact of vertical interconnection between functional layers on a number of design parameters. More specifically, the average gains in operation frequency, power consumption, and wirelength are 35%, 32%, and 13%, respectively, compared to existing 2D FPGAs with identical logic resources. Also, we achieve an 8% higher utilization ratio for the vertical interconnections compared to existing approaches for designing 3D FPGAs, leading to cheaper and more reliable devices.

  16. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-03-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed the identification of the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release are performed. The analysis showed that possible deposition fractions of 10-11 over the Kola Peninsula, and 10-12 - 10-13 for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  17. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-06-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed the identification of the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  18. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: 'Kursk' submarine study

    Directory of Open Access Journals (Sweden)

    A. Baklanov

    2003-01-01

    Full Text Available There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed the identification of the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  19. Clarifying the links between social support and health: culture, stress, and neuroticism matter.

    Science.gov (United States)

    Park, Jiyoung; Kitayama, Shinobu; Karasawa, Mayumi; Curhan, Katherine; Markus, Hazel R; Kawakami, Norito; Miyamoto, Yuri; Love, Gayle D; Coe, Christopher L; Ryff, Carol D

    2013-02-01

    Although it is commonly assumed that social support positively predicts health, the empirical evidence has been inconsistent. We argue that three moderating factors must be considered: (1) support-approving norms (cultural context); (2) support-requiring situations (stressful events); and (3) support-accepting personal style (low neuroticism). Our large-scale cross-cultural survey of Japanese and US adults found significant associations between perceived support and health. The association was more strongly evident among Japanese (from a support-approving cultural context) who reported high life stress (in a support-requiring situation). Moreover, the link between support and health was especially pronounced if these Japanese were low in neuroticism.

  20. PASSCAL Instrument Center Support for Cryoseismology: Methodologies, Challenges, Development and Instrumentation

    Science.gov (United States)

    Beaudoin, B. C.; Anderson, K. R.; Bilek, S. L.; Carpenter, P.; Childs, D.; Chung, P.; Huerta, A. D.; Lingutla, N.; Nikolaus, K.; Winberry, J. P.

    2017-12-01

    Remote portable seismic stations are, in most cases, constrained by logistics and cost. High-latitude operations introduce environmental, technical and logistical challenges that require substantially more engineering work to ensure robust, high-quality data return. Since 2006, IRIS PASSCAL has been funded by NSF to develop, deploy, and maintain a pool of polar-specific seismic stations. At roughly the same time, PASSCAL began supporting experiments specifically targeting glacier dynamics, such as the mechanisms of subglacial hydrology, basal shear stress, ice stream stick-slip mechanisms, and glacier seismicity. Although much of the development for high-latitude deployments was directly applicable to cryoseismology, these new experiments introduced a unique series of challenges including high ablation, standing water, and moving stations. Our polar development objectives have focused on: reducing station power requirements, size and weight; extending the operational temperature of a station; simplifying logistics; engineering solutions that are cost effective, manufacturable, serviceable and reusable; and developing high-latitude communications for both state-of-health and data transmission. To these ends, PASSCAL continues testing new power storage technology, refining established power systems for lighter and smaller power banks, and exploring telemetry solutions to increase high-bandwidth communication options and abilities for remote seismic stations. Further enhancing PASSCAL's ability to support cryoseismology is a recent NSF-funded collaborative effort led by Central Washington University, joined by IRIS and New Mexico Tech, to build a Geophysical Earth Observatory for Ice Covered Environments (GEOICE). The GEOICE instrument, power system and other integrated ancillary components are designed to require minimal installation time and logistical load (i.e., size and weight), while maximizing ease-of-use in the field and optimizing costs of instrumentation and

  1. Methodological congruence in phylogenomic analyses with morphological support for teiid lizards (Sauria: Teiidae).

    Science.gov (United States)

    Tucker, Derek B; Colli, Guarino R; Giugliano, Lilian G; Hedges, S Blair; Hendry, Catriona R; Lemmon, Emily Moriarty; Lemmon, Alan R; Sites, Jack W; Pyron, R Alexander

    2016-10-01

    A well-known issue in phylogenetics is discordance among gene trees, species trees, morphology, and other data types. Gene-tree discordance is often caused by incomplete lineage sorting, lateral gene transfer, and gene duplication. Multispecies-coalescent methods can account for incomplete lineage sorting and are believed by many to be more accurate than concatenation. However, simulation studies and empirical data have demonstrated that concatenation and species tree methods often recover similar topologies. We use three popular methods of phylogenetic reconstruction (one concatenation, two species tree) to evaluate relationships within Teiidae. These lizards are distributed from the United States to Argentina and across the West Indies, and their classification has been controversial due to incomplete sampling and the discordance among various character types (chromosomes, DNA, musculature, osteology, etc.) used to reconstruct phylogenetic relationships. Recent morphological and molecular analyses of the group resurrected three genera and created five new genera to resolve non-monophyly in three historically ill-defined genera: Ameiva, Cnemidophorus, and Tupinambis. Here, we assess the phylogenetic relationships of the Teiidae using "next-generation" anchored-phylogenomics sequencing. Our final alignment includes 316 loci (488,656 bp of DNA) for 244 individuals (56 species of teiids, representing all currently recognized genera), and all three methods (ExaML, MP-EST, and ASTRAL-II) recovered essentially identical topologies. Our results are broadly in agreement with recent results from morphology and smaller molecular datasets, showing support for monophyly of the eight new genera. Interestingly, even with hundreds of loci, the relationships among some genera in Tupinambinae remain ambiguous (i.e. low nodal support for the position of Salvator and Dracaena). Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Support vector regression methodology for estimating global solar radiation in Algeria

    Science.gov (United States)

    Guermoui, Mawloud; Rabehi, Abdelaziz; Gairaa, Kacem; Benkaciali, Said

    2018-01-01

    Accurate estimation of Daily Global Solar Radiation (DGSR) has been a major goal for solar energy applications. In this paper we show the possibility of developing a simple model based on Support Vector Regression (SVM-R) that could be used to estimate DGSR on the horizontal surface in Algeria using only the sunshine ratio as input. The SVM-R model was developed and tested using a data set recorded over three years (2005-2007). The data were collected at the Applied Research Unit for Renewable Energies (URAER) in Ghardaïa. The data collected in 2005-2006 are used to train the model, while the 2007 data are used to test its performance. The measured and estimated values of DGSR were compared statistically during the testing phase using the Root Mean Square Error (RMSE), relative Root Mean Square Error (rRMSE), and correlation coefficient (r2), which amount to 1.59 MJ/m2, 8.46 and 97.4%, respectively. The obtained results show that the SVM-R model is well suited for DGSR estimation using only the sunshine ratio.
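
    A hedged sketch of the modelling approach described above: support vector regression with the sunshine ratio as the single input and DGSR as the output, using scikit-learn. The data below are synthetic placeholders so the example is self-contained; the study itself used Ghardaïa measurements (2005-2006 for training, 2007 for testing).

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# synthetic "measurements": DGSR (MJ/m^2) roughly increasing with sunshine ratio
sunshine_ratio = rng.uniform(0.1, 1.0, size=300)
dgsr = 5.0 + 22.0 * sunshine_ratio + rng.normal(0.0, 1.5, size=300)

X_train, y_train = sunshine_ratio[:200, None], dgsr[:200]
X_test, y_test = sunshine_ratio[200:, None], dgsr[200:]

model = SVR(kernel="rbf", C=10.0, epsilon=0.5)   # hyperparameters chosen for illustration
model.fit(X_train, y_train)

pred = model.predict(X_test)
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"RMSE = {rmse:.2f} MJ/m^2, r = {np.corrcoef(y_test, pred)[0, 1]:.3f}")
```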

  3. Decision support methodology for national energy planning in developing countries: an implementation focused approach

    Science.gov (United States)

    Lee, Nathan Coenen

    etching. In this case, an excimer laser was used. Extremely thin fiber tips were obtained, with an ultra-high sensitivity to strain. The other technique employed to fabricate the fiber Bragg gratings was point-by-point femtosecond laser inscription. In this case, the sensing elements are very stable at high temperatures and can be used to measure strain in harsh conditions. The employment of optical fiber lasers as sensing elements was also considered in this Thesis. Two laser cavities were studied, one based on a ring configuration and the other on a figure-of-eight configuration. From these works, the quality of the laser emission, namely the signal-to-noise ratio, the reduced full-width at half maximum and the stability, should be highlighted. These characteristics allowed the measurement of different physical parameters, such as strain, temperature and torsion. Lastly, the possibility of using microspheres as sensing elements was considered. Using the electric arc of a fusion splicer, it is possible to create microspheres at the tip of an optical fiber. Furthermore, with this technique chains of microspheres can be obtained, constituting Mach-Zehnder-type interferometers which are sensitive to physical parameters like strain and temperature. The preliminary results obtained by introducing silica microspheres in a support structure are also presented. In this case, the sensors were subjected to temperature variations. All the experimental work was combined with the respective theoretical considerations. Many questions have been raised over the course of this PhD, and there are still some without a definite answer. Thus, new research paths can be followed, having their basis grounded in the configurations presented here.

  4. Clarifying the links between social support and health: Culture, stress, and neuroticism matter

    Science.gov (United States)

    Park, Jiyoung; Kitayama, Shinobu; Karasawa, Mayumi; Curhan, Katherine; Markus, Hazel R; Kawakami, Norito; Miyamoto, Yuri; Love, Gayle D; Coe, Christopher L; Ryff, Carol D

    2012-01-01

    Although it is commonly assumed that social support positively predicts health, the empirical evidence has been inconsistent. We argue that three moderating factors must be considered: (1) support-approving norms (cultural context); (2) support-requiring situations (stressful events); and (3) support-accepting personal style (low neuroticism). Our large-scale cross-cultural survey of Japanese and US adults found significant associations between perceived support and health. The association was more strongly evident among Japanese (from a support-approving cultural context) who reported high life stress (in a support-requiring situation). Moreover, the link between support and health was especially pronounced if these Japanese were low in neuroticism. PMID:22419414

  5. Improvement in Product Development: Use of back-end data to support upstream efforts of Robust Design Methodology

    Directory of Open Access Journals (Sweden)

    Vanajah Siva

    2012-12-01

    Full Text Available In the area of Robust Design Methodology (RDM), less has been done on how to use and work with data from the back-end of the product development process to support upstream improvement. The purpose of this paper is to suggest RDM practices for the use of customer claims data in early design phases as a basis for improvements. The back-end data, when systematically analyzed and fed back into the product development process, aid in closing the product development loop from claims to improvement in the design phase. This is proposed through a flow of claims data analysis tied to an existing tool, namely Failure Mode and Effects Analysis (FMEA). The systematic and integrated analysis of back-end data is suggested as an upstream effort of RDM to increase understanding of noise factors during product usage, based on the feedback of claims data to FMEA, and to address continuous improvement in product development.

  6. Application of response surface methodology for optimization of natural organic matter degradation by UV/H2O2 advanced oxidation process.

    Science.gov (United States)

    Rezaee, Reza; Maleki, Afshin; Jafari, Ali; Mazloomi, Sajad; Zandsalimi, Yahya; Mahvi, Amir H

    2014-01-01

    In this research, the removal of natural organic matter from aqueous solutions using an advanced oxidation process (UV/H2O2) was evaluated. Response surface methodology with a Box-Behnken design matrix was employed to design the experiments and to determine the optimal conditions. The effects of various parameters such as initial H2O2 concentration (100-180 mg/L), pH (3-11), time (10-30 min) and initial total organic carbon (TOC) concentration (4-10 mg/L) were studied. Analysis of variance (ANOVA) revealed good agreement between the experimental data and the proposed quadratic polynomial model (R(2) = 0.98). Experimental results showed that TOC removal efficiency increased with increasing H2O2 concentration and time, and with decreasing initial TOC concentration. Neutral and slightly acidic pH values also improved TOC removal. The optimum, a TOC removal efficiency of 78.02%, was predicted for an H2O2 concentration of 100 mg/L, pH 6.12, a reaction time of 22.42 min and an initial TOC concentration of 4 mg/L. Further confirmation tests under these optimal conditions yielded 76.50% TOC removal and confirmed that the model is in accordance with the experiments. In addition, TOC removal for natural water under the optimum conditions identified by response surface methodology was 62.15%. This study showed that response surface methodology based on the Box-Behnken design is a useful tool for optimizing the operating parameters for TOC removal using the UV/H2O2 process.
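
    The quadratic response-surface model underlying a Box-Behnken analysis can be written as y = b0 + sum(bi*xi) + sum(bij*xi*xj) + sum(bii*xi^2). The sketch below fits such a model by least squares to hypothetical factor settings and TOC-removal responses; all numbers are invented and serve only to illustrate the procedure, not to reproduce the study's data.

      # Sketch of fitting the quadratic response-surface model used with
      # Box-Behnken designs. Factor settings and responses are made up.
      import numpy as np
      from itertools import combinations

      # Columns: H2O2 (mg/L), pH, time (min), initial TOC (mg/L)
      X_raw = np.array([
          [100, 3, 10, 4], [180, 3, 20, 7], [100, 11, 30, 10], [180, 11, 10, 4],
          [140, 7, 20, 7], [140, 7, 20, 7], [100, 7, 30, 7], [180, 7, 10, 10],
          [140, 3, 30, 4], [140, 11, 10, 10], [180, 11, 30, 7], [100, 3, 20, 10],
          [140, 7, 10, 4], [180, 3, 30, 10], [100, 11, 20, 4], [140, 11, 30, 4],
          [100, 7, 10, 7], [180, 7, 20, 4], [140, 3, 20, 10],
      ], dtype=float)
      y = np.array([55, 62, 60, 70, 72, 71, 65, 58, 68, 52,
                    66, 48, 74, 57, 69, 63, 59, 73, 54], dtype=float)

      # Code factors to [-1, 1], as is customary in response surface methodology
      lo, hi = X_raw.min(axis=0), X_raw.max(axis=0)
      X = 2 * (X_raw - lo) / (hi - lo) - 1

      def quadratic_design(X):
          cols = [np.ones(len(X))]                      # intercept
          cols += [X[:, i] for i in range(X.shape[1])]  # linear terms
          cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
          cols += [X[:, i] ** 2 for i in range(X.shape[1])]   # squared terms
          return np.column_stack(cols)

      D = quadratic_design(X)
      beta, *_ = np.linalg.lstsq(D, y, rcond=None)      # least-squares coefficients
      y_hat = D @ beta
      r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
      print(f"R^2 of quadratic model: {r2:.3f}")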

  7. Measuring soil organic matter turn over and carbon stabilisation in pasture soils using 13C enrichment methodology.

    Science.gov (United States)

    Robinson, J. M.; Barker, S.; Schipper, L. A.

    2017-12-01

    Carbon storage in soil is a balance between photosynthesis and respiration; however, not all C compounds decompose equally in soil. Soil C consists of several fractions, ranging from accessible C (rapidly cycling) to stored or protected C (slow cycling). The key to increasing C storage is the transfer of soil C from the accessible fraction, where it can be easily lost through microbial degradation, into the more stable fraction. With the increasing use of isotope enrichment techniques, 13C may be used to trace the movement of newly incorporated carbon in soil and examine how land management practices affect carbon storage. A laboratory method was developed to rapidly analyse soil-respired CO2 for δ13C to determine the temperature sensitivity of newly incorporated 13C-enriched carbon. A Horotiu silt loam (2 mm sieved, 60% MWHC) was mixed with 13C-enriched ryegrass/clover plant matter in Hungate tubes and incubated for 5 hours at 20 temperatures (4-50 °C) using a temperature gradient method (Robinson, J. M., et al., 2017, Biogeochemistry, 13, 101-112). The respired CO2 was analysed using a modified Los Gatos off-axis ICOS carbon dioxide analyser. This method was able to analyse the δ13C signature of respired CO2 as long as a minimum concentration of CO2 was produced per tube. Further analysis used a two-component mixing model to separate the CO2 into its source components and determine the contribution of added C and soil to total respiration. Preliminary data showed that decomposition of both sources of C was temperature dependent. Overall, this method is a relatively quick and easy way to analyse the δ13C of respired soil CO2 samples, and will allow testing of the effects of multiple variables on the decomposition of carbon fractions in future use.
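
    The two-component mixing step mentioned above reduces to a single relation: the fraction of respired CO2 derived from the added, 13C-enriched plant matter is f = (d_sample - d_soil) / (d_added - d_soil), where d denotes δ13C. A small illustrative sketch with placeholder δ13C values (not measured data) is given below.

      # Two-end-member mixing model for partitioning respired CO2 between
      # native soil organic matter and added 13C-enriched plant material.
      # All delta-13C values below are illustrative placeholders.

      def fraction_from_added(delta_sample, delta_soil, delta_added):
          """Fraction of respired CO2-C derived from the added (enriched) source."""
          if delta_added == delta_soil:
              raise ValueError("End members must differ in delta-13C")
          f = (delta_sample - delta_soil) / (delta_added - delta_soil)
          return min(max(f, 0.0), 1.0)   # clamp small analytical over/undershoot

      delta_soil = -27.0     # per mil, unamended soil respiration (assumed)
      delta_added = 150.0    # per mil, 13C-enriched ryegrass/clover (assumed)

      for temperature, delta_sample in [(10, -5.0), (25, 20.0), (40, 38.0)]:
          f_added = fraction_from_added(delta_sample, delta_soil, delta_added)
          print(f"{temperature:2d} degC: {100 * f_added:4.1f}% of respired C from added plant matter")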

  8. Investigating 3S Synergies to Support Infrastructure Development and Risk-Informed Methodologies for 3S by Design

    International Nuclear Information System (INIS)

    Suzuki, M.; Izumi, Y.; Kimoto, T.; Naoi, Y.; Inoue, T.; Hoffheins, B.

    2010-01-01

    In 2008, Japan and other G8 countries pledged to support the Safeguards, Safety, and Security (3S) Initiative to raise awareness of 3S worldwide and to assist countries in setting up nuclear energy infrastructures that are essential cornerstones of a successful nuclear energy program. The goals of the 3S initiative are to ensure that countries already using nuclear energy or those planning to use nuclear energy are supported by strong national programs in safety, security, and safeguards not only for reliability and viability of the programs, but also to prove to the international audience that the programs are purely peaceful and that nuclear material is properly handled, accounted for, and protected. In support of this initiative, the Japan Atomic Energy Agency (JAEA) has been conducting detailed analyses of the R and D programs and cultures of each of the 'S' areas to identify overlaps where synergism and efficiencies might be realized, to determine where there are gaps in the development of a mature 3S culture, and to coordinate efforts with other Japanese and international organizations. As an initial outcome of this study, incoming JAEA employees are being introduced to 3S as part of their induction training and the idea of a President's Award program is being evaluated. Furthermore, some overlaps in 3S missions might be exploited to share facility instrumentation, as with Joint-Use-Equipment (JUE), in which cameras and radiation detectors are shared by the State and the IAEA. Lessons learned in these activities can be applied to developing more efficient and effective 3S infrastructures for incorporating into Safeguards by Design methodologies. They will also be useful in supporting human resources and technology development projects associated with Japan's planned nuclear security center for Asia, which was announced during the 2010 Nuclear Security Summit. In this presentation, a risk-informed approach regarding integration of 3S will be introduced. An initial

  9. Solid Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Supported by a generous quantity of full-color illustrations and interesting sidebars, Solid Matter introduces the basic characteristics and properties of solid matter. It briefly describes the cosmic connection of the elements, leading readers through several key events in human pre-history that resulted in more advanced uses of matter in the solid state. Chapters include: Solid Matter: An Initial Perspective; Physical Behavior of Matter; The Gravity of Matter; Fundamentals of Materials Science; Rocks and Minerals; Metals; Building Materials; Carbon, Earth's Most Versatile Element; S

  10. Laboratory Calibration Studies in Support of ORGANICS on the International Space Station: Evolution of Organic Matter in Space

    Science.gov (United States)

    Ruiterkamp, R.; Ehrenfreund, P.; Halasinski, T.; Salama, F.; Foing, B.; Schmidt, W.

    2002-01-01

    This paper describes the scientific overview and current status of ORGANICS, an exposure experiment performed on the International Space Station (ISS) to study the evolution of organic matter in space (PI: P. Ehrenfreund), with supporting laboratory experiments performed at NASA Ames. ORGANICS investigates the chemical evolution of samples subjected to long-duration exposure to the space environment in near-Earth orbit. This experiment will provide information on the nature, evolution, and survival of carbon species in the interstellar medium (ISM) and in solar system targets.

  11. White Matter Tracts Connected to the Medial Temporal Lobe Support the Development of Mnemonic Control.

    Science.gov (United States)

    Wendelken, Carter; Lee, Joshua K; Pospisil, Jacqueline; Sastre, Marcos; Ross, Julia M; Bunge, Silvia A; Ghetti, Simona

    2015-09-01

    One of the most important factors driving the development of memory during childhood is mnemonic control, or the capacity to initiate and maintain the processes that guide encoding and retrieval operations. The ability to selectively attend to and encode relevant stimuli is a particularly useful form of mnemonic control, and is one that undergoes marked improvement over childhood. We hypothesized that structural integrity of white matter tracts, in particular those connecting medial temporal lobe memory regions to other cortical areas, and/or those connecting frontal and parietal control regions, should contribute to successful mnemonic control. To test this hypothesis, we examined the relationship between structural integrity of selected white matter tracts and an experimental measure of mnemonic control, involving enhancement of memory by attention at encoding, in 116 children aged 7-11 and 25 young adults. We observed a positive relationship between integrity of uncinate fasciculus and mnemonic enhancement across age groups. In adults, but not in children, we also observed an association between mnemonic enhancement and integrity of ventral cingulum bundle and ventral fornix/fimbria. Integrity of fronto-parietal tracts, including dorsal cingulum and superior longitudinal fasciculus, was unrelated to mnemonic enhancement. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  12. Inclusion of products of physicochemical oxidation of organic wastes in matter recycling of biological-technical life support systems.

    Science.gov (United States)

    Tikhomirov, Alexander A.; Kudenko, Yurii; Trifonov, Sergei; Ushakova, Sofya

    Inclusion of the products of 'wet' incineration of human and plant wastes in an H2O2 medium using alternating current into the matter recycling of a biological-technical life support system (BTLSS) has been considered. Liquid and gaseous components have been shown to be the products of such processing. In particular, the final product contained all nitrogen forms necessary for plant cultivation: NO2, NO3 and NH4+. When the base solution included urine, the NH4+ form dominated. At mineralization of human solid wastes, the NO2 and NH4+ forms were registered in approximately equal amounts. A comparative analysis of the mineral composition of oxidized human wastes and standard Knop solutions has been carried out. On the grounds of that analysis, dilution methods have been suggested for solutions prepared with the addition of oxidized human wastes, for their further use in plant irrigation. Reasonable levels of wheat productivity were obtained when the plants were cultivated with these solutions. CO2, N2 and O2 were determined to be the main components of the gas mixture emitted during this process. These gases are easily integrated into the matter recycling process of a closed ecosystem. Data were also obtained on the feasibility of cultivating plants in the atmosphere resulting from closure of a gas loop that included the physicochemical facility and a vegetation chamber containing plants representative of the LSS phototrophic unit. A conclusion on advancing research toward creation of a matter recycling process in the integrated physical-chemical-biological model system has been drawn.

  13. The effect of support on Internet-delivered treatment for insomnia: Does baseline depression severity matter?

    NARCIS (Netherlands)

    Lancee, J.; Sorbi, M.J.; Eisma, M.C.; van Straten, A.; van den Bout, J.

    2014-01-01

    Internet-delivered cognitive-behavioral treatment is effective for insomnia. However, little is known about the beneficial effects of support. Recently we demonstrated that motivational support moderately improved the effects of Internet-delivered treatment for insomnia. In the present study, we

  14. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents. Final report

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    Full text: The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended where value tradeoffs are postponed until the very last stage of the decision process. Use of efficient frontiers is made to exclude all technically inferior solutions and present the decision maker with all non-dominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed where the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application on the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: First a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; Next the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and, (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of European Communities through CEPN. (author)
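
    To illustrate the efficient-frontier idea described above, the sketch below filters a set of hypothetical protective-measure alternatives down to the non-dominated ones in a two-criterion (total cost, residual dose) consequence space; the alternatives and their scores are invented for illustration and are not taken from the report.

      # Sketch: keep only non-dominated (Pareto-efficient) alternatives when both
      # criteria -- total cost and residual dose -- are to be minimised.
      # The alternatives and their scores are hypothetical.

      alternatives = {
          "no countermeasures":        (0.0, 100.0),   # (cost, residual dose), arbitrary units
          "improve living conditions": (40.0, 55.0),
          "partial relocation":        (70.0, 30.0),
          "full relocation":           (120.0, 10.0),
          "dominated option":          (90.0, 60.0),   # worse than 'partial relocation' on both
      }

      def dominates(a, b):
          """a dominates b if a is no worse on every criterion and better on at least one."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      efficient = {
          name: score
          for name, score in alternatives.items()
          if not any(dominates(other, score) for other in alternatives.values())
      }

      for name, (cost, dose) in sorted(efficient.items(), key=lambda kv: kv[1][0]):
          print(f"{name:28s} cost={cost:6.1f}  residual dose={dose:5.1f}")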

  15. Does Government Support for Private Innovation Matter? Firm-Level Evidence from Turkey and Poland

    OpenAIRE

    Wojciech Grabowski; Teoman Pamukcu; Krzysztof Szczygielski; Sinan Tandogan

    2013-01-01

    The aim of the project is to analyze government support for innovation in a comparative perspective by first examining the main existing instruments of financial support for innovation in Turkey and Poland, and secondly to assess their effectiveness by applying recent econometric techniques to firm-level data for both countries obtained from the Community Innovation Survey (CIS). Comparing Turkey to Poland is both meaningful and promising from a policy-analysis point of view. Both countries a...

  16. Trauma, social support, family conflict, and chronic pain in recent service veterans: does gender matter?

    Science.gov (United States)

    Driscoll, Mary A; Higgins, Diana M; Seng, Elizabeth K; Buta, Eugenia; Goulet, Joseph L; Heapy, Alicia A; Kerns, Robert D; Brandt, Cynthia A; Haskell, Sally G

    2015-06-01

    Women veterans have a higher prevalence of chronic pain relative to men. One hypothesis is that differential combat and traumatic sexual experiences and attenuated levels of social support between men and women may differentially contribute to the development and perpetuation of pain. This investigation examined [1] gender differences in trauma, social support, and family conflict among veterans with chronic pain, and [2] whether trauma, social support, and family conflict were differentially associated with pain severity, pain interference, and depressive symptom severity as a function of gender. Participants included 460 veterans (56% female) who served in support of recent conflicts and who endorsed pain lasting 3 months or longer. Participants completed a baseline survey during participation in a longitudinal investigation. Self-report measures included pain severity, pain interference, depressive symptom severity, exposure to traumatic life events, emotional and tangible support, and family conflict. Relative to men, women veterans reporting chronic pain evidenced higher rates of childhood interpersonal trauma (51% vs 34%) and military sexual trauma (54% vs 3%). Gender moderated the associations of social support, trauma, and family conflict with pain interference. It also moderated family conflict in the prediction of depressive symptoms. Results underscore the potential importance of developing and testing gender-specific models of chronic pain that consider the relative roles of trauma, social support, and family conflict. Wiley Periodicals, Inc.

  17. An ultrasonic methodology for in-service inspection of shell weld of core support structure in a sodium cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Anish, E-mail: anish@igcar.gov.in; Rajkumar, K.V.; Sharma, Govind K.; Dhayalan, R.; Jayakumar, T.

    2015-02-15

    Highlights: • We demonstrate a novel ultrasonic methodology for in-service inspection of the shell weld of the core support structure in a sodium cooled fast breeder reactor. • The methodology comprises inspection of the shell weld, immersed in sodium, from the outside surface of the main vessel using ultrasonic guided waves. • The formation and propagation of guided wave modes are validated by finite element simulation of the inspection methodology. • A defect down to 20% of the 30 mm thick wall (∼6 mm) in the shell weld can be detected reliably using the developed methodology. - Abstract: The paper presents a novel ultrasonic methodology developed for in-service inspection (ISI) of the shell weld of the core support structure of the main vessel of the 500 MWe prototype fast breeder reactor (PFBR). The methodology comprises inspection of the shell weld, immersed in sodium, from the outside surface of the main vessel using a normal beam longitudinal wave ultrasonic transducer. Because of the curvature in the knuckle region of the main vessel, the normal beam longitudinal wave enters the support shell plate at an angle and forms guided waves by mode conversion and multiple reflections from the boundaries of the shell plate. Hence, this methodology can be used to detect defects in the shell weld of the core support structure. The successful demonstration of the methodology on a mock-up sector made of stainless steel indicated that an artificial defect down to 20% of the 30 mm thick wall (∼6 mm) in the shell weld can be detected reliably.

  18. Autonomy support and control in weight management: what important others do and say matters.

    Science.gov (United States)

    Ng, Johan Y Y; Ntoumanis, Nikos; Thøgersen-Ntoumani, Cecilie

    2014-09-01

    Drawing from self-determination theory (Ryan & Deci, 2002, Overview of self-determination theory: An organismic-dialectical perspective. In E. L. Deci & R. M. Ryan (Eds.), Handbook of self-determination research (pp. 3-33). Rochester, NY: The University of Rochester Press.), we examined how individuals' psychological needs, motivation, and behaviours (i.e., physical activity and eating) associated with weight management could be predicted by perceptions of their important others' supportive and controlling behaviours. Using a cross-sectional survey design, 235 participants (mean age = 27.39 years, SD = 8.96 years) completed an online questionnaire. Statistical analyses showed that when important others were perceived to be more supportive, participants reported higher levels of more optimal forms of motivation for weight management, which in turn predicted more physical activity and healthy eating behaviours. In contrast, when important others were perceived to be controlling, participants reported higher levels of less optimal forms of motivation, which in turn predicted less physical activity and healthy eating behaviours, as well as more unhealthy eating behaviours. Significant indirect effects were also found from perceived support and control from important others to physical activity and eating behaviours, all in the expected directions. The findings support the importance of important others providing support and refraining from controlling behaviours in order to facilitate motivation and behaviours conducive to successful weight management. What is already known on this subject? Autonomy support is related to basic need satisfaction and autonomous motivation in the context of weight management. In turn, these variables are related to adaptive outcomes for weight management. What does this study add? Measurement of perceived controlling behaviours by important others. Measurement of perceived need thwarting. Structural model on how important others affect

  19. Investigation of optimal seismic design methodology for piping systems supported by elasto-plastic dampers. Part 2. Applicability for seismic waves with various frequency characteristics

    International Nuclear Information System (INIS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    2010-01-01

    In this study, the applicability of a previously developed optimal seismic design methodology, which can consider the structural integrity of not only piping systems but also elasto-plastic supporting devices, is studied for seismic waves with various frequency characteristics. This methodology employs a genetic algorithm and can search the optimal conditions such as the supporting location and the capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Numerical simulations are performed using a simple piping system model. As a result, it is shown that the proposed optimal seismic design methodology is applicable to the seismic design of piping systems subjected to seismic waves with various frequency characteristics. The mechanism of optimization is also clarified. (author)
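
    The abstract names a genetic algorithm but gives no implementation details. The sketch below is a toy illustration, not the authors' model: it evolves two damper parameters (capacity and stiffness) against an invented penalty function standing in for the piping-response evaluation; the parameter ranges, operators and constants are all assumptions.

      # Toy genetic-algorithm sketch for choosing elasto-plastic damper parameters
      # (capacity, stiffness). The fitness function is an invented stand-in for the
      # piping/damper response evaluation described in the abstract.
      import numpy as np

      rng = np.random.default_rng(1)

      BOUNDS = np.array([[1.0, 50.0],      # damper capacity (kN), assumed range
                         [0.1, 10.0]])     # damper stiffness (kN/mm), assumed range

      def fitness(ind):
          capacity, stiffness = ind
          # Invented penalty: trade piping response (favours stiff/strong dampers)
          # against damper demand (penalises oversizing).
          piping_response = 100.0 / (1.0 + 0.8 * capacity + 2.0 * stiffness)
          damper_demand = 0.05 * capacity + 0.5 * stiffness
          return -(piping_response + damper_demand)    # higher fitness = better design

      def random_population(n):
          return rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(n, 2))

      def evolve(pop, generations=60, mutation=0.1):
          for _ in range(generations):
              scores = np.array([fitness(ind) for ind in pop])
              parents = pop[np.argsort(scores)[::-1][: len(pop) // 2]]   # truncation selection
              children = []
              while len(children) < len(pop) - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  alpha = rng.random()
                  child = alpha * a + (1 - alpha) * b                     # blend crossover
                  child += rng.normal(0, mutation, size=2) * (BOUNDS[:, 1] - BOUNDS[:, 0])
                  children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
              pop = np.vstack([parents, children])
          return max(pop, key=fitness)

      best = evolve(random_population(40))
      print(f"best capacity = {best[0]:.1f} kN, best stiffness = {best[1]:.2f} kN/mm")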

  20. A national-scale remote sensing-based methodology for quantifying tidal marsh biomass to support "Blue Carbon" accounting

    Science.gov (United States)

    Byrd, K. B.; Ballanti, L.; Nguyen, D.; Simard, M.; Thomas, N.; Windham-Myers, L.; Castaneda, E.; Kroeger, K. D.; Gonneea, M. E.; O'Keefe Suttles, J.; Megonigal, P.; Troxler, T.; Schile, L. M.; Davis, M.; Woo, I.

    2016-12-01

    According to 2013 IPCC Wetlands Supplement guidelines, tidal marsh Tier 2 or Tier 3 accounting must include aboveground biomass carbon stock changes. To support this need, we are using free satellite and aerial imagery to develop a national scale, consistent remote sensing-based methodology for quantifying tidal marsh aboveground biomass. We are determining the extent to which additional satellite data will increase the accuracy of this "blue carbon" accounting. Working in 6 U.S. estuaries (Cape Cod, MA, Chesapeake Bay, MD, Everglades, FL, Mississippi Delta, LA, San Francisco Bay, CA, and Puget Sound, WA), we built a tidal marsh biomass dataset (n=2404). Landsat reflectance data were matched spatially and temporally with field plots using Google Earth Engine. We quantified percent cover of green vegetation, non-vegetation, and open water in Landsat pixels using segmentation of 1m National Agriculture Imagery Program aerial imagery. Sentinel-1A C-band backscatter data were used in Chesapeake, Mississippi Delta and Puget Sound. We tested multiple Landsat vegetation indices and Sentinel backscatter metrics in 30m scale biomass linear regression models by region. Scaling biomass by fraction green vegetation significantly improved biomass estimation (e.g. Cape Cod: R2 = 0.06 vs. R2 = 0.60, n=28). The best vegetation indices differed by region, though indices based on the shortwave infrared-1 and red bands were most predictive in the Everglades and the Mississippi Delta, while the soil adjusted vegetation index was most predictive in Puget Sound and Chesapeake. Backscatter metrics significantly improved model predictions over vegetation indices alone; consistently across regions, the most significant metric was the range in backscatter values within the green vegetation segment of the Landsat pixel (e.g. Mississippi Delta: R2 = 0.47 vs. R2 = 0.59, n=15). Results support using remote sensing of biomass stock change to estimate greenhouse gas emission factors in tidal
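
    As a purely illustrative sketch of the pixel-level workflow described above (not the project's actual processing chain), the snippet below computes a soil-adjusted vegetation index from hypothetical Landsat red/NIR reflectances, scales it by the fraction of green vegetation in each pixel, and fits a simple linear biomass model; all values and the choice of SAVI are assumptions.

      # Sketch: scale a Landsat vegetation index by fraction green vegetation and
      # regress field biomass on it. All reflectance and biomass values are invented.
      import numpy as np

      # Hypothetical per-plot Landsat surface reflectance (red, NIR) and fraction of
      # green vegetation in the 30 m pixel (e.g. from 1 m NAIP segmentation).
      red     = np.array([0.08, 0.10, 0.06, 0.12, 0.09, 0.07, 0.11, 0.05])
      nir     = np.array([0.35, 0.30, 0.42, 0.25, 0.33, 0.40, 0.28, 0.45])
      f_green = np.array([0.90, 0.60, 0.95, 0.40, 0.70, 0.85, 0.50, 1.00])
      biomass = np.array([820, 510, 980, 300, 640, 900, 410, 1050])  # g/m^2, field plots

      def savi(red, nir, L=0.5):
          """Soil-adjusted vegetation index."""
          return (1 + L) * (nir - red) / (nir + red + L)

      x = savi(red, nir) * f_green            # index scaled by fraction green vegetation
      A = np.column_stack([np.ones_like(x), x])
      (beta0, beta1), *_ = np.linalg.lstsq(A, biomass, rcond=None)

      pred = beta0 + beta1 * x
      r2 = 1 - ((biomass - pred) ** 2).sum() / ((biomass - biomass.mean()) ** 2).sum()
      print(f"biomass ~ {beta0:.0f} + {beta1:.0f} * (SAVI * f_green),  R^2 = {r2:.2f}")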

  1. Nonlinear Methodologies for Identifying Seismic Event and Nuclear Explosion Using Random Forest, Support Vector Machine, and Naive Bayes Classification

    Directory of Open Access Journals (Sweden)

    Longjun Dong

    2014-01-01

    Full Text Available The discrimination of seismic events and nuclear explosions is a complex and nonlinear problem. The nonlinear methodologies Random Forests (RF), Support Vector Machines (SVM), and the Naïve Bayes Classifier (NBC) were applied to discriminate seismic events. Twenty earthquakes and twenty-seven explosions, described by nine ratios of the energies contained within predetermined "velocity windows" and the calculated distance, are used in the discriminators. Based on leave-one-out cross-validation, ROC curves, and the calculated accuracy on training and test samples, the discriminating performances of RF, SVM, and NBC were discussed and compared. The RF method clearly shows the best predictive power, with a maximum area under the ROC curve of 0.975 among RF, SVM, and NBC. The discriminant accuracies of RF, SVM, and NBC for the test samples are 92.86%, 85.71%, and 92.86%, respectively. It has been demonstrated that the presented RF model can not only identify seismic events automatically with high accuracy, but can also rank the discriminant indicators according to their calculated weights.
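
    A minimal sketch of the comparison described above is given below, using scikit-learn with synthetic stand-ins for the nine energy ratios and the distance feature; the sample sizes echo the abstract (20 earthquakes, 27 explosions), but the feature values, hyperparameters and resulting scores are illustrative only and do not reproduce the study.

      # Sketch: compare Random Forest, SVM, and naive Bayes for event/explosion
      # discrimination using leave-one-out cross-validation on synthetic features.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.metrics import accuracy_score, roc_auc_score

      rng = np.random.default_rng(7)
      n_eq, n_ex, n_features = 20, 27, 10            # 9 energy ratios + distance

      X = np.vstack([
          rng.normal(0.0, 1.0, (n_eq, n_features)),   # "earthquakes"
          rng.normal(0.8, 1.0, (n_ex, n_features)),   # "explosions" (shifted for illustration)
      ])
      y = np.array([0] * n_eq + [1] * n_ex)

      models = {
          "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
          "SVM (RBF)":     SVC(probability=True, random_state=0),
          "Naive Bayes":   GaussianNB(),
      }

      loo = LeaveOneOut()
      for name, model in models.items():
          proba = cross_val_predict(model, X, y, cv=loo, method="predict_proba")[:, 1]
          acc = accuracy_score(y, proba > 0.5)
          auc = roc_auc_score(y, proba)
          print(f"{name:13s}  LOO accuracy = {acc:.3f}  ROC AUC = {auc:.3f}")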

  2. Evaluation of modulation transfer function of optical lens system by support vector regression methodologies - A comparative study

    Science.gov (United States)

    Petković, Dalibor; Shamshirband, Shahaboddin; Saboohi, Hadi; Ang, Tan Fong; Anuar, Nor Badrul; Rahman, Zulkanain Abdul; Pavlović, Nenad T.

    2014-07-01

    The quantitative assessment of image quality is an important consideration in any type of imaging system. The modulation transfer function (MTF) is a graphical description of the sharpness and contrast of an imaging system or of its individual components; it is also known as the spatial frequency response. The MTF curve has different meanings according to the corresponding frequency. The MTF of an optical system specifies the contrast transmitted by the system as a function of image size, and is determined by the inherent optical properties of the system. In this study, the polynomial and radial basis function (RBF) kernels are applied as the kernel function of Support Vector Regression (SVR) to estimate and predict the MTF value of an actual optical system according to experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR_rbf approach compared to the SVR_poly soft-computing methodology.
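
    The following sketch illustrates the kernel comparison described above on a synthetic MTF-versus-spatial-frequency curve; the curve shape, the train/test split and the hyperparameters are assumptions made for illustration, not the study's measured data or settings.

      # Sketch: compare polynomial and RBF kernels in SVR on a synthetic
      # MTF-versus-spatial-frequency curve.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(3)
      freq = np.linspace(0, 1, 120).reshape(-1, 1)              # normalised spatial frequency
      mtf = np.exp(-3.0 * freq.ravel()) * np.cos(0.5 * np.pi * freq.ravel()) ** 2
      mtf_noisy = mtf + rng.normal(0, 0.01, mtf.shape)          # "measured" values

      train = rng.random(len(freq)) < 0.7                       # random train/test split
      models = {
          "SVR_poly": SVR(kernel="poly", degree=3, C=10.0, epsilon=0.005),
          "SVR_rbf":  SVR(kernel="rbf", gamma=5.0, C=10.0, epsilon=0.005),
      }

      for name, model in models.items():
          model.fit(freq[train], mtf_noisy[train])
          pred = model.predict(freq[~train])
          rmse = mean_squared_error(mtf[~train], pred) ** 0.5
          print(f"{name}: test RMSE vs. noise-free MTF = {rmse:.4f}")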

  3. The Racial Divide in Support for the Death Penalty: Does White Racism Matter?

    Science.gov (United States)

    Unnever, James D.; Cullen, Francis T.

    2007-01-01

    Using data from the 2000 National Election Study, this research investigates the sources of the racial divide in support for capital punishment with a specific focus on white racism. After delineating a measure of white racism, we explore whether it can account for why a majority of African Americans oppose the death penalty while most whites…

  4. Does the severity of disability matter? : The opinion of parents about professional support in residential facilities

    NARCIS (Netherlands)

    Luijkx, J.; Ten Brug, A.; Vlaskamp, C.

    BACKGROUND: Researchers have shown that the characteristics of a person with an intellectual disability (ID), in particular the severity of the disability, are related to the outcomes of professional support. Hardly any studies have asked parents and/or legal guardians for their own opinion about

  5. Empirically Supported Psychotherapy in Social Work Training Programs: Does the Definition of Evidence Matter?

    Science.gov (United States)

    Bledsoe, Sarah E.; Weissman, Myrna M.; Mullen, Edward J.; Ponniah, Kathryn; Gameroff, Marc J.; Verdeli, Helen; Mufson, Laura; Fitterling, Heidi; Wickramaratne, Priya

    2007-01-01

    Objectives: A national survey finds that 62% of social work programs do not require didactic and clinical supervision in any empirically supported psychotherapy (EST). The authors report the results of analysis of national survey data using two alternative classifications of EST to determine if the results are because of the definition of EST used…

  6. Gender difference in support for democracy in Sub-Saharan Africa: Do social institutions matter?

    NARCIS (Netherlands)

    Konte, M.

    2014-01-01

    Little investigation has been made to explain why women are less likely than are men to support democracy in Sub-Saharan Africa. This gender difference in politics has been found in numerous studies and may hinder the much needed legitimation of democracy in this region. This paper addresses the

  7. Why power matters: creating a foundation of mutual support in couple relationships.

    Science.gov (United States)

    Knudson-Martin, Carmen

    2013-03-01

    Research shows that equal power helps couples create intimacy and relationship success. However, though couples increasingly desire equal relationships, cultural models of mutual support are not well developed. Clinicians often approach heterosexual couple therapy as though partners are inherently equal, thus reinforcing unacknowledged gender inequities. This article examines research that shows why power imbalances are destructive to intimate relationships and focuses on four gender-related aspects of mutual support: (a) shared relational responsibility, (b) mutual vulnerability, (c) mutual attunement, and (d) shared influence. Case examples illustrate how socio-emotional attunement, interrupting the flow of power, and introducing alternative relational experience help couple therapists identify and address power disparities in these important relational processes. Encouraging the powerful person to take relational initiative and introducing alternative gender discourse are especially important. © FPI, Inc.

  8. Polygamy and poor mental health among Arab Bedouin women: do socioeconomic position and social support matter?

    Science.gov (United States)

    Daoud, Nihaya; Shoham-Vardi, Ilana; Urquia, Marcelo Louis; O'Campo, Patricia

    2014-08-01

    Polygamy is a complex phenomenon and a product of power relations, with deep cultural, social, economic, and political roots. Despite being banned in many countries, the practice persists and has been associated with women's marginalization and mental health sequelae. In this study, we sought to improve understanding of this ongoing, complex phenomenon by examining the contribution of socioeconomic position (SEP) and social support to the excess of depressive symptoms (DS) and poor self-rated health (SRH) among women in polygamous marriages compared to women in monogamous marriages. Measuring the contribution of these factors could facilitate policies and interventions aimed at protecting women's mental health. The study was conducted among a sample of Arab Bedouin women living in a marginalized community in southern Israel (N=464, age 18-50). The women were personally interviewed in 2008-2009. We then used logistic regression models to calculate the contribution of SEP (as defined by the women's education, family SEP, and household characteristics) and social support to excess of depressive symptoms and poor SRH among participants in polygamous versus monogamous marriages. About 23% of the participants were in polygamous marriages. These women reported almost twice the odds of depressive symptoms (OR=1.91, 95%CI=1.22, 2.99) and poorer SRH (OR=1.73, 95%CI=1.10, 2.72) than those in monogamous marriages. Women's education changed these associations slightly, but family SEP and household characteristics resulted in virtually no further change. Social support reduced the odds for poor SRH and DS by about 23% and 28%, respectively. Polygamy is associated with higher risk for poor mental health of women regardless of their SEP and education. Social support seems to have some protective effect.

  9. Does perceived teacher affective support matter for middle school students in mathematics classrooms?

    Science.gov (United States)

    Sakiz, Gonul; Pape, Stephen J; Hoy, Anita Woolfolk

    2012-04-01

    The purpose of the present study was to explore the importance of perceived teacher affective support in relation to sense of belonging, academic enjoyment, academic hopelessness, academic self-efficacy, and academic effort in middle school mathematics classrooms. A self-report survey was administered to 317 seventh- and eighth-grade students in 5 public middle schools. Structural equation modeling indicated significant associations between perceived teacher affective support and middle school students' motivational, emotional, and behavioral outcomes. The structural model explained a significant proportion of variance in students' sense of belonging (42%), academic enjoyment (43%), self-efficacy beliefs (43%), academic hopelessness (18%), and academic effort (32%) in mathematics classrooms. In addition to providing the basis for a concise new measure of perceived teacher affective support, these findings point to the importance of students' perceptions of the affective climate within learning environments for promoting academic enjoyment, academic self-efficacy, and academic effort in mathematics. Copyright © 2011 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  10. Family and peer support matter for precoital and coital behaviors among adolescents in Lima

    Science.gov (United States)

    Bayer, Angela M.; Cabrera, Lilia Z.; Gilman, Robert H.; Hindin, Michelle J.; Tsui, Amy O.

    2015-01-01

    We analyzed the association between sub-scales developed with adolescents and the outcomes of precoital behaviors and vaginal sex in Lima, Peru. Adolescent participants in key informant sessions operationalized concepts identified during qualitative concept mapping into several sub-scales. Face and content validity testing and pilot application with respondent debriefing were used to refine the sub-scales. Three hundred 15–17 year olds were surveyed about the sub-scales, socio-demographics and sexual behaviors. Exploratory factor analysis confirmed six sub-scales, self-image, goals and decision-making, family education, parental rules/control, school support and peer support, which we regressed on the outcomes. Twice as many males as females reported more than three precoital behaviors and vaginal sex. Higher peer support reduced the likelihood of vaginal sex and precoital behaviors and higher family education reduced precoital behaviors. Results affirm the importance of including adolescents in the entire research process and of sex education with family- and peer-based strategies. PMID:25305443

  11. MATHEMATICAL MODEL AND METHODOLOGY FOR CALCULATION OF MINIMIZATION ON TURNING RADIUS OF TRACTOR UNIT WITH REPLACEABLE SUPPORTING AND MANEUVERING DEVICE

    Directory of Open Access Journals (Sweden)

    P. V. Zeleniy

    2016-01-01

    Full Text Available Smooth plowing with reversible plows has replaced the enclosure method of soil treatment, which tends to form back ridges or open furrows. As a result, turns of the tractor unit at the minimum radius, required to ensure shuttle movement into the furrow of the preceding operating pass each time, have become the dominant type of turn. Non-productive shift time depends directly on these turns; on average it amounts to 10-12 %, and it reaches up to 40 % in small field contours with short runs. Much of this non-productive time is connected with the desire to reduce headland width at field edges, in which case a turn is made in several stages using complicated maneuvering. Therefore, increasing the efficiency of a plowing unit by minimizing its turning radius and executing the turn in one stage in the shortest possible time are relevant objectives. It must be taken into account that the potential of universal tractors of established, time-proven design for further reduction of turning radius is practically exhausted. It is therefore expedient to solve the problem by means of additional removable devices that transform the tractor's wheel formula at the end of the run in order to reorient its position. Ultimately, high-quality plowing ensured by advanced reversible plows will be accompanied not only by an increase in output per shift, but also by a decrease in headland width and in headland compaction and abrasion due to suspension systems, and by an increase in productivity. The developed design, whose novelty is confirmed by an invention patent and which represents an additional supporting and maneuvering device, significantly reduces all the above-mentioned disadvantages and does not require any changes in the tractor's production design. Investigations have been carried out on the following topic: "Minimization of turning radius for universal tractors by transformation

  12. CosmoQuest: Supporting Subject Matter Experts in Broadening the Impacts of their Work beyond their Institutional Walls.

    Science.gov (United States)

    Noel-Storr, J.; Buxner, S.; Grier, J.; Gay, P.

    2016-12-01

    CosmoQuest is a virtual research facility, which, like its physical counterparts, provides tools for scientists to acquire reduced data products (thanks to our cadre of citizen scientists working to analyze images and produce results online) and to participate in education and outreach activities, either directly through CosmoQuest activities (such as CosmoAcademy and the Educators' Zone) or with the support of CosmoQuest. Here, we present our strategies to inspire, engage and support Subject Matter Experts (SMEs - scientists, engineers, technologists and mathematicians) in activities outside of their institutions and beyond college classroom teaching. We provide support for SMEs who are interested in increasing the impacts of their science knowledge and expertise by interacting with people online, or in other venues outside of their normal work environment. This includes a broad spectrum of opportunities for those interested in hosting webinars; running short courses for the public; using Facebook, Twitter or other social media to communicate science; or other diverse activities such as supporting an open house, science fair, or star party. As noted by Katheryn Woods-Townsend and colleagues, "...face-to-face interactions with scientists allowed students to view scientists as approachable and normal people, and to begin to understand the range of scientific areas and careers that exist. Scientists viewed the scientist-student interactions as a vehicle for science communication" (2015). As CosmoQuest fosters these relationships, we present a framework for SMEs which combines opportunities for continuing professional development (virtually and in person at conferences) with ongoing online support, creating a dynamic professional learning network. The goal of this is to deepen SME capacity (knowledge, attitudes and behaviors), both encouraging and empowering them to connect to broader audiences in new ways.

  13. Fathers Matter: The Role of Father Autonomy Support and Control in Preschoolers' Executive Function Development

    Science.gov (United States)

    Meuwissen, Alyssa S.; Carlson, Stephanie M.

    2015-01-01

    Although previous work has shown that mothers' parenting influences the development of child executive function (important self-control skills developed in the preschool years) the role of fathers' parenting has not been thoroughly investigated. We observed fathers' autonomy support and control in dyadic play with their 3-year-old children (N pairs = 110), and measured father and child EF independently with laboratory tasks. We found that fathers' controlling parenting was significantly inversely related to the child EF composite, above and beyond family income and child verbal ability. These results are consistent with the hypothesis that fathers are important for the development of EF in their children, and suggest fathers should be included in both research and parenting interventions. PMID:26209884

  14. Methodological exemplar of integrating quantitative and qualitative evidence - supportive care for men with prostate cancer: what are the most important components?

    OpenAIRE

    Huntley, Alyson; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie J; Evans, Maggie A

    2017-01-01

    AIMS: To present a methodological exemplar of integrating findings from a quantitative and qualitative review on the same topic to provide insight into components of care that contribute to supportive care that is acceptable to men with prostate cancer. BACKGROUND: Men with prostate cancer are likely to live a long time with the disease, experience side effects from treatment and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published but the find...

  15. Commentary: Supporting preterm children's parents matters - a reflection on Treyvaud et al. (2016).

    Science.gov (United States)

    Jaekel, Julia

    2016-07-01

    Children born preterm or with low birth weight (LBW) grow up with an increased risk for a range of neurodevelopmental, cognitive, socioemotional, and academic problems. While long-term effects of preterm and LBW birth have traditionally been studied from a deficit perspective, Treyvaud et al. correctly state that the increased risk for impairments in this population urgently requires identification of protective factors. Their new findings add to empirical evidence from observational studies showing that sensitive parenting can protect preterm children from negative developmental outcomes. In order to identify strategies that support preterm children's life chances, well-designed longitudinal studies, such as the one by Treyvaud et al., are indispensable. Next, we will need large randomized trials to test the causality between intervention-induced parenting changes and preterm children's long-term outcomes. We need interdisciplinary and international collaboration to study preterm parent-child dyads within multimethod frameworks and uncover the highly complex mechanisms that shape individual developmental trajectories. © 2016 Association for Child and Adolescent Mental Health.

  16. Family matters: Familial support and science identity formation for African American female STEM majors

    Science.gov (United States)

    Parker, Ashley Dawn

    This research seeks to understand the experiences of African American female undergraduates in STEM. It investigates how familial factors and science identity formation characteristics influence persistence in STEM while considering the duality of African American women's status in society. This phenomenological study was designed using critical race feminism as the theoretical framework to answer the following questions: 1) What role does family play in the experiences of African American women undergraduate STEM majors who attended two universities in the UNC system? 2) What factors impact the formation of science identity for African American women undergraduate STEM majors who attended two universities in the UNC system? Purposive sampling was used to select the participants for this study. The researcher conducted in-depth interviews with 10 African American female undergraduate STEM majors from a predominantly White and a historically Black institution within the state of North Carolina public university system. Findings suggest that African American families and science identity formation influence the STEM experiences of the African American females interviewed in this study. The following five themes emerged from the findings: (1) independence, (2) support, (3) pressure to succeed, (4) adaptations, and (5) race and gender. This study contributes to the literature on African American female students in STEM higher education. The findings of this study produced knowledge regarding policies and practices that can lead to greater academic success and persistence of African American females in higher education in general, and STEM majors in particular. Colleges and universities may benefit from the findings of this study in a way that allows them to develop and sustain programs and policies that attend to the particular concerns and needs of African American women on their campuses. Finally, this research informs both current and future African American female

  17. State and perspectives of methodological support of materials research of products from Zirconium alloys for fuel rods and fuel assemblies of VVER

    International Nuclear Information System (INIS)

    Gusev, A.; Markelov, V.; Novikov, V.; Zheltkovskaya, T.; Malgin, A.; Shevyakov, A.; Bekrenev, S.

    2015-01-01

    The basic methodological framework for the study of the characteristics of zirconium products was created at JSC «VNIINM». The reliability of the experiments is confirmed by the results of metrological certification procedures. Further development of the methodological support of «VNIINM» for research on Zr products lies in the development and validation of methods to determine: mechanical characteristics under internal pressure; the Contractile Strain Ratio (CSR); Expansion Due to Compression (EDC); Plane Strain Tensile (PST) behaviour; resistance to multi-cycle and low-cycle fatigue; texture parameters using the orientation distribution function; and the electrical characteristics of the oxide film by impedance

  18. Classroom Management Strategies to Address the Needs of Sudanese Refugee Learners: Support Document--Methodology and Literature Review

    Science.gov (United States)

    Burgoyne, Ursula; Hull, Oksana

    2007-01-01

    This document presents the methodology and literature review for the research report "Classroom Management Strategies to Address the Needs of Sudanese Refugee Learners" (ED499673), which examined the extent to which English language, literacy and numeracy teachers used classroom management strategies to meet the needs of adult Sudanese…

  19. La Familia: methodological issues in the assessment of perinatal social support for Mexicanas living in the United States.

    Science.gov (United States)

    Clark, L

    2001-11-01

    Do Mexicanas receive social support from a close network of family and friends during the perinatal period? To answer this question, a longitudinal ethnographic study followed 28 urban Mexican-origin women living in the US from their last trimester of pregnancy through their first month post-partum. A total of 93 interviews with Mexicanas focused on health and social support. All of the women lived in a large western city in the US but varied in their acculturation and income levels. Analyses identified four social support themes from women's experience (the emic analysis) and four social support typologies from the researcher (etic) analyses. The kinds of support women described as emanating from their support networks were inductively identified as Helping with Daily Hassles, Showing Love and Understanding, Being There for Me, and My Family Failing Me. Approximately half of the women reported densely supportive networks. The other women were disconnected from their support networks, or dealt with antagonism or instability in their networks. Women's perceptions of social support differed from the judgements made by the researcher about received support. Specifically, women perceived more network members in the supportive category than did the researcher by a factor of 1.4, and fewer network members in the disconnected category by a factor of 0.7. From an emic perspective, women listed only half as many antagonistic network members compared to the etic analysis (a factor of 0.50). These emic/etic discrepancies complicate clinical assessment of social support, but suggest that data on social support should be collected as part of the clinical processes of perinatal risking. To enhance assessment of social support, a clinically relevant guide is proposed for use by practitioners caring for Mexicanas in the perinatal period.

  20. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report concern mainly two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  1. Design-based research as a methodological approach to support participatory engagement of learners in the development of learning technologies

    OpenAIRE

    McDowell, James

    2015-01-01

    Following the origination of the design experiment as a mechanism to introduce learning interventions into the messy conditions of the classroom (Brown, 1992; Collins, 1992), design-based research (DBR) faced criticism from opposing paradigmatic camps before its acknowledgement as a promising methodology in which "formative evaluation plays a significant role" (Dede, Ketelhut, Whitehouse, Breit & McCloskey, 2009, p.16). This session presents a case study of a researcher-practitioner i...

  2. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    Science.gov (United States)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  3. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
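
    As an illustrative companion to the renewal-model calculation described above, the sketch below computes a time-dependent conditional rupture probability from a Brownian passage time (inverse Gaussian) renewal model using scipy; the mean recurrence interval, aperiodicity, elapsed time and forecast window are invented values, and the snippet is not the paper's methodology.

      # Sketch: conditional probability of rupture in the next dT years given the
      # time elapsed since the last event, under a Brownian passage time (inverse
      # Gaussian) renewal model. All parameter values are illustrative.
      from scipy.stats import invgauss

      mean_recurrence = 150.0   # years (assumed)
      aperiodicity = 0.5        # coefficient of variation alpha (assumed)
      elapsed = 120.0           # years since last rupture (assumed)
      window = 30.0             # forecast window, years (assumed)

      # scipy's invgauss(mu, scale) has mean mu*scale and shape lambda = scale;
      # for a BPT model with mean m and aperiodicity a: lambda = m / a**2.
      lam = mean_recurrence / aperiodicity**2
      dist = invgauss(mu=mean_recurrence / lam, scale=lam)

      p_survive = dist.sf(elapsed)                                  # not yet ruptured
      p_conditional = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / p_survive
      print(f"P(rupture in next {window:.0f} yr | {elapsed:.0f} yr elapsed) = {p_conditional:.3f}")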

  4. Review of current design methodologies to improve the safety of roof support systems, particularly in the face area in collieries.

    CSIR Research Space (South Africa)

    Canbulat, I

    1999-05-01

    Full Text Available This research report summarizes an extensive literature survey on roofbolt support design methods used worldwide, and presents the findings of extensive underground roof monitoring conducted in 29 sites at five collieries. An analysis of fall...

  5. CRITICISM AND SUPPORT TO CORPORATE SOCIAL RESPONSIBILITY: AN ETHNOGRAPHIC APPROACH BASED ON THE WORKERS’ EXPERIENCE AND A QUALITATIVE METHODOLOGY PROPOSAL

    Directory of Open Access Journals (Sweden)

    JUAN ANTONIO NAVARRO PRADOS

    2007-01-01

    Full Text Available This paper presents partial results and the process of an investigation that analyzes the experience of a group of employees, namely their experience of the implementation process of the corporate social responsibility policy of a medium-sized service company. For the case study, participant observation, analysis of corporate documents and in-depth interviews with 64 employees across all organisational levels were employed. AtlasTi software was used to analyse and feed back the information received. This analysis produced a matrix of 161 content codes, further analysed by means of network analysis methodology. Eventually, content network data were compared to the corporate sociogram. The investigation has been carried out during the last three years.

  6. Methodological issues involved in conducting qualitative research on support for nurses directly involved with women who chose to terminate their pregnancy

    Directory of Open Access Journals (Sweden)

    Antoinette Gmeiner

    2001-11-01

    Full Text Available The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses’ experience of being directly involved with termination of pregnancies and to develop guidelines for support for these nurses. Summary: The purpose of this article is to describe the methodological issues involved in conducting qualitative research in which nurses’ experience of their direct involvement in termination of pregnancy was explored and described. *Please note: This is a reduced version of the abstract. Please refer to PDF for full text.

  7. UniSchooLabs Toolkit: Tools and Methodologies to Support the Adoption of Universities’ Remote and Virtual Labs in Schools

    Directory of Open Access Journals (Sweden)

    Augusto Chioccariello

    2012-11-01

    Full Text Available The UniSchooLabs project aims at creating an infrastructure supporting web access to remote/virtual labs and associated educational resources to engage learners with hands-on and minds-on activities in science, technology and math in schools. The UniSchooLabs tool-kit supports the teacher in selecting a remote or virtual lab and developing a lab activity based on an inquiry model template. While working with the toolkit the teacher has access to three main features: a a catalogue of available online laboratories; b an archive of activities created by other users; c a tool for creating new activities or reusing existing ones.

  8. Methodological Aspects of the Development of Technological Entrepreneurship and Implementation of Financial Support Tools in Russian Universities

    Directory of Open Access Journals (Sweden)

    Babkinа Irina

    2016-01-01

    Full Text Available This article describes the development of methods for identifying and supporting communication between developers and technological entrepreneurs, with the aim of promoting university research activities. Special attention is paid to stimulating inventive activity and to the university's need to support a developer's evolution from inventor to series development engineer. The importance of the entrepreneurial path is emphasized. Financial tools for attracting alternative funding for university innovative projects (e.g. endowment funds) are analyzed.

  9. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    Science.gov (United States)

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

    In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. The same is true in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories, and the methodological lessons learned from their application. These tools were used in a case study; detailed results of that study are in preparation for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section and are followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.

  10. A methodology for a minimum data set for rare diseases to support national centers of excellence for healthcare and research

    Science.gov (United States)

    Choquet, Rémy; Maaroufi, Meriem; de Carrara, Albane; Messiaen, Claude; Luigi, Emmanuel; Landais, Paul

    2015-01-01

    Background Although rare disease patients make up approximately 6–8% of all patients in Europe, it is often difficult to find the necessary expertise for diagnosis and care and the patient numbers needed for rare disease research. The second French National Plan for Rare Diseases highlighted the necessity for better care coordination and epidemiology for rare diseases. A clinical data standard for normalization and exchange of rare disease patient data was proposed. The original methodology used to build the French national minimum data set (F-MDS-RD) common to the 131 expert rare disease centers is presented. Methods To encourage consensus at a national level for homogeneous data collection at the point of care for rare disease patients, we first identified four national expert groups. We reviewed the scientific literature for rare disease common data elements (CDEs) in order to build the first version of the F-MDS-RD. The French rare disease expert centers validated the data elements (DEs). The resulting F-MDS-RD was reviewed and approved by the National Plan Strategic Committee. It was then represented in an HL7 electronic format to maximize interoperability with electronic health records. Results The F-MDS-RD is composed of 58 DEs in six categories: patient, family history, encounter, condition, medication, and questionnaire. It is HL7 compatible and can use various ontologies for diagnosis or sign encoding. The F-MDS-RD was aligned with other CDE initiatives for rare diseases, thus facilitating potential interconnections between rare disease registries. Conclusions The French F-MDS-RD was defined through national consensus. It can foster better care coordination and facilitate determining rare disease patients’ eligibility for research studies, trials, or cohorts. Since other countries will need to develop their own standards for rare disease data collection, they might benefit from the methods presented here. PMID:25038198
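
    To make the structure of such a minimum data set concrete, the sketch below groups a handful of illustrative data elements into the six categories named in the abstract (patient, family history, encounter, condition, medication, questionnaire). The field names and the validation helper are invented for illustration; they are not the actual 58 F-MDS-RD data elements or their HL7 representation.

    ```python
    # Illustrative only: a toy minimum-data-set record grouped into the six
    # F-MDS-RD categories named above. Field names are hypothetical.
    MINIMUM_DATA_SET = {
        "patient": ["patient_id", "date_of_birth", "sex", "vital_status"],
        "family_history": ["related_patient_id", "relationship", "affected_status"],
        "encounter": ["encounter_date", "expert_center_id", "encounter_type"],
        "condition": ["diagnosis_code", "coding_system", "age_at_onset"],
        "medication": ["drug_code", "start_date", "indication"],
        "questionnaire": ["instrument_name", "score", "completion_date"],
    }

    def missing_elements(record: dict) -> dict:
        """Return, per category, the expected data elements absent from a record."""
        return {
            category: [de for de in elements if de not in record.get(category, {})]
            for category, elements in MINIMUM_DATA_SET.items()
        }

    # Hypothetical, incomplete record to illustrate the completeness check.
    example = {"patient": {"patient_id": "RD-001", "sex": "F"},
               "condition": {"diagnosis_code": "ORPHA:558"}}
    print(missing_elements(example))
    ```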

  11. Extração de matéria orgânica aquática por abaixamento de temperatura: uma metodologia alternativa para manter a identidade da amostra Extraction of aquatic organic matter by temperature decreasing: an alternative methodology to keep the original sample characteristics

    Directory of Open Access Journals (Sweden)

    Rosana N. H. Martins de Almeida

    2003-03-01

    Full Text Available In this work, an alternative methodology was developed for the separation of aquatic organic matter (AOM) present in natural river waters. The process is based on lowering the temperature of the aqueous sample under controlled conditions, which causes the sample to freeze and separates a dark, unfrozen extract rich in organic matter. The results showed that the rate of temperature decrease strongly influences the relative recovery of organic carbon, the enrichment and the separation time of the organic matter present in the water samples. Elemental composition, infrared spectra and thermal analysis results showed that the alternative methodology is as gentle as possible in maintaining the integrity of the sample.

  12. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been provided a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Coasting Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  13. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been provided a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Coasting Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  14. Methodological issues in the design and evaluation of supported communication for aphasia training: a cluster-controlled feasibility study.

    Science.gov (United States)

    Horton, Simon; Clark, Allan; Barton, Garry; Lane, Kathleen; Pomeroy, Valerie M

    2016-04-18

    To assess the feasibility and acceptability of training stroke service staff to provide supported communication for people with moderate-severe aphasia in the acute phase; assess the suitability of outcome measures; collect data to inform sample size and Health Economic evaluation in a definitive trial. Phase II cluster-controlled, observer-blinded feasibility study. In-patient stroke rehabilitation units in the UK matched for bed numbers and staffing were assigned to control and intervention conditions. 70 stroke rehabilitation staff from all professional groups, excluding doctors, were recruited. 20 patients with moderate-severe aphasia were recruited. Supported communication for aphasia training, adapted to the stroke unit context, versus usual care. Training was supplemented by a staff learning log, refresher sessions and provision of communication resources. Feasibility of recruitment and acceptability of the intervention and of measures required to assess outcomes and Health Economic evaluation in a definitive trial. Staff outcomes: Measure of Support in Conversation; patient outcomes: Stroke and Aphasia Quality of Life Scale; Communicative Access Measure for Stroke; Therapy Outcome Measures for aphasia; EQ-5D-3L was used to assess health outcomes. Feasibility of staff recruitment was demonstrated. Training in the intervention was carried out with 28 staff and was found to be acceptable in qualitative reports. 20 patients consented to take part, 6 withdrew. 18 underwent all measures at baseline; 16 at discharge; and 14 at 6-month follow-up. Of 175 patients screened, 71% were deemed ineligible, either lacking capacity or too unwell to participate. Poor completion rates impacted the assessment of patient outcomes. We were able to collect sufficient data at baseline, discharge and follow-up for economic evaluation. The feasibility study informed components of the intervention and implementation in day-to-day practice. Modifications to the design are needed

  15. New methodology for aquifer influx status classification for single wells in a gas reservoir with aquifer support

    Directory of Open Access Journals (Sweden)

    Yong Li

    2016-10-01

    Full Text Available For gas reservoirs with strong bottom or edge aquifer support, the most important task is to avoid aquifer breakthrough in a gas well. Water production in gas wells not only causes processing problems in surface facilities but also markedly reduces well productivity and reservoir recovery. There are many studies on the prediction of water breakthrough time, but they are not completely practicable due to reservoir heterogeneity. This paper provides a new method, together with three diagnostic curves, to identify the aquifer influx status of single gas wells; the curves are based on well production and pressure data. The whole production period of a gas well can be classified into three periods based on the diagnostic curves: no aquifer influx, early aquifer influx, and middle-late aquifer influx. This new method has been used for actual gas well analysis to accurately identify gas well aquifer influx status and the water breakthrough sequence of all wells in the same gas field. Additionally, the evaluation results are significantly beneficial for well production rate optimization and effective gas field development.
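
    The abstract does not reproduce the three diagnostic curves, so the sketch below illustrates only the general idea of flagging aquifer influx from production and pressure data, using a conventional p/z-versus-cumulative-production material-balance check: for a volumetric (no-influx) gas reservoir, p/z declines roughly linearly with cumulative gas produced, and a sustained upward departure from that trend is a common sign of aquifer pressure support. The function, thresholds and data are hypothetical and are not the classification procedure proposed in the paper.

    ```python
    # Hedged sketch: flag possible aquifer pressure support from p/z behaviour.
    # This is a generic material-balance heuristic, not the paper's method.
    import numpy as np

    def aquifer_support_flag(cum_gas, p_over_z, early_fraction=0.4, tolerance=0.03):
        """Fit the early-time p/z trend and report the relative departure of the latest point."""
        cum_gas = np.asarray(cum_gas, dtype=float)
        p_over_z = np.asarray(p_over_z, dtype=float)
        n_early = max(2, int(len(cum_gas) * early_fraction))
        slope, intercept = np.polyfit(cum_gas[:n_early], p_over_z[:n_early], 1)
        predicted = slope * cum_gas + intercept
        departure = (p_over_z - predicted) / predicted
        return "possible aquifer influx" if departure[-1] > tolerance else "no clear influx"

    # Hypothetical well data: p/z flattens at high cumulative production.
    Gp = [0, 10, 20, 30, 40, 50]          # cumulative gas produced
    pz = [300, 285, 270, 258, 250, 246]   # reservoir p/z
    print(aquifer_support_flag(Gp, pz))
    ```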

  16. Monitoring of Bridges by a Laser Pointer: Dynamic Measurement of Support Rotations and Elastic Line Displacements: Methodology and First Test.

    Science.gov (United States)

    Artese, Serena; Achilli, Vladimiro; Zinno, Raffaele

    2018-01-25

    Deck inclination and vertical displacements are among the most important technical parameters to evaluate the health status of a bridge and to verify its bearing capacity. Several methods, both conventional and innovative, are used for structural rotations and displacement monitoring; however, none of these allow, at the same time, precision, automation, static and dynamic monitoring without using high cost instrumentation. The proposed system uses a common laser pointer and image processing. The elastic line inclination is measured by analyzing the single frames of an HD video of the laser beam imprint projected on a flat target. For the image processing, a code was developed in Matlab® that provides instantaneous rotation and displacement of a bridge, charged by a mobile load. An important feature is the synchronization of the load positioning, obtained by a GNSS receiver or by a video. After the calibration procedures, a test was carried out during the movements of a heavy truck maneuvering on a bridge. Data acquisition synchronization allowed us to relate the position of the truck on the deck to inclination and displacements. The inclination of the elastic line at the support was obtained with a precision of 0.01 mrad. The results demonstrate the suitability of the method for dynamic load tests, and the control and monitoring of bridges.

  17. Monitoring of Bridges by a Laser Pointer: Dynamic Measurement of Support Rotations and Elastic Line Displacements: Methodology and First Test

    Directory of Open Access Journals (Sweden)

    Serena Artese

    2018-01-01

    Full Text Available Deck inclination and vertical displacements are among the most important technical parameters to evaluate the health status of a bridge and to verify its bearing capacity. Several methods, both conventional and innovative, are used for structural rotations and displacement monitoring; however, none of these allow, at the same time, precision, automation, static and dynamic monitoring without using high cost instrumentation. The proposed system uses a common laser pointer and image processing. The elastic line inclination is measured by analyzing the single frames of an HD video of the laser beam imprint projected on a flat target. For the image processing, a code was developed in Matlab® that provides instantaneous rotation and displacement of a bridge, charged by a mobile load. An important feature is the synchronization of the load positioning, obtained by a GNSS receiver or by a video. After the calibration procedures, a test was carried out during the movements of a heavy truck maneuvering on a bridge. Data acquisition synchronization allowed us to relate the position of the truck on the deck to inclination and displacements. The inclination of the elastic line at the support was obtained with a precision of 0.01 mrad. The results demonstrate the suitability of the method for dynamic load tests, and the control and monitoring of bridges.
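
    As a rough illustration of the frame-by-frame image processing described above (the authors used a Matlab® code; the OpenCV/Python sketch here is an editorial stand-in), each video frame can be thresholded to isolate the laser imprint, its centroid tracked, and the vertical drift of the centroid converted to a support rotation through the pointer-to-target lever arm. The pixel scale, lever arm and threshold are hypothetical calibration values.

    ```python
    # Hedged sketch (not the authors' Matlab code): track the laser-spot centroid
    # in a video of the target and convert its vertical drift to a rotation.
    import math
    import cv2

    MM_PER_PIXEL = 0.20      # hypothetical target calibration
    LEVER_ARM_MM = 5000.0    # hypothetical distance from laser source to target

    def spot_centroid(frame):
        """Return the (x, y) pixel centroid of the bright laser imprint, or None."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None
        return m["m10"] / m["m00"], m["m01"] / m["m00"]

    def rotations_from_video(path):
        """Yield the rotation (in mrad) of the elastic line at the support, frame by frame."""
        cap = cv2.VideoCapture(path)
        reference = None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            centroid = spot_centroid(frame)
            if centroid is None:
                continue
            if reference is None:
                reference = centroid               # first usable frame sets the zero
            dy_mm = (centroid[1] - reference[1]) * MM_PER_PIXEL
            yield 1000.0 * math.atan2(dy_mm, LEVER_ARM_MM)
        cap.release()
    ```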

  18. School Engagement among Urban Adolescents of Color: Does Perception of Social Support and Neighborhood Safety Really Matter?

    Science.gov (United States)

    Daly, Brian P.; Shin, Richard Q.; Thakral, Charu; Selders, Michael; Vera, Elizabeth

    2009-01-01

    In this study we examined the effects of risk factors (perceived neighborhood crime/delinquency problems, neighborhood incivilities) and protective factors (teacher support, family support, peer support) on the school engagement of 123 urban adolescents of color. Age and gender were also examined to determine if different ages (younger or older)…

  19. Do stress and support matter for caring? The role of perceived stress and social support on expressed emotion of carers of persons with first episode psychosis.

    Science.gov (United States)

    Sadath, Anvar; Muralidhar, D; Varambally, Shivarama; Gangadhar, B N; Jose, Justin P

    2017-02-01

    Caring for a person with first episode psychosis (FEP) is a challenging and distressing task for the carers. The carers' stress in the early stage of psychosis can increase their expressed emotion (EE), while social support is hypothesized to decrease EE. However, the influence of stress and social support on carers' EE is not well understood in FEP. To examine how stress and social support shape expressed emotion in the carers of FEP. Seventy-one carers of patients with non-affective FEP were recruited from the inpatient psychiatry ward of a tertiary mental health care center in South India. The family questionnaire, perceived stress scale and multidimensional scale of perceived social support were used to measure their EE, stress and social support respectively. Carers experienced a high level of perceived stress and EE and poor social support. Perceived stress significantly increased EE (β=0.834), whereas social support did not significantly influence EE (β=-0.065; p>0.05). Perceived stress predicted 76 percent of the variance in EE (adjusted R²=0.761). The results emphasize the high level of stress and EE in carers of patients with FEP, which implies the need for appropriate psychosocial interventions to manage their stress. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. International Expert Review of Sr-Can: Safety Assessment Methodology - External review contribution in support of SSI's and SKI's review of SR-Can

    International Nuclear Information System (INIS)

    Sagar, Budhi; Egan, Michael; Roehlig, Klaus-Juergen; Chapman, Neil; Wilmot, Roger

    2008-03-01

    In 2006, SKB published a safety assessment (SR-Can) as part of its work to support a licence application for the construction of a final repository for spent nuclear fuel. The purposes of the SR-Can project were stated in the main project report to be: 1. To make a first assessment of the safety of potential KBS-3 repositories at Forsmark and Laxemar to dispose of canisters as specified in the application for the encapsulation plant. 2. To provide feedback to design development, to SKB's research and development (R and D) programme, to further site investigations and to future safety assessments. 3. To foster a dialogue with the authorities that oversee SKB's activities, i.e. the Swedish Nuclear Power Inspectorate, SKI, and the Swedish Radiation Protection Authority, SSI, regarding interpretation of applicable regulations, as a preparation for the SR-Site project. To help inform their review of SKB's proposed approach to development of the long-term safety case, the authorities appointed three international expert review teams to carry out a review of SKB's SR-Can safety assessment report. Comments from one of these teams - the Safety Assessment Methodology (SAM) review team - are presented in this document. The SAM review team's scope of work included an examination of SKB's documentation of the assessment ('Long-term safety for KBS-3 Repositories at Forsmark and Laxemar - a first evaluation' and several supporting reports) and hearings with SKB staff and contractors, held in March 2007. As directed by SKI and SSI, the SAM review team focused on methodological aspects and sought to determine whether SKB's proposed safety assessment methodology is likely to be suitable for use in the future SR-Site and to assess its consistency with the Swedish regulatory framework. No specific evaluation of long-term safety or site acceptability was undertaken by any of the review teams. SKI and SSI's Terms of Reference for the SAM review team requested that consideration be given

  1. Investigating Safety, Safeguards and Security (3S) Synergies to Support Infrastructure Development and Risk-Informed Methodologies for 3S by Design

    International Nuclear Information System (INIS)

    Suzuki, M.; Izumi, Y.; Kimoto, T.; Naoi, Y.; Inoue, T.; Hoffheins, B.

    2010-01-01

    In 2008, Japan and other G8 countries pledged to support the Safeguards, Safety, and Security (3S) Initiative to raise awareness of 3S worldwide and to assist countries in setting up nuclear energy infrastructures that are essential cornerstones of a successful nuclear energy program. The goals of the 3S initiative are to ensure that countries already using nuclear energy or those planning to use nuclear energy are supported by strong national programs in safety, security, and safeguards, not only for reliability and viability of the programs, but also to prove to the international audience that the programs are purely peaceful and that nuclear material is properly handled, accounted for, and protected. In support of this initiative, the Japan Atomic Energy Agency (JAEA) has been conducting detailed analyses of the R and D programs and cultures of each of the 'S' areas to identify overlaps where synergism and efficiencies might be realized, to determine where there are gaps in the development of a mature 3S culture, and to coordinate efforts with other Japanese and international organizations. As an initial outcome of this study, incoming JAEA employees are being introduced to 3S as part of their induction training and the idea of a President's Award program is being evaluated. Furthermore, some overlaps in 3S missions might be exploited to share facility instrumentation, as with Joint-Use-Equipment (JUE), in which cameras and radiation detectors are shared by the State and IAEA. Lessons learned in these activities can be applied to developing more efficient and effective 3S infrastructures for incorporation into Safeguards by Design methodologies. They will also be useful in supporting human resources and technology development projects associated with Japan's planned nuclear security center for Asia, which was announced during the 2010 Nuclear Security Summit. In this presentation, a risk-informed approach regarding integration of 3S will be introduced. An initial

  2. Does technique matter; a pilot study exploring weighting techniques for a multi-criteria decision support framework

    NARCIS (Netherlands)

    van Til, Janine Astrid; Groothuis-Oudshoorn, Catharina Gerarda Maria; Lieferink, Marijke; Dolan, James; Goetghebeur, Mireille

    2014-01-01

    Background There is an increased interest in the use of multi-criteria decision analysis (MCDA) to support regulatory and reimbursement decision making. The EVIDEM framework was developed to provide pragmatic multi-criteria decision support in health care, to estimate the value of healthcare

  3. Methodology. Volume 3

    Science.gov (United States)

    1997-02-01

    …and supporting technical architectures; evolution with changing needs and technology; maturation to achieve continuous improvement. Institutional … implementation issues: institutional arrangements; financial aspects; scheduling matters. Such factors must be addressed to complete the

  4. Health and safety matters! Associations between organizational practices and personal support workers' life and work stress in Ontario, Canada.

    Science.gov (United States)

    Zeytinoglu, Isik U; Denton, Margaret; Brookman, Catherine; Davies, Sharon; Sayin, Firat K

    2017-06-21

    The home and community care sector is one of the fastest growing sectors globally, most prominently in mature industrialized countries. Personal support workers (PSWs) are the largest occupational group in the sector. This paper focuses on the emotional health of PSWs working in the home and community care sector in Ontario, Canada. The purpose of this paper is to present evidence on the associations between PSWs' life and work stress and organizational practices of full-time and guaranteed hours, and PSWs' perceptions of support at work and preference for hours. Data come from our 2015 survey of 1543 PSWs. Dependent variables are life and work stress. Independent variables are: objective organizational practices of full-time and guaranteed hours, and subjective organizational practices of perceived support at work and preferred hours of work. Descriptive statistics, correlations and ordinary least squares regression analyses with collinearity tests are conducted. Organizational practices of employing PSWs in full-time or guaranteed hours are not associated with their life and work stress. However, those who perceive support from their organizations are also the ones reporting lower life and work stress. In addition, those PSWs perceiving support from their supervisor report lower work stress. PSWs would like to work in their preferred hours: those who prefer to work more hours report lower life and work stress, and conversely, those who prefer to work fewer hours report higher life and work stress. For PSWs in home and community care, perceived support from their organizations and supervisors, and employment in preferred hours, are important factors related to their life and work stress.

  5. Does It Matter Who Participates in Our Studies?: A Caution when Interpreting the Research on Positive Behavioral Support

    Science.gov (United States)

    Durand, V. Mark; Rost, Nichole

    2005-01-01

    Research on the treatment of challenging behaviors such as aggression, tantrums, and self-injury expanded significantly over the past two decades. However, despite the rather impressive number of studies, it is still uncertain whether positive behavioral support (PBS) is effective with everyone. To be able to tell family members and…

  6. Association of depressive symptomology and psychological trauma with diabetes control among older American Indian women: Does social support matter?

    Science.gov (United States)

    Goins, R Turner; Noonan, Carolyn; Gonzales, Kelly; Winchester, Blythe; Bradley, Vickie L

    2017-04-01

    Among older American Indian women with type 2 diabetes (T2DM), we examined the association between mental health and T2DM control and if social support modifies the association. Survey data were linked to T2DM medical record information. Mental health measures were the Center for Epidemiologic Studies - Depression Scale and the National Anxiety Disorders Screening Day instrument. T2DM control was all HbA1c values taken post mental health measures. There was not a significant association between depressive symptomatology and higher HbA1c, although increased depressive symptomatology was associated with higher HbA1c values among participants with low social support. There was a significant association between psychological trauma and higher HbA1c values 12 months [mean 7.5, 95% CI 7.0-8.0 for no trauma vs. mean 7.0, 95% CI 6.3-7.6 for trauma with no symptoms vs. mean 8.4, 95% CI 7.7-9.1 for trauma with ≥1 symptom(s)] and 6 months later [mean 7.2, 95% CI 6.7-7.7 for no trauma vs. mean HbA1c 6.8, 95% CI 6.2-7.4 for trauma with no symptoms vs. mean 8.4, 95% CI 7.6-9.2 for trauma with ≥1 symptom(s)]. High social support attenuated the association between psychological trauma and HbA1c values. T2DM programs may consider activities that would strengthen participants' social support and thereby build on an intrinsic community strength. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Does technique matter; a pilot study exploring weighting techniques for a multi-criteria decision support framework.

    Science.gov (United States)

    van Til, Janine; Groothuis-Oudshoorn, Catharina; Lieferink, Marijke; Dolan, James; Goetghebeur, Mireille

    2014-01-01

    There is an increased interest in the use of multi-criteria decision analysis (MCDA) to support regulatory and reimbursement decision making. The EVIDEM framework was developed to provide pragmatic multi-criteria decision support in health care, to estimate the value of healthcare interventions, and to aid in priority-setting. The objectives of this study were to test 1) the influence of different weighting techniques on the overall outcome of an MCDA exercise, 2) the discriminative power in weighting different criteria of such techniques, and 3) whether different techniques result in similar weights in weighting the criteria set proposed by the EVIDEM framework. A sample of 60 Dutch and Canadian students participated in the study. Each student used an online survey to provide weights for 14 criteria with two different techniques: a five-point rating scale and one of the following techniques selected randomly: ranking, point allocation, pairwise comparison and best worst scaling. The results of this study indicate that there is no effect of differences in weights on value estimates at the group level. On an individual level, considerable differences in criteria weights and rank order occur as a result of the weight elicitation method used, and the ability of different techniques to discriminate in criteria importance. Of the five techniques tested, the pair-wise comparison of criteria has the highest ability to discriminate in weights when fourteen criteria are compared. When weights are intended to support group decisions, the choice of elicitation technique has negligible impact on criteria weights and the overall value of an innovation. However, when weights are used to support individual decisions, the choice of elicitation technique influences outcome and studies that use dissimilar techniques cannot be easily compared. Weight elicitation through pairwise comparison of criteria is preferred when taking into account its superior ability to discriminate between
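
    For readers unfamiliar with how pairwise comparison yields criterion weights, the sketch below derives weights from a reciprocal comparison matrix using the geometric-mean (row-mean) method common in AHP-style elicitation. It is a generic illustration of the technique class discussed above, not the EVIDEM framework's own procedure, and the three-criterion matrix is invented.

    ```python
    # Generic sketch of weight elicitation by pairwise comparison (geometric-mean
    # method). The comparison matrix is hypothetical, not from the EVIDEM study.
    import numpy as np

    def weights_from_pairwise(matrix):
        """Normalize the geometric means of the rows of a reciprocal comparison matrix."""
        a = np.asarray(matrix, dtype=float)
        geo_means = a.prod(axis=1) ** (1.0 / a.shape[1])
        return geo_means / geo_means.sum()

    # Criterion A judged 3x as important as B and 5x as important as C, etc.
    comparisons = [
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ]
    print(weights_from_pairwise(comparisons).round(3))   # roughly [0.65, 0.23, 0.12]
    ```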

  8. Association of depressive symptomology and psychological trauma with diabetes control among older American Indian women: Does social support matter?

    Science.gov (United States)

    Noonan, Carolyn; Gonzales, Kelly; Winchester, Blythe; Bradley, Vickie L.

    2017-01-01

    Aims Among older American Indian women with type 2 diabetes (T2DM), we examined the association between mental health and T2DM control and if social support modifies the association. Methods Survey data were linked to T2DM medical record information. Mental health measures were the Center for Epidemiologic Studies – Depression Scale and the National Anxiety Disorders Screening Day instrument. T2DM control was all HbA1c values taken post mental health measures. Results There was not a significant association between depressive symptomatology and higher HbA1c, although increased depressive symptomatology was associated with higher HbA1c values among participants with low social support. There was a significant association between psychological trauma and higher HbA1c values 12 months [mean 7.5, 95% CI 7.0–8.0 for no trauma vs. mean 7.0, 95% CI 6.3–7.6 for trauma with no symptoms vs. mean 8.4, 95% CI 7.7–9.1 for trauma with ≥1 symptom(s)] and 6 months later [mean 7.2, 95% CI 6.7–7.7 for no trauma vs. mean HbA1c 6.8, 95% CI 6.2–7.4 for trauma with no symptoms vs. mean 8.4, 95% CI 7.6–9.2 for trauma with ≥1 symptom(s)]. High social support attenuated the association between psychological trauma and HbA1c values. Conclusions T2DM programs may consider activities that would strengthen participants’ social support and thereby build on an intrinsic community strength. PMID:28161383
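
    The effect modification reported above (social support attenuating the trauma-HbA1c association) is the kind of question usually answered with an interaction term in a regression model. The sketch below shows that general pattern with statsmodels on synthetic data; the variable names and data are hypothetical, and the model is far simpler than the study's analysis of repeated HbA1c measures.

    ```python
    # Hedged sketch of a moderation (interaction) model, not the study's analysis.
    # Synthetic data: does social support modify the trauma -> HbA1c association?
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    trauma = rng.integers(0, 2, n)      # 0 = no trauma, 1 = trauma
    support = rng.integers(0, 2, n)     # 0 = low, 1 = high social support
    # Trauma raises HbA1c only when support is low (a built-in interaction).
    hba1c = 7.0 + 1.2 * trauma * (1 - support) + rng.normal(0, 0.5, n)

    df = pd.DataFrame({"hba1c": hba1c, "trauma": trauma, "support": support})
    model = smf.ols("hba1c ~ trauma * support", data=df).fit()
    print(model.params)   # the trauma:support coefficient captures the moderation
    ```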

  9. FY17 Status Report on Testing Supporting the Inclusion of Grade 91 Steel as an Acceptable Material for Application of the EPP Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Messner, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, Sam [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Yanli [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    This report summarizes the experiments performed in FY17 on Gr. 91 steels. The testing of Gr. 91 has technical significance because, currently, it is the only approved material for Class A construction that exhibits strong cyclic softening. Specific FY17 testing includes the following activities for Gr. 91 steel. First, two types of key feature testing have been initiated, including two-bar thermal ratcheting and Simplified Model Testing (SMT). The goal is to qualify the Elastic-Perfectly Plastic (EPP) design methodologies and to support incorporation of these rules for Gr. 91 into the ASME Division 5 Code. The preliminary SMT test results show that Gr. 91 suffers the most damage when tested in compression hold mode under the SMT creep-fatigue testing condition. Two-bar thermal ratcheting test results over a temperature range of 350 to 650 °C were compared with the EPP strain limits code case evaluation, and the results show that the EPP strain limits code case is conservative. The material information obtained from these key feature tests can also be used to verify the material model. Second, to provide experimental data in support of the viscoplastic material model development at Argonne National Laboratory, selective tests were performed to evaluate the effect of cyclic softening on strain rate sensitivity and creep rates. The results show that prior cyclic loading history decreases the strain rate sensitivity and increases creep rates. In addition, isothermal cyclic stress-strain curves were generated at six different temperatures, and nonisothermal thermomechanical testing was also performed to provide data to calibrate the viscoplastic material model.

  10. Postcards from the edge: Trash-2-Cash communication tools used to support inter-disciplinary work towards a design driven material innovation (DDMI) methodology

    Science.gov (United States)

    Earley, R.; Hornbuckle, R.

    2017-10-01

    In this paper, postcards from the EU-funded Horizon 2020 Trash-2-Cash (2015-2018) project - completed by workshop participants - are presented in three tables with a focus on how they contributed to the building of communication channels, shared understanding and methods in this inter-disciplinary consortium work. The Trash-2-Cash project aims to support better waste utilisation, improve material efficiency and contribute to the reduction of landfill area needs, whilst also producing high-value commercial products. Novel materials will drive the generation of new textile fibres that will utilize paper and textile fibre waste, originating from continuously increasing textile consumption. The inter-disciplinarity of the participants is key to achieving the project aims, but communication between sectors is challenging due to diverse expertise and levels of experience; language and cultural differences can also be barriers to collaboration. Designing easy, accessible, even fun communication tools is one of the ways to help build relationships. The cards reviewed were used in Prato (November 2015), Helsinki (February 2016) and London (November 2016). This paper concludes with insights for the ongoing development of the project communications work towards the Design Driven Material Innovation (DDMI) methodology, due to be presented at the end of the project in 2018.

  11. Social Network Analysis as a Methodological Approach to Explore Health Systems: A Case Study Exploring Support among Senior Managers/Executives in a Hospital Network.

    Science.gov (United States)

    De Brún, Aoife; McAuliffe, Eilish

    2018-03-13

    Health systems research recognizes the complexity of healthcare, and the interacting and interdependent nature of components of a health system. To better understand such systems, innovative methods are required to depict and analyze their structures. This paper describes social network analysis as a methodology to depict, diagnose, and evaluate health systems and networks therein. Social network analysis is a set of techniques to map, measure, and analyze social relationships between people, teams, and organizations. Through use of a case study exploring support relationships among senior managers in a newly established hospital group, this paper illustrates some of the commonly used network- and node-level metrics in social network analysis, and demonstrates the value of these maps and metrics to understand systems. Network analysis offers a valuable approach to health systems and services researchers as it offers a means to depict activity relevant to network questions of interest, to identify opinion leaders, influencers, clusters in the network, and those individuals serving as bridgers across clusters. The strengths and limitations inherent in the method are discussed, and the applications of social network analysis in health services research are explored.

  12. Social Network Analysis as a Methodological Approach to Explore Health Systems: A Case Study Exploring Support among Senior Managers/Executives in a Hospital Network

    Directory of Open Access Journals (Sweden)

    Aoife De Brún

    2018-03-01

    Full Text Available Health systems research recognizes the complexity of healthcare, and the interacting and interdependent nature of components of a health system. To better understand such systems, innovative methods are required to depict and analyze their structures. This paper describes social network analysis as a methodology to depict, diagnose, and evaluate health systems and networks therein. Social network analysis is a set of techniques to map, measure, and analyze social relationships between people, teams, and organizations. Through use of a case study exploring support relationships among senior managers in a newly established hospital group, this paper illustrates some of the commonly used network- and node-level metrics in social network analysis, and demonstrates the value of these maps and metrics to understand systems. Network analysis offers a valuable approach to health systems and services researchers as it offers a means to depict activity relevant to network questions of interest, to identify opinion leaders, influencers, clusters in the network, and those individuals serving as bridgers across clusters. The strengths and limitations inherent in the method are discussed, and the applications of social network analysis in health services research are explored.
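
    A minimal example of the network- and node-level metrics mentioned above (degree, betweenness, and cluster detection) is sketched below with NetworkX on an invented support network; the managers and ties are hypothetical, not the hospital-group data from the case study.

    ```python
    # Illustrative sketch of common SNA metrics on a made-up support network;
    # the nodes and ties are hypothetical, not the case-study data.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    edges = [("CEO", "COO"), ("CEO", "DON"), ("COO", "OpsMgr"),
             ("DON", "NurseLead"), ("OpsMgr", "NurseLead"), ("CEO", "CFO")]
    g = nx.Graph(edges)

    degree = nx.degree_centrality(g)            # how many direct support ties each manager has
    betweenness = nx.betweenness_centrality(g)  # who bridges otherwise separate parts of the network
    clusters = list(greedy_modularity_communities(g))

    print(max(degree, key=degree.get))            # most connected manager
    print(max(betweenness, key=betweenness.get))  # key broker in the network
    print(clusters)                               # detected clusters of managers
    ```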

  13. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  14. Key principles for a national clinical decision support knowledge sharing framework: synthesis of insights from leading subject matter experts.

    Science.gov (United States)

    Kawamoto, Kensaku; Hongsermeier, Tonya; Wright, Adam; Lewis, Janet; Bell, Douglas S; Middleton, Blackford

    2013-01-01

    To identify key principles for establishing a national clinical decision support (CDS) knowledge sharing framework. As part of an initiative by the US Office of the National Coordinator for Health IT (ONC) to establish a framework for national CDS knowledge sharing, key stakeholders were identified. Stakeholders' viewpoints were obtained through surveys and in-depth interviews, and findings and relevant insights were summarized. Based on these insights, key principles were formulated for establishing a national CDS knowledge sharing framework. Nineteen key stakeholders were recruited, including six executives from electronic health record system vendors, seven executives from knowledge content producers, three executives from healthcare provider organizations, and three additional experts in clinical informatics. Based on these stakeholders' insights, five key principles were identified for effectively sharing CDS knowledge nationally. These principles are (1) prioritize and support the creation and maintenance of a national CDS knowledge sharing framework; (2) facilitate the development of high-value content and tooling, preferably in an open-source manner; (3) accelerate the development or licensing of required, pragmatic standards; (4) acknowledge and address medicolegal liability concerns; and (5) establish a self-sustaining business model. Based on the principles identified, a roadmap for national CDS knowledge sharing was developed through the ONC's Advancing CDS initiative. The study findings may serve as a useful guide for ongoing activities by the ONC and others to establish a national framework for sharing CDS knowledge and improving clinical care.

  15. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  16. Resource management for sustainable development : The application of a methodology to support resource management for the adequate application of Construction Systems to enhance sustainability in the lower income dwelling construction industry in Costa Rica

    NARCIS (Netherlands)

    Egmond - de Wilde De Ligny, van E.L.C.; Erkelens, P.A.; Jonge, de S.; Vliet, van A.A.M.

    2000-01-01

    This paper describes the results of the application of a methodology to support resource management for the enhancement of sustainability in the construction industry. Particular emphasis is given to the sustainability of manufacturing and application of construction systems for low income housing

  17. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; Understanding Matter's Electromagnet

  18. Making methodology a matter of process ontology

    DEFF Research Database (Denmark)

    Revsbæk, Line

    2016-01-01

    This paper presents a practice of doing qualitative interview analysis informed by the process ontology of G. H. Mead’s Philosophy of the Present (1932). It presents two cases of analyzing in the present while listening to recorded interview material, eliciting the researcher’s case study and otherwise related experiences and creating case narratives inclusive of the researcher’s reflexive voice. The paper presents an auto-ethnographic approach to data analysis based on process theory ontology.

  19. Proposal for structure of the national system for support of sustainable tourism and a methodology of rating of selected tourist services

    Directory of Open Access Journals (Sweden)

    Michal Burian

    2005-01-01

    Full Text Available This thesis focuses on the preparation of the basic elements of a National System for Support of Sustainable Tourism in the Czech Republic and on the development of a methodology for ranking selected tourist services from the sustainability point of view. Its result is intended to serve as a basis for decisions potentially taken by the Czech government in the near future. The first part analyses and summarises existing approaches to sustainable tourism worldwide, together with the potential benefits and different impacts of tourism, in order to specify possible measures for avoiding unwanted impacts later on. The need for a change towards sustainability in tourism is based on an analysis of different policies and documents and a summary of the national and international duties of the Czech government. There is also a list of the best available experience in how to promote, support and encourage the development of sustainable tourism, based on an analysis of World Tourism Organisation studies and recommendations, existing EMS systems such as ISO, the EU ecolabel, and the methods, approaches, criteria and indicators used by different labels. The next part defines the elements necessary for successful functioning of the system in the Czech Republic, based on existing and working governmental and non-governmental structures, not forgetting such important activities as information, education, marketing and motivation, and sets out the expected responsibilities and roles of the different partners potentially participating in the system's implementation. Unfortunately, as of the end of October 2004 not a single tourist business in Czechia had been certified according to ISO, EMAS or the EU flower. The main reasons are price and the lack of benefits and motivation for businesses. Therefore a self-assessment questionnaire for small-scale accommodation was developed, designed to be easy to understand and easy to answer. This option has been missing in the country and is understood as a preliminary step towards more sophisticated methods of measurement. The main

  20. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and outlining some important approaches and methods. The development has mainly been driven by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  1. Baryonic Dark Matter

    OpenAIRE

    De Paolis, F.; Jetzer, Ph.; Ingrosso, G.; Roncadelli, M.

    1997-01-01

    Reasons supporting the idea that most of the dark matter in galaxies and clusters of galaxies is baryonic are discussed. Moreover, it is argued that most of the dark matter in galactic halos should be in the form of MACHOs and cold molecular clouds.

  2. Remediation of lead-contaminated sediment by biochar-supported nano-chlorapatite: Accompanied with the change of available phosphorus and organic matters.

    Science.gov (United States)

    Huang, Danlian; Deng, Rui; Wan, Jia; Zeng, Guangming; Xue, Wenjing; Wen, Xiaofeng; Zhou, Chengyun; Hu, Liang; Liu, Xigui; Xu, Piao; Guo, Xueying; Ren, Xiaoya

    2018-04-15

    Some rivers in China have been seriously contaminated due to the discharge of lead (Pb) smelting wastewater. In this study, biochar-supported nano-chlorapatite (BC-nClAP) was synthesized to immobilize Pb in contaminated sediment. The remediation effect of BC-nClAP on Pb-contaminated sediment was evaluated through batch experiments, and the materials were characterized by X-ray diffraction, scanning electron microscopy, Brunauer-Emmett-Teller analysis and electronic differential system. It was found that BC-nClAP can effectively transform Pb from the labile fraction into the stable fraction, with a maximum transformation efficiency increasing to 94.1% after 30 days of treatment, and the stabilization efficiency in the toxicity characteristic leaching procedure reached 100% after only 16 days of treatment. The content of available phosphorus (AP) in the sediments treated by BC-nClAP was much lower than in those treated by nClAP, which indicated a lower risk of eutrophication and suggested that a dissolution-precipitation mechanism was involved in Pb immobilization. BC-nClAP presented the best Pb immobilization efficiency and the content of organic matter in BC-nClAP-treated samples increased the most; thus the organic matter (OM) might play an important role during Pb immobilization. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Technology, safety and costs of decommissioning a reference boiling water reactor power station: Technical support for decommissioning matters related to preparation of the final decommissioning rule

    International Nuclear Information System (INIS)

    Konzek, G.J.; Smith, R.I.

    1988-07-01

    Preparation of the final Decommissioning Rule by the Nuclear Regulatory Commission (NRC) staff has been assisted by Pacific Northwest Laboratory (PNL) staff familiar with decommissioning matters. These efforts have included updating previous cost estimates developed during the series of studies of conceptually decommissioning reference licensed nuclear facilities for inclusion in the Final Generic Environmental Impact Statement (FGEIS) on decommissioning; documenting the cost updates; evaluating the cost and dose impacts of post-TMI-2 backfits on decommissioning; developing a revised scaling formula for estimating decommissioning costs for reactor plants different in size from the reference boiling water reactor (BWR) described in the earlier study; and defining a formula for adjusting current cost estimates to reflect future escalation in labor, materials, and waste disposal costs. This report presents the results of recent PNL studies to provide supporting information in three areas concerning decommissioning of the reference BWR: updating the previous cost estimates to January 1986 dollars; assessing the cost and dose impacts of post-TMI-2 backfits; and developing a scaling formula for plants different in size than the reference plant and an escalation formula for adjusting current cost estimates for future escalation

  4. Technology, safety and costs of decommissioning a reference pressurized water reactor power station: Technical support for decommissioning matters related to preparation of the final decommissioning rule

    International Nuclear Information System (INIS)

    Konzek, G.J.; Smith, R.I.

    1988-07-01

    Preparation of the final Decommissioning Rule by the Nuclear Regulatory Commission (NRC) staff has been assisted by Pacific Northwest Laboratory (PNL) staff familiar with decommissioning matters. These efforts have included updating previous cost estimates developed during the series of studies on conceptually decommissioning reference licensed nuclear facilities for inclusion in the Final Generic Environmental Impact Statement (FGEIS) on decommissioning; documenting the cost updates; evaluating the cost and dose impacts of post-TMI-2 backfits on decommissioning; developing a revised scaling formula for estimating decommissioning costs for reactor plants different in size from the reference pressurized water reactor (PWR) described in the earlier study; defining a formula for adjusting current cost estimates to reflect future escalation in labor, materials, and waste disposal costs; and completing a study of recent PWR steam generator replacements to determine realistic estimates for time, costs and doses associated with steam generator removal during decommissioning. This report presents the results of recent PNL studies to provide supporting information in four areas concerning decommissioning of the reference PWR: updating the previous cost estimates to January 1986 dollars; assessing the cost and dose impacts of post-TMI-2 backfits; assessing the cost and dose impacts of recent steam generator replacements; and developing a scaling formula for plants different in size than the reference plant and an escalation formula for adjusting current cost estimates for future escalation
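
    The two abstracts above refer to a size-scaling formula and an escalation formula without reproducing them; the expressions below are only a generic illustration of the usual form such adjustments take (a power-law scaling on plant rating and component-wise escalation of labor, materials and waste-disposal costs), not the formulas derived in the PNL reports.

    ```latex
    % Generic illustration only; not the PNL scaling or escalation formulas.
    % Size scaling from a reference plant of rating P_ref to a plant of rating P,
    % and escalation of a base-year estimate C_0 split into labor (L), materials (M)
    % and waste disposal (W), each with its own annual escalation rate e_i over t years:
    \[
      C(P) \;=\; C_{\mathrm{ref}} \left( \frac{P}{P_{\mathrm{ref}}} \right)^{n},
      \qquad
      C_{t} \;=\; \sum_{i \in \{L,\,M,\,W\}} f_{i}\, C_{0}\, (1 + e_{i})^{t},
    \]
    % where n is an empirically fitted exponent and the f_i are base-year cost
    % fractions with f_L + f_M + f_W = 1.
    ```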

  5. The potential of Sentinel 2 and PROBA-V images for supporting early warnings of particulate matter pollution episodes in Ploiesti urban area

    Science.gov (United States)

    Dunea, Daniel; Iordache, Stefania; Pohoata, Alin; Lungu, Emil; Ianache, Cornel; Ianache, Radu

    2016-04-01

    overlapped on the GIS thematic layers of the Ploiesti city area to develop the integrated system of PM movement prediction. All thematic layers were referenced to the same coordinate system using the local 1970 stereographic projection and the Dealul Piscului 1970 geographic coordinate system. The meteorological inputs used in the experiments included long-term time series recorded at a local station. We combined these multiple datasets to find potential correlations that can be used for improving the prediction of particulate matter pollution episodes in the Ploiesti urban area with the latest state-of-the-art satellite imagery support. This study received funding from the European Economic Area Financial Mechanism 2009 - 2014 under the project ROKIDAIR "Towards a better protection of children against air pollution threats in the urban areas of Romania" contract no. 20SEE/30.06.2014 (http://www.rokidair.ro/en).
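
    As a small illustration of the kind of dataset combination described above, the sketch below aligns a satellite-derived aerosol time series with ground PM measurements on a common daily index and computes their correlation with pandas; the file names, column names and variables are hypothetical and do not correspond to the ROKIDAIR datasets.

    ```python
    # Hedged sketch: align a satellite-derived series with ground PM data and
    # screen their correlation. File and column names are hypothetical.
    import pandas as pd

    satellite = pd.read_csv("sentinel2_aerosol_ploiesti.csv", parse_dates=["date"])
    ground = pd.read_csv("ground_pm10_ploiesti.csv", parse_dates=["date"])

    daily = (
        satellite.set_index("date")["aerosol_index"].resample("D").mean().to_frame()
        .join(ground.set_index("date")["pm10"].resample("D").mean(), how="inner")
    )
    print(daily.corr(method="pearson"))   # candidate-predictor screening, as in the abstract
    ```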

  6. Dark Matter

    Directory of Open Access Journals (Sweden)

    Einasto J.

    2011-06-01

    Full Text Available I give a review of the development of the concept of dark matter. The dark matter story passed through several stages from a minor observational puzzle to a major challenge for the theory of elementary particles. Modern data suggest that dark matter is the dominant matter component in the Universe and that it consists of some unknown non-baryonic particles; the properties of dark matter particles therefore determine the structure of the cosmic web.

  7. Detecting dark matter

    International Nuclear Information System (INIS)

    Dixon, Roger L.

    2000-01-01

    Dark matter is one of the most pressing problems in modern cosmology and particle physics research. This talk will motivate the existence of dark matter by reviewing the main experimental evidence for it: the rotation curves of galaxies and the motions of galaxies about one another. It will then go on to review the corroborating theoretical motivations before combining all the supporting evidence to explore some of the possibilities for dark matter along with its expected properties. This will lay the groundwork for dark matter detection. A number of differing techniques are being developed and used to detect dark matter. These will be briefly discussed before the focus turns to cryogenic detection techniques. Finally, some preliminary results and expectations will be given for the Cryogenic Dark Matter Search (CDMS) experiment

  8. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  9. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  10. Work in support of biosphere assessments for solid radioactive waste disposal. 1. performance assessments, requirements and methodology; criteria for radiological environmental protection

    International Nuclear Information System (INIS)

    Egan, M.J.; Loose, M.; Smith, G.M.; Watkins, B.M.

    2001-10-01

    The first part of this report is intended to assess how the recent Swedish regulatory developments and resulting criteria impose requirements on what should be included in a performance assessment (PA) for the SFR low and medium level waste repository and for a potential deep repository for high level waste. The second part of the report has been prepared by QuantiSci as an input to the development of SSI's PA review methodology. The aim of the third part is to provide research input to the development of radiological protection framework for the environment, for use in Sweden. This is achieved through a review of various approaches used in other fields

  11. Supplement to a Methodology for Succession Planning for Technical Experts

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cain, Ronald A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Agreda, Carla L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-01

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, and the methodology is tested through interviews with selected subject matter experts.

  12. Speech Matters

    DEFF Research Database (Denmark)

    Hasse Jørgensen, Stina

    2011-01-01

    About Speech Matters - Katarina Gregos, the Greek curator's exhibition at the Danish Pavilion, the Venice Biennale 2011.

  13. Memory Matters

    Science.gov (United States)

    KidsHealth / For Kids / Memory Matters: What's in ... of your complex and multitalented brain. What Is Memory? When an event happens, when you learn something, ...

  14. Dark Matter

    Indian Academy of Sciences (India)

    Dark Matter - 2. Dark Matter in the Universe. Bikram Phookun and Biman Nath. In Part 1 of this article (What You See Ain't What You Got, Resonance, Vol. 4, No. 9, 1999) we learnt that there are compelling evidences from the dynamics of spiral galaxies, like our own, that there must be non-luminous matter in them. In this ...

  15. Work in support of biosphere assessments for solid radioactive waste disposal. 1. performance assessments, requirements and methodology; criteria for radiological environmental protection

    Energy Technology Data Exchange (ETDEWEB)

    Egan, M.J.; Loose, M.; Smith, G.M.; Watkins, B.M. [QuantiSci Ltd., Henley-on-Thames (United Kingdom)

    2001-10-01

    The first part of this report is intended to assess how the recent Swedish regulatory developments and resulting criteria impose requirements on what should be included in a performance assessment (PA) for the SFR low and medium level waste repository and for a potential deep repository for high level waste. The second part of the report has been prepared by QuantiSci as an input to the development of SSI's PA review methodology. The aim of the third part is to provide research input to the development of radiological protection framework for the environment, for use in Sweden. This is achieved through a review of various approaches used in other fields.

  16. Assessment of methodologies for radioactive waste management

    International Nuclear Information System (INIS)

    Hoos, I.R.

    1978-01-01

    No quantitative methodology is adequate to encompass and assess all the risks, no risk/benefit calculation is fine-tuned enough to supply decision-makers with the full range and all of the dimensions. Quality assurance cannot be conceived in terms of systems design alone, but must be maintained vigilantly and with integrity throughout the process. The responsibility of the NRC is fairly well established with respect to overall reactor safety. With respect to the management of radioactive wastes, its mission is not yet so clearly delineated. Herein lies a challenge and an opportunity. Where the known quantitative methodologies are restrictive and likely to have negative feedback effect on authority and public support, the broader lens and the bolder thrust are called for. The cozy cocoon of figures ultimately protects no one. The Commission, having acknowledged that the management of radioactive wastes is not merely a technological matter can now take the socially responsible position of exploring as fully and confronting as candidly as possible the total range of dimensions involved. Paradoxically, it is Charles J. Hitch, intellectual progenitor of the methodology, who observes that we may be missing the meaning of his message by relying too heavily on quantitative analysis and thus defining our task too narrowly. We live in a closed system, in which science and technology, politics and economics, and, above all, social and human elements interact, sometimes to create the problems, sometimes to articulate the questions, and sometimes to find viable solutions

  17. Macro Dark Matter

    CERN Document Server

    Jacobs, David M; Lynn, Bryan W.

    2015-01-01

    Dark matter is a vital component of the current best model of our universe, $\Lambda$CDM. There are leading candidates for what the dark matter could be (e.g. weakly-interacting massive particles, or axions), but no compelling observational or experimental evidence exists to support these particular candidates, nor any beyond-the-Standard-Model physics that might produce such candidates. This suggests that other dark matter candidates, including ones that might arise in the Standard Model, should receive increased attention. Here we consider a general class of dark matter candidates with characteristic masses and interaction cross-sections characterized in units of grams and cm$^2$, respectively -- we therefore dub these macroscopic objects as Macros. Such dark matter candidates could potentially be assembled out of Standard Model particles (quarks and leptons) in the early universe. A combination of earth-based, astrophysical, and cosmological observations constrain a portion of the Macro parameter space; ho...

  18. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  19. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how such creative approaches may support a respectful renewal of phenomenological research traditions in nursing research.

  20. Monetary valuation of environmental goods, services and impacts in support of the decision: methodological novelties. Seminar proceedings of December 19, 2013

    International Nuclear Information System (INIS)

    Bonnet, Xavier; Vanoli, Andre; Nauroy, Frederic; Devaux, Jeremy; Christov, Strahil; ); Simon, Olivier; Bortzmeyer, Martin; Vergez, Antonin; Lagarenne, Christine; Ami, Dominique; Aprahamian, Frederic; Chanel, Olivier; Luchini, Stephane; Baumstarck, Luc; Auverlot, Dominique; Ducos, Geraldine; Rafenberg, Christophe; Levrel, Harold; Ben Maid, Atika; Darses, Ophelie

    2014-10-01

    Within the Department of the General Commissioner for Sustainable Development, the Division for Economics, Assessment and Integration of Sustainable Development is in charge of developing and promoting the economic valuation of policies, regulations, environmental goods and services, related to biodiversity, natural assets and environmental amenities. On December 19, 2013, this department held the fourth annual seminar on monetary valuation of environmental goods, services and impacts. The three first editions were respectively devoted to economic valuation methods of environmental goods and services, their implementation and the use of monetary values resulting from these methods. The 2013 seminar addressed methodological innovations and the way they contribute to decision in private sector and in policy-making process, in domains such as environmental debt, circular economy or health impact of environment. Those conferences are aimed at experts and practitioners of monetary valuation techniques as well as at users of the values produced. They provide a place to gather and facilitate dialogue between representatives from universities, government agencies and private sector involved in these issues. (authors)

  1. Methodology of the design of an integrated telecommunications and computer network in a control information system for artillery battalion fire support

    Directory of Open Access Journals (Sweden)

    Slobodan M. Miletić

    2012-04-01

    Full Text Available A Command Information System (CIS) in a broader sense can be defined as a set of hardware and software solutions by which one achieves real-time integration of organizational structures, doctrine, technical and technological systems and facilities, and information flows and processes for efficient and rational decision-making and functioning. The timely distribution and quality of information directly affect the implementation of the decision-making process and the criteria for evaluating the effectiveness of the system, in which the most important role is played by an integrated telecommunications and computer network (ITCN) dimensioned to the spatial distribution of the tactical combat units it connects. The aim is to establish a design methodology for the ITCN, which requires analysis to extract all the elements necessary for modeling; these are mapped to the elements of the network infrastructure and then analyzed from the perspective of telecommunication standards and the parameters of the layers of the OSI network model. A relevant way to verify the designed ITCN model is the development of a simulation model from which adequate results can be obtained. Conclusions on compliance with tactical combat and tactical communication requirements are drawn on the basis of these results.
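 
    As a back-of-the-envelope complement to the simulation model mentioned in the record above, link utilisation and queueing delay are often checked first with a simple M/M/1 approximation. The sketch below is a generic illustration only; the traffic and link figures are invented placeholders, not values from the cited paper.

      # Hedged sketch: M/M/1 utilisation and mean delay for a single tactical link.

      def mm1_link(arrival_msgs_per_s, msg_bits, link_bps):
          """Return (utilisation, mean time in system in seconds) for an M/M/1 link."""
          service_rate = link_bps / msg_bits           # messages per second the link can serve
          rho = arrival_msgs_per_s / service_rate
          if rho >= 1.0:
              return rho, float("inf")                 # saturated link
          return rho, 1.0 / (service_rate - arrival_msgs_per_s)

      rho, delay = mm1_link(arrival_msgs_per_s=8.0, msg_bits=2400.0, link_bps=32000.0)
      print(f"utilisation = {rho:.2f}, mean delay = {1000.0 * delay:.0f} ms")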

  2. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, were published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained
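 
    The core of the MIRD schema summarised above is a sum, over all source regions, of the cumulated activity multiplied by the corresponding S value. The sketch below shows only that bookkeeping; the cumulated activities and S values are invented placeholders, not tabulated MIRD data.

      # Hedged sketch of the basic MIRD dose equation:
      #   D(target) = sum over sources of A_tilde(source) * S(target <- source)

      def absorbed_dose(cumulated_activity, s_values):
          """Mean absorbed dose to one target organ, in mGy."""
          return sum(cumulated_activity[src] * s_values[src] for src in cumulated_activity)

      cumulated_activity = {"liver": 5.0e5, "kidneys": 2.0e5}    # MBq*s, hypothetical
      s_to_liver = {"liver": 3.0e-5, "kidneys": 4.0e-6}          # mGy per MBq*s, hypothetical

      print(f"dose to liver: {absorbed_dose(cumulated_activity, s_to_liver):.1f} mGy")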

  3. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably, we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  4. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably, we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  5. Calculation of relative tube/tube support plate displacements in steam generators under accident condition loads using non-linear dynamic analysis methodologies

    International Nuclear Information System (INIS)

    Smith, R.E.; Waisman, R.; Hu, M.H.; Frick, T.M.

    1995-01-01

    A non-linear analysis has been performed to determine relative motions between tubes and tube support plates (TSP) during a steam line break (SLB) event for steam generators. The SLB event results in blowdown of steam and water out of the steam generator. The fluid blowdown generates pressure drops across the TSPs, resulting in out-of-plane motion. The SLB induced pressure loads are calculated with a computer program that uses a drift-flux modeling of the two-phase flow. In order to determine the relative tube/TSP motions, a nonlinear dynamic time-history analysis is performed using a structural model that considers all of the significant component members relative to the tube support system. The dynamic response of the structure to the pressure loads is calculated using a special purpose computer program. This program links the various substructures at common degrees of freedom into a combined mass and stiffness matrix. The program accounts for structural non-linearities, including potential tube and TSP interaction at any given tube position. The program also accounts for structural damping as part of the dynamic response. Incorporating all of the above effects, the equations of motion are solved to give TSP displacements at the reduced set of DOF. Using the displacement results from the dynamic analysis, plate stresses are then calculated using the detailed component models. Displacements from the dynamic analysis are imposed as boundary conditions at the DOF locations, and the finite element program then solves for the overall distorted geometry. Calculations are also performed to assure that assumptions regarding elastic response of the various structural members and support points are valid
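 
    The record above describes a special-purpose code; the sketch below is not that program, only a generic illustration of a time-history integration with a clearance (gap) nonlinearity between a tube and a support plate. Masses, stiffnesses, damping, clearance and the pressure pulse are invented placeholders.

      # Hedged sketch: explicit time integration of a 2-DOF tube / tube-support-plate
      # model in which a contact force appears only when the relative displacement
      # exceeds the clearance. All parameter values are hypothetical.
      import numpy as np

      m_t, m_p = 0.5, 50.0           # tube and plate masses, kg
      k_t, k_p = 2.0e4, 5.0e6        # tube and plate support stiffnesses, N/m
      c_t, c_p = 5.0, 500.0          # damping coefficients, N*s/m
      k_gap, gap = 1.0e7, 0.4e-3     # contact stiffness, N/m, and clearance, m

      def pressure_load(t):
          """Idealized SLB pressure pulse acting on the plate, N (placeholder shape)."""
          return 2.0e4 * np.exp(-t / 0.05)

      dt, n = 1.0e-5, 50000
      x = np.zeros((n, 2))           # columns: tube, plate displacement, m
      v = np.zeros(2)

      for i in range(1, n):
          xt, xp = x[i - 1]
          rel = xp - xt
          f_contact = k_gap * (abs(rel) - gap) * np.sign(rel) if abs(rel) > gap else 0.0
          a_t = (-k_t * xt - c_t * v[0] + f_contact) / m_t
          a_p = (-k_p * xp - c_p * v[1] + pressure_load(i * dt) - f_contact) / m_p
          v = v + np.array([a_t, a_p]) * dt        # semi-implicit Euler step
          x[i] = x[i - 1] + v * dt

      print("peak relative tube/TSP displacement: %.2f mm" % (1e3 * np.abs(x[:, 1] - x[:, 0]).max()))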

  6. D matter

    International Nuclear Information System (INIS)

    Shiu, Gary; Wang Liantao

    2004-01-01

    We study the properties and phenomenology of particlelike states originating from D branes whose spatial dimensions are all compactified. They are nonperturbative states in string theory and we refer to them as D matter. In contrast to other nonperturbative objects such as 't Hooft-Polyakov monopoles, D-matter states could have perturbative couplings among themselves and with ordinary matter. The lightest D particle (LDP) could be stable because it is the lightest state carrying certain (integer or discrete) quantum numbers. Depending on the string scale, they could be cold dark matter candidates with properties similar to that of WIMPs or wimpzillas. The spectrum of excited states of D matter exhibits an interesting pattern which could be distinguished from that of Kaluza-Klein modes, winding states, and string resonances. We speculate about possible signatures of D matter from ultrahigh energy cosmic rays and colliders

  7. Sustainable and economic feasibility of castor bean oil production: a methodology to support investments decision taking; Viabilidade economica e sustentabilidade da producao de oleo de mamona: uma metodologia de suporte a decisao de investimentos

    Energy Technology Data Exchange (ETDEWEB)

    Maia, Glawther Lima [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Mestrado em Logistica e Pesquisa Operacional; Arruda, Joao Bosco Furtado [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil)

    2008-07-01

    The castor bean agribusiness is supported by a complex supply chain that involves several inputs, main products and other derived products. So, the control of production costs is a basic and extremely important activity for the optimization of management processes. Using literature research, case study and assessment of scenarios, the present work was developed with the objective of conceiving a methodology, based on theoretical and economical aspects, the formation of costs and the revenue forecasting, which is applied with the aid of a computational program for supporting decision taking in the castor bean small producers agribusiness. The results obtained in the case study show, for example, that about 40% of the projects are considered accepted under the criterion of recovering the invested capital in less than 2 years. Also, it reveals that 75% of the projects contribute for the increase of the producer income and about 63% of the projects provide enough profit to add value to the producer's properties. Finally, it is clear that the proposed methodology makes easier a better agricultural planning, allowing a better utilization of the raw material and it facilitates the analysis of diversifying other derived products and revenue sources in the castor bean chain. (author)

  8. Sustainable and economic feasibility of castor bean oil production: a methodology to support investments decision taking; Viabilidade economica e sustentabilidade da producao de oleo de mamona: uma metodologia de suporte a decisao de investimentos

    Energy Technology Data Exchange (ETDEWEB)

    Maia, Glawther Lima [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Mestrado em Logistica e Pesquisa Operacional; Arruda, Joao Bosco Furtado [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil)

    2008-07-01

    The castor bean agribusiness is supported by a complex supply chain that involves several inputs, main products and other derived products. So, the control of production costs is a basic and extremely important activity for the optimization of management processes. Using literature research, case study and assessment of scenarios, the present work was developed with the objective of conceiving a methodology, based on theoretical and economical aspects, the formation of costs and the revenue forecasting, which is applied with the aid of a computational program for supporting decision taking in the castor bean small producers agribusiness. The results obtained in the case study show, for example, that about 40% of the projects are considered accepted under the criterion of recovering the invested capital in less than 2 years. Also, it reveals that 75% of the projects contribute for the increase of the producer income and about 63% of the projects provide enough profit to add value to the producer's properties. Finally, it is clear that the proposed methodology makes easier a better agricultural planning, allowing a better utilization of the raw material and it facilitates the analysis of diversifying other derived products and revenue sources in the castor bean chain. (author)
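 
    The acceptance criteria quoted above (recovery of the invested capital in less than two years, added value for the producer) can be illustrated with a simple payback and net-present-value calculation. The sketch below is only a schematic illustration; the cash flows and discount rate are invented placeholders, not values from the authors' program.

      # Hedged sketch of payback time and NPV for a small producer's project.

      def payback_period(investment, annual_cash_flows):
          """Years until cumulative cash flow recovers the investment (None if never)."""
          cumulative = 0.0
          for year, cash in enumerate(annual_cash_flows, start=1):
              cumulative += cash
              if cumulative >= investment:
                  return year
          return None

      def npv(rate, investment, annual_cash_flows):
          """Net present value of the project at the given discount rate."""
          return -investment + sum(cash / (1.0 + rate) ** t
                                   for t, cash in enumerate(annual_cash_flows, start=1))

      flows = [6000.0, 7000.0, 7500.0, 7500.0, 7500.0]   # hypothetical yearly net revenue
      print(payback_period(10000.0, flows))               # e.g. 2 -> meets the 2-year criterion
      print(round(npv(0.12, 10000.0, flows), 2))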

  9. Dark Matter

    International Nuclear Information System (INIS)

    Holt, S. S.; Bennett, C. L.

    1995-01-01

    These proceedings represent papers presented at the Astrophysics conference in Maryland, organized by NASA Goddard Space Flight Center and the University of Maryland. The topics covered included low mass stars as dark matter, dark matter in galaxies and clusters, cosmic microwave background anisotropy, cold and hot dark matter, and the large scale distribution and motions of galaxies. There were eighty five papers presented. Out of these, 10 have been abstracted for the Energy Science and Technology database

  10. Dark Matter

    International Nuclear Information System (INIS)

    Bashir, A.; Cotti, U.; De Leon, C. L.; Raya, A; Villasenor, L.

    2008-01-01

    One of the biggest scientific mysteries of our time resides in the identification of the particles that constitute a large fraction of the mass of our Universe, generically known as dark matter. We review the observations and the experimental data that imply the existence of dark matter. We briefly discuss the properties of the two best dark-matter candidate particles and the experimental techniques presently used to try to discover them. Finally, we mention a proposed project that has recently emerged within the Mexican community to look for dark matter

  11. METHODOLOGICAL ISSUES IN THE USE OF GENERALIZED ADDITIVE MODELS FOR THE ANALYSIS OF PARTICULATE MATTER; CONFERENCE PROCEEDINGS FOR 9TH INT'L. INHALATION SYMPOSIUM ON EFFECTS OF AIR CONTAMINANTS ON THE RESPIRATORY TRACT - INTERPRETATIONS FROM MOLECULES TO META ANALYSIS

    Science.gov (United States)

    Open cohort ("time-series") studies of the adverse health effects of short-term exposures to ambient particulate matter and gaseous co-pollutants have been essential in the standard setting process. Last year, a number of serious issues were raised concerning the fitting of Gener...

  12. Improvement of radiological consequence estimation methodologies for NPP accidents in the ARGOS and RODOS decision support systems through consideration of contaminant physico-chemical forms

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.G.; Roos, P. [Technical University of Denmark - DTU (Denmark); Lind, O.C.; Salbu, B. [Norwegian University of Life Sciences/CERAD - NMBU (Norway); Bujan, A.; Duranova, T. [VUJE, Inc. (Slovakia); Ikonomopoulos, A.; Andronopoulos, S. [National Centre for Scientific Research ' Demokritos' (Greece)

    2014-07-01

    The European standard computerized decision support systems RODOS and ARGOS, which are integrated in the operational nuclear emergency preparedness in practically all European countries, as well as in a range of non-European countries, are highly valuable tools for radiological consequence estimation, e.g., in connection with planning and exercising as well as in justification and optimization of intervention strategies. Differences between the Chernobyl and Fukushima accident atmospheric release source terms have demonstrated that differences in release conditions and processes may lead to very different degrees of volatilization of some radionuclides. Also the physico-chemical properties of radionuclides released can depend strongly on the release process. An example from the Chernobyl accident of the significance of this is that strontium particles released in the fire were oxidized and thus generally physico-chemically different from those released during the preceding explosion. This is reflected in the very different environmental mobility of the two groups of particles. The initial elemental matrix characteristics of the contaminants, as well as environmental parameters like pH, determine for instance the particle dissolution time functions, and thus the environmental mobility and potential for uptake in living organisms. As ICRP recommends optimization of intervention according to residual dose, it is crucial to estimate long term dose contributions adequately. In the EURATOM FP7 project PREPARE, an effort is made to integrate physico-chemical forms of contaminants in scenario-specific source term determination, thereby enabling consideration of influences on atmospheric dispersion/deposition, post-deposition migration, and effectiveness of countermeasure implementation. The first step in this context was to investigate, based on available experience, the important physico-chemical properties of radio-contaminants that might potentially be released to the

  13. Hanford Site environmental setting data developed for the unit risk factor methodology in support of the Programmatic Environmental Impact Statement (PEIS)

    International Nuclear Information System (INIS)

    Schramke, J.A.; Glantz, C.S.; Holdren, G.R.

    1994-05-01

    This report describes the environmental settings identified for the Hanford Site in support of the US Department of Energy's (DOE's) Programmatic Environmental Impact Study (PEIS). The objective of the PEIS is to provide the public with information about the types of waste and contamination problems associated with major DOE facilities across the country and to assess the relative risks that these wastes pose to the public, onsite workers, and the environment. The environmental setting information consists of the site-specific data required to model (using the Multimedia Environmental Pollutant Assessment System) the atmospheric, groundwater, and surface-water transport of contaminants within the boundaries of the Hanford Site. The environmental setting data describes the climate, atmospheric dispersion, hydrogeology, and surface-water characteristics of the Site. The number of environmental settings developed for the Hanford Site was the fewest that could provide accurate results when used in the risk assessment modeling. Environmental settings for Hanford were developed in conjunction with local experts in the fields of meteorology, geology, hydrology, and geochemistry. Site experts participated in the initial development, fine-tuning, and final review of Hanford's PEIS environmental settings

  14. Integrated micro-economic modelling and multi-criteria methodology to support public decision-making: the case of liquid bio-fuels in France

    International Nuclear Information System (INIS)

    Rozakis, S.; Sourie, J.-C.; Vanderpooten, D.

    2001-01-01

    Decision making to determine government support policy for agro-energy industry can be assisted by mathematical programming and Multiple Criteria procedures. In this case study, tax credit policy in the French bio-fuel industry producing ethanol and esters is determined. Micro-economic models simulate the agricultural sector and the bio-fuel industry through multi-level mixed integer linear programming. Aggregate supply of energy crops at the national level is estimated using a staircase model of 450 individual farm sub-models specialising in arable cropping. The government acts as a leader, since bio-fuel chains depend on subsidies. The model provides rational responses of the industry, taking into account of the energy crops' supply, to any public policy scheme (unitary tax exemptions for bio-fuels subject to budgetary constraints) as well as the performance of each response regarding total greenhouse gases emissions (GHG), budgetary expenditure and agents' surpluses. Budgetary, environmental and social concerns will affect policy decisions, and a multi-criteria optimisation module projects the decision maker aims at the closest feasible compromise solutions. When public expenditure is the first priority, the best compromise solution corresponds to tax exemptions of about 2 FF l -1 [FF: French Franc (1Euro equivalent to 6.559FF)] for ester and 3FF l -1 for ethanol (current tax exemptions amount at 2.30FF l -1 for ester and 3.30FF l -1 for ethanol). On the other hand, a priority on the reduction of GHG emissions requires an increase of ester volume produced at the expense of ethanol production (2.30 FF l -1 for both ester and ethanol chains proposed by the model). (Author)

  15. Integrated micro-economic modelling and multi-criteria methodology to support public decision-making: the case of liquid bio-fuels in France

    Energy Technology Data Exchange (ETDEWEB)

    Rozakis, S.; Sourie, J.-C. [Institut National de la Recherche Agronomique, Economie et Sociologie Rurales, Thiveral-Grignon, 78 (France); Vanderpooten, D. [Universite Paris-Dauphine, LAMSADE, Paris, 75 (France)

    2001-07-01

    Decision making to determine government support policy for agro-energy industry can be assisted by mathematical programming and Multiple Criteria procedures. In this case study, tax credit policy in the French bio-fuel industry producing ethanol and esters is determined. Micro-economic models simulate the agricultural sector and the bio-fuel industry through multi-level mixed integer linear programming. Aggregate supply of energy crops at the national level is estimated using a staircase model of 450 individual farm sub-models specialising in arable cropping. The government acts as a leader, since bio-fuel chains depend on subsidies. The model provides rational responses of the industry, taking into account of the energy crops' supply, to any public policy scheme (unitary tax exemptions for bio-fuels subject to budgetary constraints) as well as the performance of each response regarding total greenhouse gases emissions (GHG), budgetary expenditure and agents' surpluses. Budgetary, environmental and social concerns will affect policy decisions, and a multi-criteria optimisation module projects the decision maker aims at the closest feasible compromise solutions. When public expenditure is the first priority, the best compromise solution corresponds to tax exemptions of about 2 FF l{sup -1} [FF: French Franc (1Euro equivalent to 6.559FF)] for ester and 3FF l{sup -1} for ethanol (current tax exemptions amount at 2.30FF l{sup -1} for ester and 3.30FF l{sup -1} for ethanol). On the other hand, a priority on the reduction of GHG emissions requires an increase of ester volume produced at the expense of ethanol production (2.30 FF l{sup -1} for both ester and ethanol chains proposed by the model). (Author)
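 
    The leader-follower structure described in the two records above ultimately reduces, for a fixed exemption scheme, to an optimisation of bio-fuel volumes under a budget ceiling. The linear-programming sketch below is only a toy illustration of that response step; the surplus coefficients, exemption levels, budget and supply caps are invented placeholders, not values from the French models.

      # Hedged sketch: choose bio-fuel volumes to maximise a surplus measure
      # subject to a budgetary ceiling on tax exemptions (toy coefficients).
      from scipy.optimize import linprog

      # Decision variables: produced volumes (million litres) of [ester, ethanol].
      surplus_per_ml = [0.9, 0.6]          # surplus (MFF) per million litres, hypothetical
      exemption_per_l = [2.3, 3.3]         # tax exemption, FF per litre
      budget_mff = 1500.0                  # budget ceiling, million FF
      supply_cap_ml = [400.0, 300.0]       # energy-crop supply limits, million litres

      res = linprog(
          c=[-s for s in surplus_per_ml],            # linprog minimises, so negate
          A_ub=[exemption_per_l],                    # exemption cost <= budget
          b_ub=[budget_mff],
          bounds=[(0, cap) for cap in supply_cap_ml],
          method="highs",
      )
      print(res.x, -res.fun)   # optimal volumes and total surplus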

  16. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  17. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs
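 
    For the aberration-frequency approach described in the two records above, an observed yield is usually converted to an estimated dose through a linear-quadratic calibration curve. The sketch below shows that inversion only; the calibration coefficients and scoring results are invented placeholders, not a validated calibration.

      # Hedged sketch: dose estimation from a dicentric yield with Y = c + a*D + b*D^2.
      import math

      def dose_from_yield(y, c=0.001, a=0.02, b=0.06):
          """Solve c + a*D + b*D**2 = y for the positive dose D (Gy)."""
          disc = a * a - 4.0 * b * (c - y)
          if disc < 0:
              return 0.0
          return (-a + math.sqrt(disc)) / (2.0 * b)

      dicentrics, cells = 12, 500            # hypothetical scoring result
      y = dicentrics / cells                 # observed yield per cell
      print(f"estimated dose: {dose_from_yield(y):.2f} Gy")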

  18. Health-Related Coping and Social Interaction in People with Multiple Sclerosis Supported by a Social Network: Pilot Study With a New Methodological Approach.

    Science.gov (United States)

    Lavorgna, Luigi; Russo, Antonio; De Stefano, Manuela; Lanzillo, Roberta; Esposito, Sabrina; Moshtari, Fatemeh; Rullani, Francesco; Piscopo, Kyrie; Buonanno, Daniela; Brescia Morra, Vincenzo; Gallo, Antonio; Tedeschi, Gioacchino; Bonavita, Simona

    2017-07-14

    .com was perceived by users to be a useful tool to support health-related coping and social interaction, and may suggest a new kind of therapeutic alliance between physicians and people with MS. ©Luigi Lavorgna, Antonio Russo, Manuela De Stefano, Roberta Lanzillo, Sabrina Esposito, Fatemeh Moshtari, Francesco Rullani, Kyrie Piscopo, Daniela Buonanno, Vincenzo Brescia Morra, Antonio Gallo, Gioacchino Tedeschi, Simona Bonavita. Originally published in the Interactive Journal of Medical Research (http://www.i-jmr.org/), 14.07.2017.

  19. Programa de apoyo a la toma de decisiones sobre nuevas cargas en sistemas de distribución mediante margen de capacidad; Methodology for supporting the decision-making process concerning the connection of new customers to electrical distribution systems

    Directory of Open Access Journals (Sweden)

    Elvis Richard Tello Ortíz

    2011-02-01

    Full Text Available Resumen/Abstract: This article presents a new methodology whose main objective is to improve the decision policies of the commercial area of distribution utilities with regard to managing the connection of new loads, using a process that makes it possible to analyse the power capacity available in the system, implemented as an application program addressing the electrical, mechanical and economic points of view, respectively. The main feature of this program lies in the mathematical model used to represent the maximum consumption with a load connected and to calculate the power still available in the system; another important aspect is the full integration with the Geographic Information System (GIS) platforms currently used by electricity distribution utilities. This work presents a new methodology aimed at supporting the decision-making process regarding the connection of new loads in electrical distribution networks. The methodology is based upon the innovative concept of capacity margin, established within this work, by which the maximum load that can be connected to any given point is computed in advance in offline mode. The criteria for finding the maximum load are based on the common requirements of minimum voltage and maximum loading. The proposed methodology automatically eliminates the possibility of making incorrect decisions associated with conventional filtering techniques. The application of the proposed methodology is illustrated through various examples which use real distribution networks, at both low and medium voltage levels.
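 
    A deliberately simplified reading of the capacity-margin idea in the record above: for one radial feeder, the margin is the largest additional current that still respects both the minimum-voltage and the maximum-loading (ampacity) requirements. The feeder data below are invented placeholders; a real implementation would take them from the GIS platform.

      # Hedged sketch: capacity margin of a single three-phase LV feeder.
      V_SOURCE = 400.0          # line-to-line source voltage, V
      V_MIN = 0.95 * V_SOURCE   # minimum allowed voltage at the connection point
      R_FEEDER = 0.04           # feeder resistance, ohm (reactance neglected for brevity)
      I_MAX = 250.0             # conductor ampacity, A
      I_EXISTING = 180.0        # current already drawn by connected customers, A

      def capacity_margin_amps():
          thermal_headroom = I_MAX - I_EXISTING
          # Largest total current keeping V_SOURCE - sqrt(3)*R*I above V_MIN (unity pf).
          i_voltage_limit = (V_SOURCE - V_MIN) / (3.0 ** 0.5 * R_FEEDER)
          return max(0.0, min(thermal_headroom, i_voltage_limit - I_EXISTING))

      margin_a = capacity_margin_amps()
      margin_kva = 3.0 ** 0.5 * V_SOURCE * margin_a / 1000.0
      print(f"capacity margin: {margin_a:.0f} A (about {margin_kva:.0f} kVA)")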

  20. Photocatalytic degradation of dissolved organic matter in the ground water employing TiO2 film supported on stainless steel plate

    International Nuclear Information System (INIS)

    Andayani, W.; Sumartono, A.; Lindu, M.

    2012-01-01

    The Taman Palem Residences, Cengkareng, Indonesia, has a problem with its groundwater, the main source of drinking water in the area, due to the yellowish brown colour of the water, which may come from dissolved organic matter (DOM), i.e. humic substances. Photocatalytic degradation using TiO2 coated on a stainless steel plate (8 x 8 cm) to degrade the dissolved organic matter was studied. Groundwater samples were collected at 150 m depth from the Taman Palem Residences. The TiO2 catalyst was made by dip coating in a sol-gel system of titanium (IV) diisopropoxide bisacetylacetonate (TAA) precursor and immobilized on a stainless steel plate (8 x 8 cm), followed by calcination at 525°C. Two catalyst sheets were put in a batch reactor containing groundwater. The groundwater containing DOM was irradiated by UV black light at varying initial pH values, i.e. 5, 7 and 9. Samples of the solution were taken at time intervals of 0, 1, 2, 4, and 6 hours. The DOM residue in water before and after irradiation was measured with a UV-Vis spectrophotometer at 300 nm. Photocatalytic degradation of DOM was greater in acid solution than in basic solution. The determination of intermediate degradation products by HPLC revealed that oxalic acid was detected consistently. (author)
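 
    Absorbance-versus-time data of the kind described above are commonly summarised with a pseudo-first-order rate constant, ln(A0/At) = k t. The sketch below fits such a constant to invented placeholder readings; it is not the authors' data or analysis.

      # Hedged sketch: pseudo-first-order fit to absorbance readings at 300 nm.
      import numpy as np

      t_hours = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
      absorbance_300nm = np.array([0.62, 0.48, 0.38, 0.24, 0.15])   # hypothetical

      y = np.log(absorbance_300nm[0] / absorbance_300nm)
      k, intercept = np.polyfit(t_hours, y, 1)     # slope = apparent rate constant, 1/h
      half_life = np.log(2.0) / k
      print(f"k = {k:.3f} 1/h, half-life = {half_life:.1f} h")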

  1. Chemical Force Spectroscopy Evidence Supporting the Layer-by-Layer Model of Organic Matter Binding to Iron (oxy)Hydroxide Mineral Surfaces

    KAUST Repository

    Chassé , Alexander W.; Ohno, Tsutomu; Higgins, Steven R.; Amirbahman, Aria; Yildirim, Nadir; Parr, Thomas B.

    2015-01-01

    © 2015 American Chemical Society. The adsorption of dissolved organic matter (DOM) to metal (oxy)hydroxide mineral surfaces is a critical step for C sequestration in soils. Although equilibrium studies have described some of the factors controlling this process, the molecular-scale description of the adsorption process has been more limited. Chemical force spectroscopy revealed differing adhesion strengths of DOM extracted from three soils and a reference peat soil material to an iron (oxy)hydroxide mineral surface. The DOM was characterized using ultrahigh-resolution negative ion mode electrospray ionization Fourier Transform ion cyclotron resonance mass spectrometry. The results indicate that carboxyl-rich aromatic and N-containing aliphatic molecules of DOM are correlated with high adhesion forces. Increasing molecular mass was shown to decrease the adhesion force between the mineral surface and the DOM. Kendrick mass defect analysis suggests that mechanisms involving two carboxyl groups result in the most stable bond to the mineral surface. We conceptualize these results using a layer-by-layer "onion" model of organic matter stabilization on soil mineral surfaces.
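 
    The Kendrick mass defect analysis mentioned above rescales each measured m/z by 14.00000/14.01565 so that members of a CH2 homologous series share nearly the same mass defect. The sketch below illustrates the calculation with invented peak values; the sign convention varies between studies.

      # Hedged sketch: Kendrick mass defect (KMD) with the common CH2 base unit.
      CH2_NOMINAL, CH2_EXACT = 14.00000, 14.01565

      def kendrick_mass_defect(mz):
          kendrick_mass = mz * (CH2_NOMINAL / CH2_EXACT)
          return round(kendrick_mass) - kendrick_mass   # one common sign convention

      # Members of a CH2 homologous series share (nearly) the same KMD.
      for mz in (311.1687, 325.1844, 339.2000):         # hypothetical peaks
          print(f"m/z {mz:9.4f} -> KMD {kendrick_mass_defect(mz):+.4f}")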

  2. Sleep spindles: a physiological marker of age-related changes in gray matter in brain regions supporting motor skill memory consolidation.

    Science.gov (United States)

    Fogel, Stuart; Vien, Catherine; Karni, Avi; Benali, Habib; Carrier, Julie; Doyon, Julien

    2017-01-01

    Sleep is necessary for the optimal consolidation of procedural learning, and in particular, for motor sequential skills. Motor sequence learning remains intact with age, but sleep-dependent consolidation is impaired, suggesting that memory deficits for procedural skills are specifically impacted by age-related changes in sleep. Age-related changes in spindles may be responsible for impaired motor sequence learning consolidation, but the morphological basis for this deficit is unknown. Here, we found that gray matter in the hippocampus and cerebellum was positively correlated with both sleep spindles and offline improvements in performance in young participants but not in older participants. These results suggest that age-related changes in gray matter in the hippocampus relate to spindles and may underlie age-related deficits in sleep-related motor sequence memory consolidation. In this way, spindles can serve as a biological marker for structural brain changes and the related memory deficits in older adults. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Chemical Force Spectroscopy Evidence Supporting the Layer-by-Layer Model of Organic Matter Binding to Iron (oxy)Hydroxide Mineral Surfaces

    KAUST Repository

    Chassé, Alexander W.

    2015-08-18

    © 2015 American Chemical Society. The adsorption of dissolved organic matter (DOM) to metal (oxy)hydroxide mineral surfaces is a critical step for C sequestration in soils. Although equilibrium studies have described some of the factors controlling this process, the molecular-scale description of the adsorption process has been more limited. Chemical force spectroscopy revealed differing adhesion strengths of DOM extracted from three soils and a reference peat soil material to an iron (oxy)hydroxide mineral surface. The DOM was characterized using ultrahigh-resolution negative ion mode electrospray ionization Fourier Transform ion cyclotron resonance mass spectrometry. The results indicate that carboxyl-rich aromatic and N-containing aliphatic molecules of DOM are correlated with high adhesion forces. Increasing molecular mass was shown to decrease the adhesion force between the mineral surface and the DOM. Kendrick mass defect analysis suggests that mechanisms involving two carboxyl groups result in the most stable bond to the mineral surface. We conceptualize these results using a layer-by-layer "onion" model of organic matter stabilization on soil mineral surfaces.

  4. Gaseous Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Gaseous Matter focuses on the many important discoveries that led to the scientific interpretation of matter in the gaseous state. This new, full-color resource describes the basic characteristics and properties of several important gases, including air, hydrogen, helium, oxygen, and nitrogen. The nature and scope of the science of fluids is discussed in great detail, highlighting the most important scientific principles upon which the field is based. Chapters include: Gaseous Matter: An Initial Perspective. Physical Characteristics of Gases. The Rise of the Science of Gases. Kinetic Theory of

  5. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.
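 
    Analyses of this kind typically weigh a higher purchase price against discounted operating-cost savings over the product lifetime. The life-cycle-cost sketch below is only a schematic illustration of that trade-off; the prices, energy use, energy price, discount rate and lifetime are invented placeholders, not figures from the Technical Support Document.

      # Hedged sketch: life-cycle cost of a baseline vs. a higher-efficiency design.

      def life_cycle_cost(price, annual_energy_kwh, energy_price, discount_rate, lifetime_years):
          pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, lifetime_years + 1))
          return price + annual_energy_kwh * energy_price * pv_factor

      baseline = life_cycle_cost(price=400.0, annual_energy_kwh=900.0,
                                 energy_price=0.08, discount_rate=0.07, lifetime_years=12)
      efficient = life_cycle_cost(price=460.0, annual_energy_kwh=650.0,
                                  energy_price=0.08, discount_rate=0.07, lifetime_years=12)
      print(f"LCC baseline: ${baseline:.0f}, efficient: ${efficient:.0f}, savings: ${baseline - efficient:.0f}")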

  6. Methodology for clinical trials involving patients with cancer who have febrile neutropenia: updated guidelines of the Immunocompromised Host Society/Multinational Association for Supportive Care in Cancer, with emphasis on outpatient studies.

    Science.gov (United States)

    Feld, Ronald; Paesmans, Marianne; Freifeld, Alison G; Klastersky, Jean; Pizzo, Philip A; Rolston, Kenneth V I; Rubenstein, Edward; Talcott, James A; Walsh, Thomas J

    2002-12-15

    Two multinational organizations, the Immunocompromised Host Society and the Multinational Association for Supportive Care in Cancer, have produced for investigators and regulatory bodies a set of guidelines on methodology for clinical trials involving patients with febrile neutropenia. The guidelines suggest that response (i.e., success of initial empirical antibiotic therapy without any modification) be determined at 72 h and again on day 5, and the reasons for modification should be stated. Blinding and stratification are to be encouraged, as should statistical consideration of trials specifically designed for showing equivalence. Patients enrolled in outpatient studies should be selected by use of a validated risk model, and patients should be carefully monitored after discharge from the hospital. Response and safety parameters should be recorded along with readmission rates. If studies use these guidelines, comparisons between studies will be simpler and will lead to further improvements in patient therapy.

  7. Dark matters

    International Nuclear Information System (INIS)

    Silk, Joseph

    2010-01-01

    One of the greatest mysteries in the cosmos is that it is mostly dark. That is, not only is the night sky dark, but also most of the matter and the energy in the universe is dark. For every atom visible in planets, stars and galaxies today there exists at least five or six times as much 'Dark Matter' in the universe. Astronomers and particle physicists today are seeking to unravel the nature of this mysterious but pervasive dark matter, which has profoundly influenced the formation of structure in the universe. Dark energy remains even more elusive, as we lack candidate fields that emerge from well established physics. I will describe various attempts to measure dark matter by direct and indirect means, and discuss the prospects for progress in unravelling dark energy.

  8. Dirac matter

    CERN Document Server

    Rivasseau, Vincent; Fuchs, Jean-Nöel

    2017-01-01

    This fifteenth volume of the Poincare Seminar Series, Dirac Matter, describes the surprising resurgence, as a low-energy effective theory of conducting electrons in many condensed matter systems, including graphene and topological insulators, of the famous equation originally invented by P.A.M. Dirac for relativistic quantum mechanics. In five highly pedagogical articles, as befits their origin in lectures to a broad scientific audience, this book explains why Dirac matters. Highlights include the detailed "Graphene and Relativistic Quantum Physics", written by the experimental pioneer, Philip Kim, and devoted to graphene, a form of carbon crystallized in a two-dimensional hexagonal lattice, from its discovery in 2004-2005 by the future Nobel prize winners Kostya Novoselov and Andre Geim to the so-called relativistic quantum Hall effect; the review entitled "Dirac Fermions in Condensed Matter and Beyond", written by two prominent theoreticians, Mark Goerbig and Gilles Montambaux, who consider many other mater...

  9. EVALUATION OF THE GRAI INTEGRATED METHODOLOGY AND THE IMAGIM SUPPORTWARE

    Directory of Open Access Journals (Sweden)

    J.M.C. Reid

    2012-01-01

    Full Text Available This paper describes the GRAI Integrated Methodology and identifies the need for computer tools to support enterprise modelling, design and integration. The IMAGIM tool is then evaluated in terms of its ability to support the GRAI Integrated Methodology. The GRAI Integrated Methodology is an Enterprise Integration methodology developed to support the design of CIM systems. The GRAI Integrated Methodology consists of the GRAI model and a structured approach. The latest addition to the methodology is the IMAGIM software tool developed by the GRAI research group for the specific purpose of supporting the methodology.

  10. International Expert Review of Sr-Can: Safety Assessment Methodology - External review contribution in support of SSI's and SKI's review of SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Sagar, Budhi (Center for Nuclear Waste Regulatory Analyses, Southwest Research Inst., San Antonio, TX (US)); Egan, Michael (Quintessa Limited, Henley-on-Thames (GB)); Roehlig, Klaus-Juergen (Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (DE)); Chapman, Neil (Independent Consultant (XX)); Wilmot, Roger (Galson Sciences Limited, Oakham (GB))

    2008-03-15

    In 2006, SKB published a safety assessment (SR-Can) as part of its work to support a licence application for the construction of a final repository for spent nuclear fuel. The purposes of the SR-Can project were stated in the main project report to be: 1. To make a first assessment of the safety of potential KBS-3 repositories at Forsmark and Laxemar to dispose of canisters as specified in the application for the encapsulation plant. 2. To provide feedback to design development, to SKB's research and development (R and D) programme, to further site investigations and to future safety assessments. 3. To foster a dialogue with the authorities that oversee SKB's activities, i.e. the Swedish Nuclear Power Inspectorate, SKI, and the Swedish Radiation Protection Authority, SSI, regarding interpretation of applicable regulations, as a preparation for the SR-Site project. To help inform their review of SKB's proposed approach to development of the longterm safety case, the authorities appointed three international expert review teams to carry out a review of SKB's SR-Can safety assessment report. Comments from one of these teams - the Safety Assessment Methodology (SAM) review team - are presented in this document. The SAM review team's scope of work included an examination of SKB's documentation of the assessment ('Long-term safety for KBS-3 Repositories at Forsmark and Laxemar - a first evaluation' and several supporting reports) and hearings with SKB staff and contractors, held in March 2007. As directed by SKI and SSI, the SAM review team focused on methodological aspects and sought to determine whether SKB's proposed safety assessment methodology is likely to be suitable for use in the future SR-Site and to assess its consistency with the Swedish regulatory framework. No specific evaluation of long-term safety or site acceptability was undertaken by any of the review teams. SKI and SSI's Terms of Reference for the SAM

  11. Men Do Matter: Ethnographic Insights on the Socially Supportive Role of the African American Uncle in the Lives of Inner-City African American Male Youth

    Science.gov (United States)

    Richardson, Joseph B., Jr.

    2009-01-01

    This article examines the role of the African American uncle as a vital yet overlooked form of social support and social capital in the lives of adolescent African American male sons living in single-female-headed households. Research rarely examines the affective roles and functions of men in Black families; moreover, poor urban Black male youth…

  12. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best practice approaches which combine proven existing techniques for sampling and characterisation to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for the piloting of remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with exploratory analysis and variogram data, and the identification of singular points and their location. The results obtained permit the production of a map identifying the contaminated surface and subsurface areas. It paves the way for radiological site characterisation, from the initial investigations based on historical and functional analysis through to checking that the remediation objectives have been met. An example application follows, drawn from feedback from the remediation of a contaminated site at the Fontenay-aux-Roses facility. The report is supplemented by a glossary of the main terms used in the field, taken from various publications and international standards. This technical report supports the ISO Standard ISO/TC 85/SC 5 N 18557 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors)
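 
    The exploratory variogram step mentioned above can be illustrated with an empirical semivariogram computed from scattered activity measurements. The sketch below uses randomly generated placeholder data; it only shows the binning of squared differences by separation distance.

      # Hedged sketch: empirical semivariogram of synthetic soil-activity samples.
      import numpy as np

      rng = np.random.default_rng(0)
      xy = rng.uniform(0.0, 100.0, size=(200, 2))                    # sample positions, m
      activity = 50.0 + 0.3 * xy[:, 0] + rng.normal(0.0, 5.0, 200)   # Bq/g, synthetic

      def empirical_semivariogram(xy, z, bin_edges):
          d = np.sqrt(((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1))   # pairwise distances
          sq_diff = 0.5 * (z[:, None] - z[None, :]) ** 2                  # semivariance terms
          iu = np.triu_indices(len(z), k=1)
          d, sq_diff = d[iu], sq_diff[iu]
          gamma = [sq_diff[(d >= lo) & (d < hi)].mean()
                   for lo, hi in zip(bin_edges[:-1], bin_edges[1:])]
          return np.array(gamma)

      bins = np.linspace(0.0, 60.0, 7)
      print(np.round(empirical_semivariogram(xy, activity, bins), 1))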

  13. Quark matter

    Energy Technology Data Exchange (ETDEWEB)

    Csernai, L.; Kampert, K. H.

    1994-10-15

    Precisely one decade ago the GSI (Darmstadt)/LBL (Berkeley) Collaboration at the Berkeley Bevalac reported clear evidence for collective sidewards flow in high energy heavy ion collisions. This milestone observation clearly displayed the compression and heating up of nuclear matter, providing new insights into how the behaviour of nuclear matter changes under very different conditions. This year, evidence for azimuthally asymmetric transverse flow at ten times higher projectile energy (11 GeV per nucleon gold on gold collisions) was presented by the Brookhaven E877 collaboration at the recent European Research Conference on ''Physics of High Energy Heavy Ion Collisions'', held in Helsinki from 17-22 June.

  14. Self-compassion matters: The relationships between perceived social support, self-compassion, and subjective well-being among LGB individuals in Turkey.

    Science.gov (United States)

    Toplu-Demirtaş, Ezgi; Kemer, Gülşah; Pope, Amber L; Moe, Jeffry L

    2018-04-01

    Research on the well-being of lesbian, gay, and bisexual (LGB) people has predominantly focused on Western (-ized) societies where individualism, and not collectivism, is emphasized. In the present study, we utilized a mediator model via Structural Equation Modeling (SEM) to examine the relationships between self-compassion (i.e., self-kindness, common humanity, and mindfulness), perceived social support (i.e., family, friends, and significant others), and subjective well-being (i.e., life satisfaction, positive affect, and negative affect) in a sample of LGB-identified individuals living in Turkey, a traditionally collectivistic culture (Hofstede, 2001). A sample of 291 LGB individuals (67 lesbian, 128 gay, and 96 bisexual) completed an online survey including the Positive and Negative Affect Schedule, Satisfaction with Life Scale, Multidimensional Scale of Perceived Social Support, and the Self-kindness, Common Humanity, and Mindfulness subscales of the Self-Compassion Scale. The results of SEM for the hypothesized mediator model revealed that self-compassion mediated the relationships between perceived social support from family and significant others and subjective well-being, explaining 77% of the variance in subjective well-being. Implications for the literature base on LGB well-being are discussed, with a focus on cross-cultural applications. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
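    As a purely illustrative companion to the analysis described above, the sketch below runs a simplified regression-based mediation analysis (in the spirit of Baron and Kenny) on synthetic data; it is a simplification of, not a substitute for, the SEM model the authors estimated, and all variable names and effect sizes are assumptions.

```python
# Simplified, regression-based mediation sketch (in the spirit of Baron and Kenny)
# on synthetic data. The study itself estimated a full SEM model; variable names and
# effect sizes here are assumptions for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 291
support = rng.normal(0, 1, n)                           # perceived social support
self_compassion = 0.5 * support + rng.normal(0, 1, n)   # mediator
well_being = 0.1 * support + 0.6 * self_compassion + rng.normal(0, 1, n)

def ols(y, *predictors):
    """Ordinary least squares with an intercept."""
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, X).fit()

total = ols(well_being, support)                        # c path (total effect)
a_path = ols(self_compassion, support)                  # a path
direct = ols(well_being, support, self_compassion)      # c' and b paths

indirect = a_path.params[1] * direct.params[2]          # a*b, the mediated effect
print(f"total effect c   : {total.params[1]:.3f}")
print(f"direct effect c' : {direct.params[1]:.3f}")
print(f"indirect a*b     : {indirect:.3f}")
```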

  15. Dark Matter

    Indian Academy of Sciences (India)

    As if this was not enough, it turns out that if our knowledge of ... are thought to contain dark matter, although the evidences from them are the .... protons, electrons, neutrons ... ratio of protons to neutrons was close to unity then as they were in ...

  16. Quantum matter

    International Nuclear Information System (INIS)

    Buechler, Hans Peter; Calcarco, Tommaso; Dressel, Martin

    2008-01-01

    The following topics are dealt with: Artificial atoms and molecules, tailored from solids, fractional flux quanta, molecular magnets, controlled interaction in quantum gases, the theory of quantum correlations in Mott matter, cold gases, and mesoscopic systems, Bose-Einstein condensates on the chip, on the route to the quantum computer, a quantum computer in diamond. (HSI)

  17. Molecule Matters

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 14; Issue 4. Molecule Matters – van der Waals Molecules - History and Some Perspectives on Intermolecular Forces. E Arunan. Feature Article Volume 14 Issue 4 April 2009 pp 346-356 ...

  18. Molecule Matters

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 16; Issue 12. Molecule Matters - Dinitrogen. A G Samuelson J Jabadurai. Volume 16 Issue 12 ... Author Affiliations. A G Samuelson1 J Jabadurai1. Department of Inorganic and Physical Chemistry, Indian Institute of Science, Bangalore 560 012, India.

  19. Interstellar matter

    International Nuclear Information System (INIS)

    Mezger, P.G.

    1978-01-01

    An overview of the formation of our galaxy is presented, followed by a summary of recent work on star formation and related topics. Selected discussions are given on interstellar matter, including the absorption characteristics of dust, the fully ionised component of the ISM, the energy density of Lyc photons in the solar neighbourhood, and the diffuse galactic IR radiation

  20. Dark Matter

    Indian Academy of Sciences (India)

    The study of gas clouds orbiting in the outer regions of spiral galaxies has revealed that their gravitational at- traction is much larger than the stars alone can provide. Over the last twenty years, astronomers have been forced to postulate the presence of large quantities of 'dark matter' to explain their observations. They are ...

  1. Molecule Matters

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 11; Issue 9. Molecule Matters - A Chromium Compound with a Quintuple Bond. K C Kumara Swamy. Feature Article Volume 11 Issue 9 September 2006 pp 72-75. Fulltext. Click here to view fulltext PDF. Permanent link:

  2. Dark Matter

    International Nuclear Information System (INIS)

    Audouze, J.; Tran Thanh Van, J.

    1988-01-01

    The book begins with the papers devoted to the experimental search for signatures of the dark matter which governs the evolution of the Universe as a whole. A series of contributions describe the experimental techniques presently considered (cryogenic detectors, superconducting detectors...). A real dialogue concerning these techniques has been established between particle physicists and astrophysicists. After the progress report of the particle physicists, the book provides the reader with an updated picture of research in cosmology. The second part of the book is devoted to the analysis of the backgrounds at different energies, such as the possible role of cooling flows in the constitution of massive galactic halos. Any search for dark matter necessarily implies the analysis of the spatial distribution of the large-scale structures of the Universe. This report is followed by a series of statistical analyses of these distributions. These analyses concern mainly universes filled with cold dark matter. The last paper of this third part concerns the search for clustering in the spatial distribution of QSOs. The presence of dark matter should affect the solar neighborhood and is related to the existence of galactic haloes. The corresponding contributions are devoted to the search for such local dark matter. Primordial nucleosynthesis provides a very powerful tool to set quite constraining limits on the overall baryonic density. Even if one takes into account the inhomogeneities in density possibly induced by the quark-hadron transition, this baryonic density should be much lower than the overall density deduced from dynamical models of the Universe or from inflationary theories

  3. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States); Hesse, D [Battelle Columbus Labs., OH (United States); Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States); Lazaro, M. [Argonne National Lab., IL (United States); Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  4. Separation of BSA through FAU-type zeolite ceramic composite membrane formed on tubular ceramic support: Optimization of process parameters by hybrid response surface methodology and biobjective genetic algorithm.

    Science.gov (United States)

    Vinoth Kumar, R; Ganesh Moorthy, I; Pugazhenthi, G

    2017-08-09

    In this study, Faujasite (FAU) zeolite was coated onto a low-cost tubular ceramic support as a separating layer via a hydrothermal route. A mixture of silicate and aluminate solutions was used to create the zeolitic separation layer on the support. The prepared zeolite ceramic composite membrane was characterized using X-ray powder diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), particle size distribution (PSD), field emission scanning electron microscopy (FESEM), and zeta potential measurements. The porosity of the ceramic support (53%) was reduced by the deposition of the FAU zeolite layer (to 43%). The pore size and water permeability of the membrane were evaluated as 0.179 µm and 1.62 × 10⁻⁷ m³/m² s kPa, respectively, which are lower than those of the support (pore size of 0.309 µm and water permeability of 5.93 × 10⁻⁷ m³/m² s kPa). The permeate flux and rejection potential of the prepared membrane were evaluated by microfiltration of bovine serum albumin (BSA). Response surface methodology (RSM) was used to study the influence of three independent variables, operating pressure (68.94-275.79 kPa), BSA concentration (100-500 ppm), and solution pH (2-4), on permeate flux and percentage rejection. The predicted models for permeate flux and rejection were further subjected to a biobjective genetic algorithm (GA). The hybrid RSM-GA approach yielded a maximum permeate flux of 2.66 × 10⁻⁵ m³/m² s and a BSA rejection of 88.02%, with the optimum conditions being a BSA concentration of 100 ppm, solution pH 2, and an applied pressure of 275.79 kPa. In addition, the separation efficiency was compared with that of other membranes applied for BSA separation to assess the potential of the fabricated FAU zeolite ceramic composite membrane.
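    To illustrate the kind of computation behind a hybrid RSM-optimization step, the sketch below fits a quadratic response-surface model to hypothetical microfiltration data and optimizes a weighted (scalarized) combination of flux and rejection with an evolutionary optimizer; the data values, model form and single-objective scalarization are assumptions and differ from the paper's biobjective genetic algorithm.

```python
# Illustrative sketch: fit a quadratic response-surface model to hypothetical
# microfiltration data, then optimize a weighted (scalarized) combination of flux and
# rejection with an evolutionary optimizer. Data values, model form and the
# single-objective scalarization are assumptions; the paper used a biobjective GA.
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical design points: pressure (kPa), BSA concentration (ppm), pH
X = np.array([
    [68.94, 100, 2], [68.94, 500, 4], [172.37, 300, 3],
    [275.79, 100, 2], [275.79, 500, 4], [172.37, 100, 3],
    [68.94, 300, 2], [275.79, 300, 3], [172.37, 500, 4],
])
flux = np.array([1.1, 0.6, 1.4, 2.5, 1.2, 1.6, 0.9, 2.0, 0.8])   # 1e-5 m3/m2 s (made up)
rejection = np.array([82, 70, 78, 88, 74, 80, 84, 81, 72])        # percent (made up)

def quad_terms(x):
    """Quadratic response-surface terms: intercept, linear and squared effects."""
    p, c, h = x
    return np.array([1.0, p, c, h, p * p, c * c, h * h])

A = np.array([quad_terms(x) for x in X])
beta_flux, *_ = np.linalg.lstsq(A, flux, rcond=None)
beta_rej, *_ = np.linalg.lstsq(A, rejection, rcond=None)

def objective(x, w=0.5):
    """Negative weighted sum of (roughly normalized) predicted flux and rejection."""
    return -(w * (quad_terms(x) @ beta_flux) / 2.5 + (1 - w) * (quad_terms(x) @ beta_rej) / 90.0)

bounds = [(68.94, 275.79), (100, 500), (2, 4)]
result = differential_evolution(objective, bounds, seed=0)
print("optimum (pressure kPa, BSA ppm, pH):", result.x)
print("predicted flux and rejection:", quad_terms(result.x) @ beta_flux, quad_terms(result.x) @ beta_rej)
```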

  5. Biogas and reduction of organic matter in anaerobic reactor with continuous flow means support; Producao de biogas e reducao de materia organica em reatores anaerobicos de fluxo continuo com meio suporte

    Energy Technology Data Exchange (ETDEWEB)

    Kunzler, Kathia Regina; Gomes, Simone Damasceno; Goncalves, Jefferson Luiz; Kuczman, Osvaldo [Universidade Estadual do Oeste do Parana (PGEAGRI/UNIOESTE), Cascavel, PR (Brazil). Programa de Pos-Graduacao em Engenharia Agricola], Emails: kathiark@yahoo.com.br, simoned@unioeste.br; Piana, Pitagoras Augusto [Universidade Estadual do Oeste do Parana (UNIOESTE), Toledo, PR (Brazil)

    2010-07-01

    Starch processing industries produce cassava starch; their main residue is manipueira, the effluent from pressing the roots, which has a high organic load and is toxic. In this study, we compared the organic load removal efficiency and biogas production of anaerobic reactors with bamboo as support medium, built with different width-to-height ratios. The first had a diameter of 15 cm and a length of 90 cm, a ratio of 1:6, and the second had a diameter of 20 cm and a length of 60 cm, a ratio of 1:3. The support medium consisted of bamboo rings 10 cm long with diameters between 1.7 and 2.5 cm. The applied loads were 0.519, 1.156, 1.471, 3.813, 4.347, 4.708 and 5.601 gCOD/L.day. To evaluate the organic matter removal efficiency, the samples were subjected to COD analysis; biogas production was assessed in terms of organic load removed. Bamboo as a support medium allowed the application of higher loads. The higher biogas production efficiency was obtained in the reactor with the highest width-to-height ratio, most markedly at the organic load of 5.601 gCOD/L.day, which also showed greater stability. (author)
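    For readers unfamiliar with the quantities compared above, the sketch below shows the elementary calculations involved, COD removal efficiency and biogas yield per unit of COD removed; all numbers are illustrative assumptions rather than the study's measurements.

```python
# Elementary calculations behind the comparison above: COD removal efficiency and
# biogas yield per unit of COD removed. All numbers are illustrative assumptions,
# not the study's measurements.
def cod_removal_efficiency(cod_in_g_per_l: float, cod_out_g_per_l: float) -> float:
    """Fraction of the influent COD removed in the reactor."""
    return (cod_in_g_per_l - cod_out_g_per_l) / cod_in_g_per_l

def biogas_yield(biogas_l_per_day: float, flow_l_per_day: float,
                 cod_in_g_per_l: float, cod_out_g_per_l: float) -> float:
    """Litres of biogas produced per gram of COD removed."""
    cod_removed_g_per_day = flow_l_per_day * (cod_in_g_per_l - cod_out_g_per_l)
    return biogas_l_per_day / cod_removed_g_per_day

print(f"removal efficiency: {cod_removal_efficiency(10.0, 2.5):.0%}")
print(f"biogas yield: {biogas_yield(18.0, 10.0, 10.0, 2.5):.2f} L per g COD removed")
```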

  6. STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS

    Science.gov (United States)

    The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...

  7. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  8. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This allows one to change from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation changes the priorities of this complex research and the relation to knowledge: not only “the knowledge about something”, but also knowledge as a means of activity, since control and restructuring of matter at the nano-scale is from the beginning a necessary element of nanoscience.

  9. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the end of the Cold War and because of remarkable political changes in the world, it has become possible to move the problem of environmental restoration from the scientific sphere into particularly practical terms. The very first steps clearly showed the imperfection of existing technologies and managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  10. Disposal Of Waste Matter

    International Nuclear Information System (INIS)

    Kim, Jeong Hyeon; Lee, Seung Mu

    1989-02-01

    This book deals with the disposal and management of solid waste in cities. It covers an introduction, the definition of waste, the meaning of waste management, waste management systems, current conditions in the country, collection and transportation of waste, disposal of liquid waste, industrial wastes such as plastics, waste gas sludge, pulp and sulfuric acid, and waste recycling technologies such as the Black Clawson, Monroe and Rome recycling systems.

  11. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  12. Quark matter

    International Nuclear Information System (INIS)

    Csernai, L.; Kampert, K.H.

    1994-01-01

    Precisely one decade ago the GSI (Darmstadt)/LBL (Berkeley) Collaboration at the Berkeley Bevalac reported clear evidence for collective sidewards flow in high energy heavy ion collisions. This milestone observation clearly displayed the compression and heating up of nuclear matter, providing new insights into how the behaviour of nuclear matter changes under very different conditions. This year, evidence for azimuthally asymmetric transverse flow at ten times higher projectile energy (11 GeV per nucleon gold on gold collisions) was presented by the Brookhaven E877 collaboration at the recent European Research Conference on ''Physics of High Energy Heavy Ion Collisions'', held in Helsinki from 17-22 June

  13. Elementary process theory: a formal axiomatic system with a potential application as a foundational framework for physics supporting gravitational repulsion of matter and antimatter

    International Nuclear Information System (INIS)

    Cabbolet, M.J.T.F.

    2010-01-01

    Theories of modern physics predict that antimatter having rest mass will be attracted by the earth's gravitational field, but the actual coupling of antimatter with gravitation has not been established experimentally. The purpose of the present research was to identify laws of physics that would govern the universe if antimatter having rest mass would be repulsed by the earth's gravitational field. As a result, a formalized axiomatic system was developed together with interpretation rules for the terms of the language: the intention is that every theorem of the system yields a true statement about physical reality. Seven non-logical axioms of this axiomatic system form the elementary process theory (EPT): this is then a scheme of elementary principles describing the dynamics of individual processes taking place at supersmall scale. It is demonstrated how gravitational repulsion functions in the universe of the EPT, and some observed particles and processes have been formalized in the framework of the EPT. Incompatibility of quantum mechanics (QM) and General Relativity (GR) with the EPT is proven mathematically; to demonstrate applicability to real world problems to which neither QM nor GR applies, the EPT has been applied to a theory of the Planck era of the universe. The main conclusions are that a completely formalized framework for physics has been developed supporting the existence of gravitational repulsion and that the present results give rise to a potentially progressive research program. (Abstract Copyright [2010], Wiley Periodicals, Inc.)

  14. Web-based modelling of energy, water and matter fluxes to support decision making in mesoscale catchments??the integrative perspective of GLOWA-Danube

    Science.gov (United States)

    Ludwig, R.; Mauser, W.; Niemeyer, S.; Colgan, A.; Stolz, R.; Escher-Vetter, H.; Kuhn, M.; Reichstein, M.; Tenhunen, J.; Kraus, A.; Ludwig, M.; Barth, M.; Hennicker, R.

    The GLOWA initiative (Global Change of the water cycle), funded by the German Ministry of Research and Education (BMBF), has been established to address the manifold consequences of Global Change on regional water resources in a variety of catchment areas with different natural and cultural characteristics. Within this framework, the GLOWA-Danube project deals with the Upper Danube watershed as a representative mesoscale test site (∼75,000 km²) for mountain-foreland regions in the temperate mid-latitudes. The principal objective is to identify, examine and develop new techniques of coupled distributed modelling for the integration of natural and socio-economic sciences. The transdisciplinary research in GLOWA-Danube develops an integrated decision support system, called DANUBIA, to investigate the sustainability of future water use. GLOWA-Danube, which is scheduled for a total run-time of eight years to operationally implement and establish DANUBIA, comprises a university-based network of experts with water-related competence in the fields of engineering, natural and social sciences. Co-operation with a network of stakeholders in water resources management of the Upper Danube catchment ensures that practical issues and future problems in the water sector of the region can be addressed. In order to synthesize a common understanding between the project partners, a standardized notation of parameters and functions and a platform-independent structure of computational methods and interfaces have been established by making use of the Unified Modelling Language, an industry standard for the structuring and co-ordination of large projects in software development [Booch et al., The Unified Modelling Language User Guide, Addison-Wesley, Reading, 1999]. DANUBIA is object-oriented, spatially distributed and raster-based at its core. It applies the concept of “proxels” (process pixels) as its basic objects, which have different dimensions depending on the viewing
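    To make the “proxel” idea more tangible, the sketch below shows one way such a raster-cell object could be represented; the class name, attributes and the toy water-balance update are purely illustrative assumptions and do not reproduce the actual DANUBIA interfaces.

```python
# Minimal illustrative sketch of the "proxel" (process pixel) idea described above.
# All class, attribute and method names are assumptions for illustration; they are
# not the actual DANUBIA interfaces.
from dataclasses import dataclass, field

@dataclass
class Proxel:
    """One raster cell that carries its own state and exchanges fluxes with models."""
    row: int
    col: int
    size_m: float                      # cell edge length in metres
    elevation_m: float = 0.0
    soil_moisture: float = 0.2         # volumetric fraction
    land_use: str = "grassland"
    fluxes: dict = field(default_factory=dict)   # e.g. {"evapotranspiration": 3.2}

    def update_water_balance(self, precipitation_mm: float, et_mm: float) -> None:
        """Toy water balance: store the net input as a soil-moisture change (no runoff)."""
        active_depth_mm = 1000.0       # assumed active soil depth
        self.soil_moisture += (precipitation_mm - et_mm) / active_depth_mm
        self.soil_moisture = min(max(self.soil_moisture, 0.0), 0.45)

# A catchment is then simply a grid of proxels that the component models iterate over.
grid = [[Proxel(r, c, size_m=1000.0) for c in range(3)] for r in range(2)]
for row_of_cells in grid:
    for cell in row_of_cells:
        cell.update_water_balance(precipitation_mm=5.0, et_mm=3.2)
print(grid[0][0].soil_moisture)
```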

  15. Media Matter

    Directory of Open Access Journals (Sweden)

    Holger Pötzsch

    2017-02-01

    Full Text Available The present contribution maps materialist advances in media studies. Based on the assumption that matter and materiality constitute significant aspects of communication processes and practices, I introduce four fields of inquiry - technology, political economy, ecology, and the body - and argue that these perspectives enable a more comprehensive understanding of the implications of contemporary technologically afforded forms of interaction. The article shows how each perspective can balance apologetic and apocalyptic approaches to the impact of digital technologies in particular, before it demonstrates the applicability of an integrated framework with reference to the techno-politics of NSA surveillance and the counter-practices of WikiLeaks.

  16. Play Matters

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    In Play Matters, Miguel Sicart argues that to play is to be in the world; playing is a form of understanding what surrounds us and a way of engaging with others. Play goes beyond games; it is a mode of being human. We play games, but we also play with toys, on playgrounds, with technologies and design......, but not necessarily fun. Play can be dangerous, addictive, and destructive. Along the way, Sicart considers playfulness, the capacity to use play outside the context of play; toys, the materialization of play--instruments but also play pals; playgrounds, play spaces that enable all kinds of play; beauty...

  17. Does Market Remoteness Matter?

    OpenAIRE

    Moctar, Ndiaye; Elodie, Maitre d’Hôtel; Tristan, Le Cotty

    2015-01-01

    This paper addresses the role of market remoteness in explaining maize price volatility in Burkina Faso. A model of price formation is introduced to demonstrate formally that transport costs between urban and rural markets exacerbate maize price volatility. Empirical support is provided to the proposition by exploring an unusually rich data set of monthly maize price series across 28 markets over 2004-13. The methodology relies on an autoregressive conditional heteroskedasticity model to inve...

  18. Alternative Perspectives on Sustainability: Indigenous Knowledge and Methodologies

    Directory of Open Access Journals (Sweden)

    Meg Parsons

    2017-02-01

    Full Text Available Indigenous knowledge (IK) is now recognized as being critical to the development of effective, equitable and meaningful strategies to address socio-ecological crises. However, efforts to integrate IK and Western science frequently encounter difficulties due to different systems of knowledge production and underlying worldviews. New approaches are needed so that sustainability can progress on the terms that matter the most for the people involved. In this paper we discuss a case study from Aotearoa New Zealand where an indigenous community is in the process of renegotiating and enacting new indigenous-led approaches to address coupled socio-ecological crises. We reflect on novel methodological approaches that highlight the ways in which projects/knowledge are co-produced by a multiplicity of human and non-human actors. To this end we draw on conceptualizations of environmental ethics offered by indigenous scholars and propose alternative bodies of thought, methods, and practices that can support the wider sustainability agenda.

  19. Marginal Matter

    Science.gov (United States)

    van Hecke, Martin

    2013-03-01

    All around us, things are falling apart. The foam on our cappuccinos appears solid, but gentle stirring irreversibly changes its shape. Skin, a biological fiber network, is firm when you pinch it, but soft under light touch. Sand mimics a solid when we walk on the beach but a liquid when we pour it out of our shoes. Crucially, a marginal point separates the rigid or jammed state from the mechanical vacuum (freely flowing) state - at their marginal points, soft materials are neither solid nor liquid. Here I will show how the marginal point gives birth to a third sector of soft matter physics: intrinsically nonlinear mechanics. I will illustrate this with shock waves in weakly compressed granular media, the nonlinear rheology of foams, and the nonlinear mechanics of weakly connected elastic networks.

  20. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that safety class items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the 'safety function' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control Safety Class Items listed above
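    The cascading selection strategy described above can be illustrated schematically as follows; the criteria strings, data layout and example items are simplified assumptions for illustration and are not the actual SRS procedure.

```python
# Schematic sketch of the cascading safety-class selection logic described above.
# The criteria strings, data layout and example items are simplified assumptions for
# illustration, not the actual SRS procedure.
SAFETY_CLASS_CRITERIA = {
    "offsite exposure above guideline",
    "maintains operational safety limits",
    "nuclear criticality safety",
    "post-accident release monitoring",
    "safe shutdown",
    "controls another safety class item",
}

def classify_items(safety_functions):
    """Cascade from safety functions down to the systems/components that support them."""
    safety_class_items = set()
    for function in safety_functions:
        if not any(c in SAFETY_CLASS_CRITERIA for c in function["criteria"]):
            continue  # this function meets none of the safety-class criteria
        # every system, structure or component required by the function becomes an SCI
        safety_class_items.update(function["supporting_items"])
    return sorted(safety_class_items)

cooling = {
    "name": "decay heat removal",
    "criteria": ["safe shutdown"],
    "supporting_items": ["cooling pump P-1", "heat exchanger HX-2", "emergency power bus"],
}
ventilation = {
    "name": "confinement ventilation",
    "criteria": ["post-accident release monitoring"],
    "supporting_items": ["exhaust fan F-3", "HEPA filter bank"],
}
print(classify_items([cooling, ventilation]))
```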

  1. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors

  2. Cosmology and Dark Matter

    CERN Document Server

    Tkachev, Igor

    2017-01-01

    This lecture course covers cosmology from the particle physicist perspective. Therefore, the emphasis will be on the evidence for the new physics in cosmological and astrophysical data together with minimal theoretical frameworks needed to understand and appreciate the evidence. I review the case for non-baryonic dark matter and describe popular models which incorporate it. In parallel, the story of dark energy will be developed, which includes accelerated expansion of the Universe today, the Universe origin in the Big Bang, and support for the Inflationary theory in CMBR data.

  3. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Full Text Available Previous investigations by others and internal investigations at Philip Morris International (PMI) have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine), is not suitable for high-water content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad as well as during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed out of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine Free Dry Particulate Matter (NFDPM). A modified 44 mm Cambridge filter pad holder and extraction equipment which support an in situ extraction methodology have been developed and tested. The principle of the in situ extraction methodology is to avoid any of the above-mentioned water losses by extracting the loaded filter pad while it is kept in the Cambridge filter pad holder, which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under the ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and
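    The quantity at stake in the abstract above can be written down directly: NFDPM ("tar") is total particulate matter minus its water and nicotine content, so any unrecovered water inflates the NFDPM value by the same amount. The sketch below illustrates this with made-up numbers.

```python
# NFDPM ("tar") is total particulate matter minus its water and nicotine content
# (ISO 4387 terminology). The numbers below are made-up illustrative values, not
# measured data.
def nfdpm_mg(total_particulate_matter_mg: float, water_mg: float, nicotine_mg: float) -> float:
    """NFDPM = TPM - water - nicotine."""
    return total_particulate_matter_mg - water_mg - nicotine_mg

# If water is under-recovered (e.g. lost while opening the pad holder), NFDPM is
# overestimated by exactly the amount of water that went unmeasured:
tpm, nicotine = 55.0, 1.5               # mg per collection, hypothetical
true_water, measured_water = 40.0, 32.0
print(nfdpm_mg(tpm, true_water, nicotine))      # 13.5 mg, the "true" value
print(nfdpm_mg(tpm, measured_water, nicotine))  # 21.5 mg, overestimated
```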

  4. Baryonic matter and beyond

    OpenAIRE

    Fukushima, Kenji

    2014-01-01

    We summarize recent developments in identifying the ground state of dense baryonic matter and beyond. The topics include deconfinement from baryonic matter to quark matter, a diquark mixture, topological effect coupled with chirality and density, and inhomogeneous chiral condensates.

  5. Silicon quantum dots: surface matters

    Czech Academy of Sciences Publication Activity Database

    Dohnalová, K.; Gregorkiewicz, T.; Kůsová, Kateřina

    2014-01-01

    Roč. 26, č. 17 (2014), 1-28 ISSN 0953-8984 R&D Projects: GA ČR GPP204/12/P235 Institutional support: RVO:68378271 Keywords : silicon quantum dots * quantum dot * surface chemistry * quantum confinement Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.346, year: 2014

  6. Front Matter

    Directory of Open Access Journals (Sweden)

    HLRC Editor

    2016-08-01

    Full Text Available Higher Learning Research Communications (HLRC, ISSN: 2157-6254 [Online]) is published collaboratively by Walden University (USA), Universidad Andrés Bello (Chile), Universidad Europea de Madrid (Spain) and Istanbul Bilgi University (Turkey). Written communication to HLRC should be addressed to the office of the Executive Director at Laureate Education, Inc., 701 Brickell Ave Ste. 1700, Miami, FL 33131, USA. HLRC is designed for open access and online distribution through www.hlrcjournal.com. The views and statements expressed in this journal do not necessarily reflect the views of Laureate Education, Inc. or any of its affiliates (collectively “Laureate”). Laureate does not warrant the accuracy, reliability, currency or completeness of those views or statements and does not accept any legal liability arising from any reliance on the views, statements and subject matter of the journal. Acknowledgements: The Guest Editors gratefully acknowledge the substantial contribution of the readers for the blind peer review of essays submitted for this special issue as exemplars of individuals from around the world who have come together in a collective endeavor for the common good: Robert Bringle (Indiana University Purdue University Indianapolis, US), Linda Buckley (University of the Pacific, US), Guillermo Calleja (Universidad Rey Juan Carlos, Spain), Eva Egron-Polak (International Association of Universities, France), Heather Friesen (Abu Dhabi University, UAE), Saran Gill (National University of Malaysia, Malaysia), Chester Haskell (higher education consultant, US), Kanokkarn Kaewnuch (National Institute for Development Administration, Thailand), Gil Latz (Indiana University Purdue University Indianapolis, US), Molly Lee (higher education consultant, Malaysia), Deane Neubauer (East-West Center at University of Hawaii, US), Susan Sutton (Bryn Mawr College, US), Francis Wambalaba (United States International University, Kenya), and Richard Winn (higher education

  7. Management matters.

    Science.gov (United States)

    Gould, Rebecca A; Canter, Deborah

    2008-11-01

    Fewer than 50% of registered dietitians (RDs) supervise personnel and 76% have no budget authority. Because higher salaries are tied to increasing levels of authority and responsibility, RDs must seek management and leadership roles to enjoy the increased remuneration tied to such positions. Advanced-level practice in any area of dietetics demands powerful communication abilities, proficiency in budgeting and finance, comfort with technology, higher-order decision-making/problem-solving skills, and well-honed human resource management capabilities, all foundational to competent management practice. As RDs envision the future of the dietetics profession, practitioners must evaluate management competence in both hard and soft skills. Just as research is needed to support evidenced-based clinical practice, the same is needed to support management practice across the profession. Dietetics educators and preceptors should be as enthusiastic about management practice as they are clinical practice when educating and mentoring future professionals. Such encouragement and support can mean that new RDs and dietetic technicians, registered, will understand what it takes to advance to higher levels of responsibility, authority, and subsequent enhanced remuneration. In the ever-changing social, legal, ethical, political, economic, technological, and ecological environments of work, food and nutrition professionals who are willing to step forward and assume the risks and responsibilities of management also will share in the rewards, and propel the profession to new heights of recognition and respect.

  8. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  9. Dark matter in spiral galaxies

    International Nuclear Information System (INIS)

    Albada, T.S. van; Sancisi, R.

    1986-01-01

    Mass models of spiral galaxies based on the observed light distribution, assuming constant M/L for bulge and disc, are able to reproduce the observed rotation curves in the inner regions, but fail to do so increasingly towards and beyond the edge of the visible material. The discrepancy in the outer region can be accounted for by invoking dark matter; some galaxies require at least four times as much dark matter as luminous matter. There is no evidence for a dependence on galaxy luminosity or morphological type. Various arguments support the idea that a distribution of visible matter with constant M/L is responsible for the circular velocity in the inner region, i.e. inside approximately 2.5 disc scalelengths. Luminous matter and dark matter seem to 'conspire' to produce the flat observed rotation curves in the outer region. It seems unlikely that this coupling between disc and halo results from the large-scale gravitational interaction between the two components. Attempts to determine the shape of dark halos have not yet produced convincing results. (author)
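    The rotation-curve argument summarized above can be illustrated with a small calculation: the circular velocity implied by the visible disc alone falls off beyond a few scale lengths, while adding a dark halo keeps the curve much flatter. All parameter values below are assumptions chosen for illustration, not fits to any particular galaxy.

```python
# Illustrative rotation-curve calculation: circular velocity from the visible disc
# alone versus disc plus a dark halo. All parameter values are assumptions chosen for
# illustration, not fits to any particular galaxy.
import numpy as np

G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / Msun

def v_circ(enclosed_mass_msun, r_kpc):
    """Circular velocity v = sqrt(G M(<r) / r)."""
    return np.sqrt(G * enclosed_mass_msun / r_kpc)

r = np.linspace(0.5, 30, 60)                 # galactocentric radius in kpc
M_disc, R_d = 5e10, 3.0                      # disc mass (Msun) and scale length (kpc)
# Mass of an exponential disc enclosed within radius r (treated spherically for simplicity)
M_lum = M_disc * (1 - (1 + r / R_d) * np.exp(-r / R_d))
# Pseudo-isothermal halo: M(<r) grows ~linearly at large r, flattening the curve
rho0, r_c = 1.0e8, 2.0                       # central density (Msun/kpc^3), core radius (kpc)
M_halo = 4 * np.pi * rho0 * r_c**2 * (r - r_c * np.arctan(r / r_c))

for radius, v_l, v_t in zip(r[::12], v_circ(M_lum, r)[::12], v_circ(M_lum + M_halo, r)[::12]):
    print(f"r = {radius:5.1f} kpc   luminous only: {v_l:6.1f} km/s   with halo: {v_t:6.1f} km/s")
```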

  10. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The used methodology is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments its meta-model.

  11. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analyses then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations

  12. Size matter!

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg; Jespersen, Andreas Maaløe; Skov, Laurits Rhoden

    2015-01-01

    trash bags according to size of plates and weighed in bulk. Results: Those eating from smaller plates (n=145) left significantly less food to waste (average 14.8 g) than participants eating from standard plates (n=75) (average 20 g), amounting to a reduction of 25.8%. Conclusions: Our field experiment tests the hypothesis that a decrease in the size of food plates may lead to significant reductions in food waste from buffets. It supports and extends the set of circumstances in which a recent experiment found that reduced dinner plates in a hotel chain led to reduced quantities of leftovers.

  13. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The research presented in this paper concerns the definition of a methodology for developing credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against risk exposure, i.e. to reduce losses on lending operations in our country, as well as to adjust to market conditions in an optimal way.
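    As a trivial illustration of the financial-statement screening step in such an analysis, the sketch below computes a few commonly used creditworthiness ratios from made-up figures; the ratio set and threshold are assumptions, not the methodology described in the paper.

```python
# Trivial illustration of the financial-statement screening step in a credit analysis:
# a few commonly used ratios computed from made-up balance-sheet and income-statement
# figures. The ratio set and threshold are assumptions, not the paper's methodology.
def credit_ratios(current_assets, current_liabilities, total_debt, total_assets,
                  ebit, interest_expense):
    return {
        "current_ratio": current_assets / current_liabilities,
        "debt_to_assets": total_debt / total_assets,
        "interest_coverage": ebit / interest_expense,
    }

applicant = credit_ratios(current_assets=1_200_000, current_liabilities=800_000,
                          total_debt=2_000_000, total_assets=5_000_000,
                          ebit=600_000, interest_expense=150_000)
for name, value in applicant.items():
    print(f"{name:18s} {value:5.2f}")
# A bank would compare such ratios against internal rating thresholds, for example:
print("acceptable leverage" if applicant["debt_to_assets"] < 0.6 else "high leverage")
```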

  14. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  15. Practical Matters

    DEFF Research Database (Denmark)

    Elle, Birgitte

    Denmark has recently, on a small scale, tried out new initiatives and ways of thinking about teacher education, which implied the students' simultaneous trainee employment at a school and their maintaining of study activities at the teacher university college. This Danish initiative differs from ways of organising teacher education that are well known in other Western countries, such as school-based models with a strong workplace focus, as well as from the use of teacher assistants as support staff in schools. This paper discusses key findings and some theoretical implications from the simultaneous follow-up research. The follow-up research made use of notions of decentredness in ‘situated learning theory’ (Lave & Wenger) and of ‘communities of practice’ (Wenger), and this theoretical basis was further expanded with notions of power relations and de-naturalised perspectives (Foucault), and of neo-liberalism (Rose).

  16. Conducting compositions of matter

    Science.gov (United States)

    Viswanathan, Tito (Inventor)

    2000-01-01

    The invention provides conductive compositions of matter, as well as methods for the preparation of the conductive compositions of matter, solutions comprising the conductive compositions of matter, and methods of preparing fibers or fabrics having improved anti-static properties employing the conductive compositions of matter.

  17. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in the preparation of the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis...

  18. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the ''Methodologies, Languages and Tools'' session at the CHEP'94 conference. All the contributions on methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  19. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices in CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. Its application provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  1. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  2. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  3. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  4. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination
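    The statistical, fixed-measurement part of such a survey can be illustrated with a small calculation that compares an upper confidence bound on the mean reading against a release limit; the data values, limit and decision rule below are assumptions for illustration, not the procedure specified in the report.

```python
# Illustrative sketch of the statistical, fixed-measurement step: compare a one-sided
# upper confidence bound on the mean reading against a release limit. The readings,
# limit and decision rule are assumptions, not the procedure specified in the report.
import math
from statistics import mean, stdev
from scipy.stats import t

def upper_confidence_bound(measurements, confidence=0.95):
    """One-sided upper confidence bound on the mean, using the t-distribution."""
    n = len(measurements)
    return mean(measurements) + t.ppf(confidence, n - 1) * stdev(measurements) / math.sqrt(n)

# Hypothetical fixed measurements at systematically chosen points (dpm/100 cm^2)
readings = [210, 340, 180, 420, 260, 390, 310, 150, 280, 230]
release_limit = 1000.0   # hypothetical limit for the relevant contaminant class

ucb = upper_confidence_bound(readings)
print(f"95% upper confidence bound on the mean: {ucb:.0f} dpm/100 cm^2")
print("release supported" if ucb < release_limit else "further survey required")
```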

  5. Condensed elementary particle matter

    International Nuclear Information System (INIS)

    Kajantie, K.

    1996-01-01

    Quark matter is a special case of condensed elementary particle matter, matter governed by the laws of particle physics. The talk discusses how far one can get in the study of particle matter by reducing the problem to computations based on the action. As an example the computation of the phase diagram of electroweak matter is presented. It is quite possible that ultimately an antireductionist attitude will prevail: experiments will reveal unpredicted phenomena not obviously reducible to the study of the action. (orig.)

  6. Comparison of the sanitary effects of energy chains. Methodological aspects

    International Nuclear Information System (INIS)

    Fagnani, F.

    1979-01-01

    Beyond technical and economic matters, the development of an industrial technology involves more or less numerous indirect consequences. From this viewpoint, the author analyses the methodological problems raised in evaluating the sanitary and ecological effects of the different energy-producing chains, and considers successively the matter of technical interdependences, protection and safety regulations and selection of sites, classification of risks, and measurement problems in relation to sanitary effects [fr

  7. Fundamentals of carbon dioxide-enhanced oil recovery (CO2-EOR): a supporting document of the assessment methodology for hydrocarbon recovery using CO2-EOR associated with carbon sequestration

    Science.gov (United States)

    Verma, Mahendra K.

    2015-01-01

    The objective of this report is to provide basic technical information regarding the CO2-EOR process, which is at the core of the assessment methodology, to estimate the technically recoverable oil within the fields of the identified sedimentary basins of the United States. Emphasis is on CO2-EOR because this is currently one technology being considered as an ultimate long-term geologic storage solution for CO2 owing to its economic profitability from incremental oil production offsetting the cost of carbon sequestration.

  8. Principles of Thermodynamics, one of the supports of the green economy and the role of the school in their awareness

    Directory of Open Access Journals (Sweden)

    Juan R. Cardentey Lorente

    2008-09-01

    Full Text Available A methodological demand has gained strength in recent decades: to think critically about economic reality, taking into account certain fundamental principles and processes generalized to higher states of the development of matter. In this way, the principles of thermodynamics or biological evolution appear as epistemological perspectives on knowledge. The question is how to design a new “economy of sustainability” which does not destroy the natural resources and the ecological systems that support it. Thus the eco-economy emerges, aiming at the reconstruction of the biophysical bases of the economic process and calling for economic efficiency, social justice and sustainability.

  9. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  10. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information to a statistical/probabilistic methodology in the investigation of a denoising problem...

  11. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  12. Soil organic matter studies

    International Nuclear Information System (INIS)

    1977-01-01

    A total of 77 papers were presented and discussed during this symposium; 37 are included in this Volume II. The topics covered in this volume include: biochemical transformation of organic matter in soils; bitumens in soil organic matter; characterization of humic acids; carbon dating of organic matter in soils; use of modern techniques in soil organic matter research; use of municipal sludge with special reference to heavy metal constituents, soil nitrogen, and physical and chemical properties of soils; relationship of soil organic matter and plant metabolism; interaction between agrochemicals and organic matter; and peat. Separate entries have been prepared for those 20 papers which discuss the use of nuclear techniques in these studies

  13. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts, the purpose of which is to meet the needs of users in standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of sustainable development accounting, covering both standardized and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  14. Topical issues of psychological research materials on matters related to extremism

    Directory of Open Access Journals (Sweden)

    Sekerazh T.N.

    2014-12-01

    Full Text Available The article deals with the methodological support of psychological and linguistic research on "extremist" materials. It presents a comprehensive psycho-linguistic approach to the examination of information materials on matters related to combating extremism and terrorism, and certain provisions of the methodology developed by the Russian federal center of judicial examination of the Ministry of Justice of the Russian Federation. Based on the analysis of "verbal" crimes related to the criminal legal interpretation of extremism and terrorism, the types of prohibited public expression of communicative action are highlighted, corresponding to the seven types of "extremist" values. The article outlines the key features of the psychological analysis of "extremist" materials and the stages of such research. It is shown that the complex (psycho-linguistic) approach to the study of materials of extremist orientation is scientifically sound, methodically proven, and appropriate to the needs of law enforcement, judicial and investigative practice.

  15. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers methodological foundations of the system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on axiomatics of cultural-historical psychology of L.S. Vygotsky and transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in a constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept “dynamics of the paradigm of science” is introduced; it allows transitions to be acknowledged from ordinary-binary logic characteristics of the classical science to a binary-ternary logic, adequate to non-classical science and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of generative effect of selective interaction. The concept of “complimentary interaction” applied in natural as well as humanitarian sciences is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe embracing all kinds of systems including the living ones. Different levels of matter organization representing semantic structures of various complexity use one and the same principle of meaning making through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for nature and stages of emergence of multidimensional life space of an individual, which comes as a foundation for generation of such features of

  16. Baryonic Dark Matter

    OpenAIRE

    Silk, Joseph

    1994-01-01

    In the first two of these lectures, I present the evidence for baryonic dark matter and describe possible forms that it may take. The final lecture discusses formation of baryonic dark matter, and sets the cosmological context.

  17. Grammar of the matter

    International Nuclear Information System (INIS)

    Jacob, M.

    1992-01-01

    In this paper, the author describes the structure of matter and presents the families of elementary particles (fermions) and the interaction messengers (bosons) with their properties. He presents the current status and future trends of research on nuclear matter

  18. Dark matter detectors

    International Nuclear Information System (INIS)

    Forster, G.

    1995-01-01

    A fundamental question of astrophysics and cosmology is the nature of dark matter. Astrophysical observations show clearly the existence of some kind of dark matter, though they cannot yet reveal its nature. Dark matter can consist of baryonic particles, or of other (known or unknown) elementary particles. Baryonic dark matter probably exists in the form of dust, gas, or small stars. Other elementary particles constituting the dark matter can possibly be measured in terrestrial experiments. Possibilities for dark matter particles are neutrinos, axions and weakly interacting massive particles (WIMPs). While a direct detection of relic neutrinos seems at the moment impossible, there are experiments looking for baryonic dark matter in the form of Massive Compact Halo Objects, and for particle dark matter in the form of axions and WIMPS. (orig.)

  19. Danish emission inventory for particulate matter (PM)

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M; Winther, M; Illerup, J B; Hjort Mikkelsen, M

    2003-11-01

    The first Danish emission inventory that was reported in 2002 was a provisional estimate based on data available at the time. This report documents the methodology, emission factors and references used for an improved Danish emission inventory for particulate matter. Further, results of the improved emission inventory for the year 2000 are shown. The particulate matter emission inventory includes TSP, PM10 and PM2.5. The report covers emission inventories for transport and stationary combustion. An appendix covering emissions from agriculture is also included. For the transport sector, both exhaust and non-exhaust emissions such as tyre and brake wear and road abrasion are included. (au)

  20. The Evidence-base for Using Ontologies and Semantic Integration Methodologies to Support Integrated Chronic Disease Management in Primary and Ambulatory Care: Realist Review. Contribution of the IMIA Primary Health Care Informatics WG.

    Science.gov (United States)

    Liyanage, H; Liaw, S-T; Kuziemsky, C; Terry, A L; Jones, S; Soler, J K; de Lusignan, S

    2013-01-01

    Most chronic diseases are managed in primary and ambulatory care. The chronic care model (CCM) suggests a wide range of community, technological, team and patient factors contribute to effective chronic disease management. Ontologies have the capability to enable formalised linkage of heterogeneous data sources as might be found across the elements of the CCM. To describe the evidence base for using ontologies and other semantic integration methods to support chronic disease management. We reviewed the evidence-base for the use of ontologies and other semantic integration methods within and across the elements of the CCM. We report them using a realist review describing the context in which the mechanism was applied, and any outcome measures. Most evidence was descriptive with an almost complete absence of empirical research and important gaps in the evidence-base. We found some use of ontologies and semantic integration methods for community support of the medical home and for care in the community. Ubiquitous information technology (IT) and other IT tools were deployed to support self-management support, use of shared registries, health behavioural models and knowledge discovery tools to improve delivery system design. Data quality issues restricted the use of clinical data; however there was an increased use of interoperable data and health system integration. Ontologies and semantic integration methods are emergent with limited evidence-base for their implementation. However, they have the potential to integrate the disparate community wide data sources to provide the information necessary for effective chronic disease management.

  1. Dense Cold Matter

    Directory of Open Access Journals (Sweden)

    Stavinskiy Alexey

    2014-04-01

    Full Text Available A possible way to create dense cold baryonic matter in the laboratory is discussed. The density of this matter is comparable to or even larger than the density of a neutron star core. The properties of this matter can be controlled by trigger conditions. An experimental program for the study of the properties of dense cold matter in light and heavy ion collisions at the initial energy range √s_NN ~ 2-3 GeV is proposed.

  2. Dark Matter Effective Theory

    DEFF Research Database (Denmark)

    Del Nobile, Eugenio; Sannino, Francesco

    2012-01-01

    We organize the effective (self)interaction terms for complex scalar dark matter candidates which are either an isosinglet, isodoublet or an isotriplet with respect to the weak interactions. The classification has been performed ordering the operators in inverse powers of the dark matter cutoff scale. We assume Lorentz invariance, color and charge neutrality. We also introduce potentially interesting dark matter induced flavor-changing operators. Our general framework allows for model independent investigations of dark matter properties.

  3. Nonthermal Supermassive Dark Matter

    Science.gov (United States)

    Chung, Daniel J. H.; Kolb, Edward W.; Riotto, Antonio

    1999-01-01

    We discuss several cosmological production mechanisms for nonthermal supermassive dark matter and argue that dark matter may be elementary particles of mass much greater than the weak scale. Searches for dark matter should not be limited to weakly interacting particles with mass of the order of the weak scale, but should extend into the supermassive range as well.

  4. Nonthermal Supermassive Dark Matter

    International Nuclear Information System (INIS)

    Chung, D.J.; Chung, D.J.; Kolb, E.W.; Kolb, E.W.; Riotto, A.

    1998-01-01

    We discuss several cosmological production mechanisms for nonthermal supermassive dark matter and argue that dark matter may be elementary particles of mass much greater than the weak scale. Searches for dark matter should not be limited to weakly interacting particles with mass of the order of the weak scale, but should extend into the supermassive range as well. copyright 1998 The American Physical Society

  5. Nonthermal Supermassive Dark Matter

    OpenAIRE

    Chung, Daniel J. H.; Kolb, Edward W.; Riotto, Antonio

    1998-01-01

    We discuss several cosmological production mechanisms for nonthermal supermassive dark matter and argue that dark matter may be elementary particles of mass much greater than the weak scale. Searches for dark matter should not be limited to weakly interacting particles with mass of the order of the weak scale, but should extend into the supermassive range as well.

  6. Matter and Energy

    CERN Document Server

    Karam, P Andrew

    2011-01-01

    In Matter and Energy, readers will learn about the many forms of energy, the wide variety of particles in nature, and Albert Einstein's world-changing realization of how matter can be changed into pure energy. The book also examines the recent discoveries of dark matter and dark energy and the future of the universe.

  7. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
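    As a rough illustration of how such an analysis combines building protection with population distribution, the sketch below computes a population-weighted dose reduction from an assumed shelter mix. The building categories, protection factors and population fractions are hypothetical placeholders, not values from the LLNL methodology.

```python
# Hypothetical building categories with fallout protection factors (PF)
# and the fraction of the regional population sheltered in each.
shelter_mix = {
    "wood-frame house (ground floor)": {"pf": 3.0,  "fraction": 0.40},
    "brick house (basement)":          {"pf": 20.0, "fraction": 0.25},
    "multi-story office (inner core)": {"pf": 50.0, "fraction": 0.20},
    "outdoors / minimal shelter":      {"pf": 1.0,  "fraction": 0.15},
}

def population_weighted_dose_fraction(mix):
    """Fraction of the unsheltered outdoor dose received, averaged over the
    population: sum of (population fraction) / (protection factor)."""
    assert abs(sum(v["fraction"] for v in mix.values()) - 1.0) < 1e-9
    return sum(v["fraction"] / v["pf"] for v in mix.values())

outdoor_dose_rate = 10.0  # hypothetical outdoor dose rate, arbitrary units
received = outdoor_dose_rate * population_weighted_dose_fraction(shelter_mix)
print(f"Population-averaged dose rate: {received:.2f} (vs {outdoor_dose_rate} outdoors)")
```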

  8. Hanford Site baseline risk assessment methodology

    International Nuclear Information System (INIS)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site

  9. Hanford Site Risk Assessment Methodology. Revision 3

    International Nuclear Information System (INIS)

    1995-05-01

    This methodology has been developed to prepare human health and ecological evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) remedial investigations (RI) and the Resource Conservation and Recovery Act of 1976 (RCRA) facility investigations (FI) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order (Ecology et al. 1994), referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies site-specific risk assessment considerations and integrates them with approaches for evaluating human and ecological risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

  10. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global... ...of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use...

  11. PREFACE: Quark Matter 2008

    Science.gov (United States)

    Jan-e Alam; Subhasis Chattopadhyay; Tapan Nayak

    2008-10-01

    Quark Matter 2008—the 20th International Conference on Ultra-Relativistic Nucleus-Nucleus Collisions was held in Jaipur, the Pink City of India, from 4-10 February, 2008. Organizing Quark Matter 2008 in India itself indicates the international recognition of the Indian contribution to the field of heavy-ion physics, which was initiated and nurtured by Bikash Sinha, Chair of the conference. The conference was inaugurated by the Honourable Chief Minister of Rajasthan, Smt. Vasundhara Raje followed by the key note address by Professor Carlo Rubbia. The scientific programme started with the theoretical overview, `SPS to RHIC and onwards to LHC' by Larry McLerran followed by several theoretical and experimental overview talks on the ongoing experiments at SPS and RHIC. The future experiments at the LHC, FAIR and J-PARC, along with the theoretical predictions, were discussed in great depth. Lattice QCD predictions on the nature of the phase transition and critical point were vigorously debated during several plenary and parallel session presentations. The conference was enriched by the presence of an unprecedented number of participants; about 600 participants representing 31 countries across the globe. This issue contains papers based on plenary talks and oral presentations presented at the conference. Besides invited and contributed talks, there were also a large number of poster presentations. Members of the International Advisory Committee played a pivotal role in the selection of speakers, both for plenary and parallel session talks. The contributions of the Organizing Committee in all aspects, from helping to prepare the academic programme down to arranging local hospitality, were much appreciated. We thank the members of both the committees for making Quark Matter 2008 a very effective and interesting platform for scientific deliberations. Quark Matter 2008 was financially supported by: Air Liquide (New Delhi) Board of Research Nuclear Sciences (Mumbai) Bose

  12. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  13. Secretly asymmetric dark matter

    Science.gov (United States)

    Agrawal, Prateek; Kilic, Can; Swaminathan, Sivaramakrishnan; Trendafilova, Cynthia

    2017-01-01

    We study a mechanism where the dark matter number density today arises from asymmetries generated in the dark sector in the early Universe, even though the total dark matter number remains zero throughout the history of the Universe. The dark matter population today can be completely symmetric, with annihilation rates above those expected from thermal weakly interacting massive particles. We give a simple example of this mechanism using a benchmark model of flavored dark matter. We discuss the experimental signatures of this setup, which arise mainly from the sector that annihilates the symmetric component of dark matter.

  14. Dark Matter Caustics

    International Nuclear Information System (INIS)

    Natarajan, Aravind

    2010-01-01

    The continuous infall of dark matter with low velocity dispersion in galactic halos leads to the formation of high density structures called caustics. Dark matter caustics are of two kinds : outer and inner. Outer caustics are thin spherical shells surrounding galaxies while inner caustics have a more complicated structure that depends on the dark matter angular momentum distribution. The presence of a dark matter caustic in the plane of the galaxy modifies the gas density in its neighborhood which may lead to observable effects. Caustics are also relevant to direct and indirect dark matter searches.

  15. Dark Matter Searches

    International Nuclear Information System (INIS)

    Moriyama, Shigetaka

    2008-01-01

    Recent cosmological as well as historical observations of rotational curves of galaxies strongly suggest the existence of dark matter. It is also widely believed that dark matter consists of unknown elementary particles. However, astrophysical observations based on gravitational effects alone do not provide sufficient information on the properties of dark matter. In this study, the status of dark matter searches is investigated by observing high-energy neutrinos from the sun and the earth and by observing nuclear recoils in laboratory targets. The successful detection of dark matter by these methods facilitates systematic studies of its properties. Finally, the XMASS experiment, which is due to start at the Kamioka Observatory, is introduced

  16. Implications of the DAMA and CRESST experiments for mirror matter-type dark matter

    International Nuclear Information System (INIS)

    Foot, R.

    2004-01-01

    Mirror atoms are expected to be a significant component of the galactic dark matter halo if mirror matter is identified with the nonbaryonic dark matter in the Universe. Mirror matter can interact with ordinary matter via gravity and via the photon-mirror photon kinetic mixing interaction--causing mirror charged particles to couple to ordinary photons with an effective electric charge εe. This means that the nuclei of mirror atoms can elastically scatter off the nuclei of ordinary atoms, leading to nuclear recoils, which can be detected in existing dark matter experiments. We show that the dark matter experiments most sensitive to this type of dark matter candidate (via the nuclear recoil signature) are the DAMA/NaI and CRESST/Sapphire experiments. Furthermore, we show that the impressive annual modulation signal obtained by the DAMA/NaI experiment can be explained by mirror matter-type dark matter for |ε| ∼ 5×10^-9 and is supported by DAMA's absolute rate measurement as well as the CRESST/Sapphire data. This value of |ε| is consistent with the value obtained from various solar system anomalies including the Pioneer spacecraft anomaly, anomalous meteorite events and lack of small craters on the asteroid Eros. It is also consistent with standard big bang nucleosynthesis

  17. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of these technologies is dynamic PSA, in which conventional ET/FT models can include time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history is long and its academic achievements are substantial, it has drawn less interest from the industrial and regulatory viewpoint. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most methodologies share similar concepts. Among them, the Discrete Dynamic Event Tree (DDET) seems to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  18. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of these technologies is dynamic PSA, in which conventional ET/FT models can include time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history is long and its academic achievements are substantial, it has drawn less interest from the industrial and regulatory viewpoint. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most methodologies share similar concepts. Among them, the Discrete Dynamic Event Tree (DDET) seems to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
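    To illustrate the discrete dynamic event tree (DDET) idea mentioned in the two records above, the toy sketch below expands an event tree at fixed branching times, with branching probabilities that depend on the accumulated system state. It is only a schematic of the general concept, not the algorithm of any particular dynamic PSA tool; all event names, times and probabilities are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    time: float          # time of the last branching (s)
    prob: float          # path probability accumulated so far
    state: dict = field(default_factory=dict)

def expand(branches, t, branching_rules):
    """One DDET-style expansion step: at time t, every open branch splits
    according to condition-dependent branching probabilities."""
    new = []
    for b in branches:
        for outcome, p in branching_rules(t, b.state).items():
            state = dict(b.state, **{f"event@{t}s": outcome})
            new.append(Branch(time=t, prob=b.prob * p, state=state))
    return new

# Hypothetical rules: pump failure probability grows after an overheat event.
def rules(t, state):
    if t == 60:
        return {"overheat": 0.1, "nominal": 0.9}
    p_fail = 0.20 if state.get("event@60s") == "overheat" else 0.05
    return {"pump fails": p_fail, "pump runs": 1.0 - p_fail}

tree = [Branch(time=0.0, prob=1.0)]
for t in (60, 120, 180):          # discrete branching times
    tree = expand(tree, t, rules)

for b in sorted(tree, key=lambda x: -x.prob)[:5]:
    print(f"p={b.prob:.4f}  {b.state}")
```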

  19. Impeded Dark Matter

    Energy Technology Data Exchange (ETDEWEB)

    Kopp, Joachim; Liu, Jia [PRISMA Cluster of Excellence & Mainz Institute for Theoretical Physics, Johannes Gutenberg University, Staudingerweg 7, 55099 Mainz (Germany); Slatyer, Tracy R. [Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wang, Xiao-Ping [PRISMA Cluster of Excellence & Mainz Institute for Theoretical Physics, Johannes Gutenberg University, Staudingerweg 7, 55099 Mainz (Germany); Xue, Wei [Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)]

    2016-12-12

    We consider dark matter models in which the mass splitting between the dark matter particles and their annihilation products is tiny. Compared to the previously proposed Forbidden Dark Matter scenario, the mass splittings we consider are much smaller, and are allowed to be either positive or negative. To emphasize this modification, we dub our scenario “Impeded Dark Matter”. We demonstrate that Impeded Dark Matter can be easily realized without requiring tuning of model parameters. For negative mass splitting, we demonstrate that the annihilation cross-section for Impeded Dark Matter depends linearly on the dark matter velocity or may even be kinematically forbidden, making this scenario almost insensitive to constraints from the cosmic microwave background and from observations of dwarf galaxies. Accordingly, it may be possible for Impeded Dark Matter to yield observable signals in clusters or the Galactic center, with no corresponding signal in dwarfs. For positive mass splitting, we show that the annihilation cross-section is suppressed by the small mass splitting, which helps light dark matter to survive increasingly stringent constraints from indirect searches. As specific realizations for Impeded Dark Matter, we introduce a model of vector dark matter from a hidden SU(2) sector, and a composite dark matter scenario based on a QCD-like dark sector.

  20. Impeded Dark Matter

    International Nuclear Information System (INIS)

    Kopp, Joachim; Liu, Jia; Slatyer, Tracy R.; Wang, Xiao-Ping; Xue, Wei

    2016-01-01

    We consider dark matter models in which the mass splitting between the dark matter particles and their annihilation products is tiny. Compared to the previously proposed Forbidden Dark Matter scenario, the mass splittings we consider are much smaller, and are allowed to be either positive or negative. To emphasize this modification, we dub our scenario “Impeded Dark Matter”. We demonstrate that Impeded Dark Matter can be easily realized without requiring tuning of model parameters. For negative mass splitting, we demonstrate that the annihilation cross-section for Impeded Dark Matter depends linearly on the dark matter velocity or may even be kinematically forbidden, making this scenario almost insensitive to constraints from the cosmic microwave background and from observations of dwarf galaxies. Accordingly, it may be possible for Impeded Dark Matter to yield observable signals in clusters or the Galactic center, with no corresponding signal in dwarfs. For positive mass splitting, we show that the annihilation cross-section is suppressed by the small mass splitting, which helps light dark matter to survive increasingly stringent constraints from indirect searches. As specific realizations for Impeded Dark Matter, we introduce a model of vector dark matter from a hidden SU(2) sector, and a composite dark matter scenario based on a QCD-like dark sector.
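    For orientation, the schematic non-relativistic kinematics below indicates why a small mass splitting δ between the dark matter χ and its annihilation products controls the velocity dependence described in the two records above. This is a back-of-the-envelope s-wave phase-space estimate under assumed notation, not the detailed calculation of the paper.

```latex
% Schematic s-wave phase-space scaling for \chi\chi \to \psi\psi with
% m_\psi = m_\chi + \delta/2 (total splitting \delta, |\delta| \ll m_\chi):
\begin{equation}
  p_f \simeq \frac{m_\chi}{2}\sqrt{\,v_{\mathrm{rel}}^2 - \frac{4\delta}{m_\chi}\,},
  \qquad
  \sigma v_{\mathrm{rel}} \;\propto\; p_f
  \;\propto\; \sqrt{\,v_{\mathrm{rel}}^2 - \frac{4\delta}{m_\chi}\,}.
\end{equation}
% For \delta > 0 annihilation is kinematically closed unless
% v_{rel}^2 > 4\delta/m_\chi (suppression); for \delta < 0 with
% |\delta| \ll m_\chi v_{rel}^2 the rate is approximately linear in v_{rel}.
```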

  1. Collapsed Dark Matter Structures.

    Science.gov (United States)

    Buckley, Matthew R; DiFranzo, Anthony

    2018-02-02

    The distributions of dark matter and baryons in the Universe are known to be very different: The dark matter resides in extended halos, while a significant fraction of the baryons have radiated away much of their initial energy and fallen deep into the potential wells. This difference in morphology leads to the widely held conclusion that dark matter cannot cool and collapse on any scale. We revisit this assumption and show that a simple model where dark matter is charged under a "dark electromagnetism" can allow dark matter to form gravitationally collapsed objects with characteristic mass scales much smaller than that of a Milky-Way-type galaxy. Though the majority of the dark matter in spiral galaxies would remain in the halo, such a model opens the possibility that galaxies and their associated dark matter play host to a significant number of collapsed substructures. The observational signatures of such structures are not well explored but potentially interesting.

  2. Collapsed Dark Matter Structures

    Science.gov (United States)

    Buckley, Matthew R.; DiFranzo, Anthony

    2018-02-01

    The distributions of dark matter and baryons in the Universe are known to be very different: The dark matter resides in extended halos, while a significant fraction of the baryons have radiated away much of their initial energy and fallen deep into the potential wells. This difference in morphology leads to the widely held conclusion that dark matter cannot cool and collapse on any scale. We revisit this assumption and show that a simple model where dark matter is charged under a "dark electromagnetism" can allow dark matter to form gravitationally collapsed objects with characteristic mass scales much smaller than that of a Milky-Way-type galaxy. Though the majority of the dark matter in spiral galaxies would remain in the halo, such a model opens the possibility that galaxies and their associated dark matter play host to a significant number of collapsed substructures. The observational signatures of such structures are not well explored but potentially interesting.

  3. Sterile neutrino dark matter

    CERN Document Server

    Merle, Alexander

    2017-01-01

    This book is a new look at one of the hottest topics in contemporary science, Dark Matter. It is the pioneering text dedicated to sterile neutrinos as candidate particles for Dark Matter, challenging some of the standard assumptions which may be true for some Dark Matter candidates but not for all. So, this can be seen either as an introduction to a specialized topic or an out-of-the-box introduction to the field of Dark Matter in general. No matter if you are a theoretical particle physicist, an observational astronomer, or a ground based experimentalist, no matter if you are a grad student or an active researcher, you can benefit from this text, for a simple reason: a non-standard candidate for Dark Matter can teach you a lot about what we truly know about our standard picture of how the Universe works.

  4. On indexes and subject matter of “global competitiveness”

    Directory of Open Access Journals (Sweden)

    A. V. Korotkov

    2017-01-01

    Full Text Available The aim of the research is to analyze the subject matter of a country’s competitiveness and to characterize statistical indexes of competitiveness known in the international practice from the perspective of a more elaborated theory of market competition. This aim follows from the identified problems. First, there is no generally accepted interpretation or shared understanding of competition and competitiveness at country level. Even the international organizations giving estimations of global competitiveness disagree on definitions of competitiveness. Secondly, the available source materials on the competitiveness of a country that lack an original methodology make no reference to the theory of market competition. Thirdly, well-known statistical indexes of global competitiveness do not have enough theoretical justification and differ in sets of factors. All this highlights the incompleteness of the methodology and methodological support of studying competitiveness at country level. Materials and methods. The research is based on the methodology of statistics, economic theory and marketing. The authors followed the basic principle of statistical methodology – the requirement of a continuous combination of qualitative and quantitative analysis, where the research begins and ends with qualitative analysis. A most important section of statistical methodology is widely used – the construction of statistical indexes. In the course of the analysis, the method of statistical classifications is applied. A significant role in the present research is given to the method of generalization and the analogue method, recognizing that related terms should denote similar or almost identical content. Modeling of competition and competitiveness is widely used in the present research, which made it possible to develop a logical model of competition following from the competition theory. Results. Based on the survey of definitions, the analysis of the subject matter of global

  5. A Methodology for Safety Culture Impact Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-05-15

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index has been developed and applied to the reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving the safety of nuclear power plants. Safety culture is defined as the fundamental attitudes and behaviors of the plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including the Fukushima Daiichi accident in 2011 and the Chernobyl accident in 1986, the safety of nuclear power plants has become a matter of wide interest. From the accident review reports, it can easily be seen that safety culture is important and is one of the dominant contributors to accidents. However, a methodology for assessing the impact of safety culture has not yet been established analytically. It is difficult to develop a methodology for assessing safety culture impact quantitatively.

  6. A Methodology for Safety Culture Impact Assessment

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2014-01-01

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index has been developed and applied to the reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving the safety of nuclear power plants. Safety culture is defined as the fundamental attitudes and behaviors of the plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including the Fukushima Daiichi accident in 2011 and the Chernobyl accident in 1986, the safety of nuclear power plants has become a matter of wide interest. From the accident review reports, it can easily be seen that safety culture is important and is one of the dominant contributors to accidents. However, a methodology for assessing the impact of safety culture has not yet been established analytically. It is difficult to develop a methodology for assessing safety culture impact quantitatively.

  7. Synthesis of methodology development and case studies

    OpenAIRE

    Roetter, R.P.; Keulen, van, H.; Laar, van, H.H.

    2000-01-01

    The Systems Research Network for Ecoregional Land Use Planning in Support of Natural Resource Management in Tropical Asia (SysNet) was financed under the Ecoregional Fund, administered by the International Service for National Agricultural Research (ISNAR). The objective of the project was to develop and evaluate methodologies and tools for land use analysis, and apply them at the subnational scale to support agricultural and environmental policy formulation. In the framework of this projec...

  8. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  9. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  10. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  11. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  12. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  13. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  14. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)
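    For readers unfamiliar with the formalism the record refers to, the central MIRD relation can be sketched as follows (standard notation of the MIRD schema; the detail of the lecture itself is not reproduced here):

```latex
\begin{equation}
  \bar{D}(r_T \leftarrow r_S) = \tilde{A}(r_S)\, S(r_T \leftarrow r_S),
  \qquad
  \tilde{A}(r_S) = \int_0^\infty A(r_S, t)\,\mathrm{d}t,
  \qquad
  S(r_T \leftarrow r_S) = \frac{\sum_i \Delta_i\, \phi_i(r_T \leftarrow r_S)}{m_{r_T}},
\end{equation}
% where A is the activity in source region r_S, \tilde{A} the cumulated activity,
% \Delta_i the mean energy emitted per decay for radiation type i, \phi_i the
% fraction of that energy absorbed in target region r_T, and m_{r_T} its mass.
```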

  15. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
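    As a minimal sketch of the first-order RSM step summarized above, the example below fits a local first-order polynomial to a hypothetical 2^2 factorial experiment and takes a step along the estimated steepest-ascent direction. The design points, responses and step size are assumptions for illustration, not part of the chapter.

```python
import numpy as np

# Hypothetical 2^2 factorial experiment around the current operating point,
# in coded units (-1, +1); y is the observed response (e.g. yield).
X_coded = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1], [0, 0]], float)
y       = np.array([ 62.0,     68.5,     65.0,     72.0,     66.8 ])

# Fit the local first-order model  y ~ b0 + b1*x1 + b2*x2  by least squares.
A = np.column_stack([np.ones(len(y)), X_coded])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b = coef[0], coef[1:]

# The steepest-ascent direction is proportional to the first-order coefficients.
direction = b / np.linalg.norm(b)
step = 0.5                                   # step size in coded units
next_point = X_coded[-1] + step * direction  # move from the center point

print(f"fitted model: y ~ {b0:.2f} + {b[0]:.2f}*x1 + {b[1]:.2f}*x2")
print(f"steepest-ascent step (coded units): {next_point.round(3)}")
```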

  16. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
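    The two quantities named in the record, cumulated activity and effective half-life, are related in the standard single-exponential case roughly as follows (a sketch in common notation, assuming instantaneous uptake; multi-exponential retention curves add one such term per component):

```latex
\begin{equation}
  \frac{1}{T_{\mathrm{eff}}} = \frac{1}{T_{\mathrm{phys}}} + \frac{1}{T_{\mathrm{biol}}},
  \qquad
  \tilde{A} = \int_0^\infty A_0\, e^{-\lambda_{\mathrm{eff}} t}\,\mathrm{d}t
            = \frac{A_0}{\lambda_{\mathrm{eff}}}
            \approx 1.443\, A_0\, T_{\mathrm{eff}},
\end{equation}
% with \lambda_{eff} = \ln 2 / T_{eff}; A_0 is the initial activity in the organ.
```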

  17. Methodological issues involved in conducting qualitative research ...

    African Journals Online (AJOL)

    The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses' experience of being directly involved with termination of pregnancies and developing guidelines for support for these nurses. The article points out the sensitivity and responsibility ...

  18. Nucleation in Polymers and Soft Matter

    Science.gov (United States)

    Xu, Xiaofei; Ting, Christina L.; Kusaka, Isamu; Wang, Zhen-Gang

    2014-04-01

    Nucleation is a ubiquitous phenomenon in many physical, chemical, and biological processes. In this review, we describe recent progress on the theoretical study of nucleation in polymeric fluids and soft matter, including binary mixtures (polymer blends, polymers in poor solvents, compressible polymer-small molecule mixtures), block copolymer melts, and lipid membranes. We discuss the methodological development for studying nucleation as well as novel insights and new physics obtained in the study of the nucleation behavior in these systems.

  19. Common Occupational Disability Tests and Case Law References: An Ontario MVA perspective on interpretation and best practice methodology supporting a holistic model, Part I of III (Pre-104 IRB).

    Science.gov (United States)

    Salmon, J Douglas; Gouws, Jacques J; Bachmann, Corina Anghel

    2016-05-01

    This three-part paper presents practical holistic models of determining impairment and occupational disability with respect to common "own occupation" and "any occupation" definitions. The models consider physical, emotional and cognitive impairments in unison, and draw upon case law support for empirically based functional assessment of secondary cognitive symptoms arising from psychological conditions, including chronic pain disorders. Case law is presented, primarily in the context of Ontario motor vehicle accident legislation, to demonstrate how triers of fact have addressed occupational disability in the context of chronic pain; and interpreted the "own occupation" and "any occupation" definitions. In interpreting the definitions of "own occupation" and "any occupation", courts have considered various concepts, such as: work as an integrated whole, competitive productivity, demonstrated job performance vs. employment, work adaptation relative to impairment stability, suitable work, retraining considerations, self-employment, and remuneration/socio-economic status. The first segment of the paper reviews the above concepts largely in the context of pre-104 Income Replacement Benefit (IRB) entitlement, while the second segment focuses on post-104 IRB entitlement. In the final segment, the paper presents a critical evaluation of computerized transferable skills analysis (TSAs) in the occupational disability context. By contrast, support is offered for the notion that (neuro) psychovocational assessments and situational work assessments should play a key role in "own occupation" disability determination, even where specific vocational rehabilitation/retraining recommendations are not requested by the referral source (e.g., insurer disability examination).

  20. Strategies for dark matter detection

    International Nuclear Information System (INIS)

    Silk, J.

    1988-01-01

    The present status of alternative forms of dark matter, both baryonic and nonbaryonic, is reviewed. Alternative arguments are presented for the predominance of either cold dark matter (CDM) or of baryonic dark matter (BDM). Strategies are described for dark matter detection, both for dark matter that consists of weakly interacting relic particles and for dark matter that consists of dark stellar remnants

  1. Stars of strange matter

    International Nuclear Information System (INIS)

    Bethe, H.A.; Brown, G.E.; Cooperstein, J.

    1987-01-01

    We investigate suggestions that quark matter with strangeness per baryon of order unity may be stable. We model this matter at nuclear matter densities as a gas of close packed Λ-particles. From the known mass of the Λ-particle we obtain an estimate of the energy and chemical potential of strange matter at nuclear densities. These are sufficiently high to preclude any phase transition from neutron matter to strange matter in the region near nucleon matter density. Including effects from gluon exchange phenomenologically, we investigate higher densities, consistently making approximations which underestimate the density of transition. In this way we find a transition density ρ_tr ≳ 7ρ_0, where ρ_0 is nuclear matter density. This is not far from the maximum density in the center of the most massive neutron stars that can be constructed. Since we have underestimated ρ_tr and still find it to be ≈7ρ_0, we do not believe that the transition from neutron to quark matter is likely in neutron stars. Moreover, measured masses of observed neutron stars are ≅1.4 M_sun, where M_sun is the solar mass. For such masses, the central (maximum) density ρ_c lies below this transition density, so a transition to quark matter is certainly excluded for these densities. (orig.)

  2. Hidden charged dark matter

    International Nuclear Information System (INIS)

    Feng, Jonathan L.; Kaplinghat, Manoj; Tu, Huitzu; Yu, Hai-Bo

    2009-01-01

    Can dark matter be stabilized by charge conservation, just as the electron is in the standard model? We examine the possibility that dark matter is hidden, that is, neutral under all standard model gauge interactions, but charged under an exact U(1) gauge symmetry of the hidden sector. Such candidates are predicted in WIMPless models, supersymmetric models in which hidden dark matter has the desired thermal relic density for a wide range of masses. Hidden charged dark matter has many novel properties not shared by neutral dark matter: (1) bound state formation and Sommerfeld-enhanced annihilation after chemical freeze out may reduce its relic density, (2) similar effects greatly enhance dark matter annihilation in protohalos at redshifts of z ∼ 30, (3) Compton scattering off hidden photons delays kinetic decoupling, suppressing small scale structure, and (4) Rutherford scattering makes such dark matter self-interacting and collisional, potentially impacting properties of the Bullet Cluster and the observed morphology of galactic halos. We analyze all of these effects in a WIMPless model in which the hidden sector is a simplified version of the minimal supersymmetric standard model and the dark matter is a hidden sector stau. We find that charged hidden dark matter is viable and consistent with the correct relic density for reasonable model parameters and dark matter masses in the range 1 GeV ≲ m_X ≲ 10 TeV. At the same time, in the preferred range of parameters, this model predicts cores in the dark matter halos of small galaxies and other halo properties that may be within the reach of future observations. These models therefore provide a viable and well-motivated framework for collisional dark matter with Sommerfeld enhancement, with novel implications for astrophysics and dark matter searches
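    For context on the Sommerfeld-enhanced annihilation invoked in the record above, the standard enhancement factor for an attractive Coulomb-like potential mediated by a massless hidden photon is sketched below. This generic textbook form is not taken from the paper itself, and conventions for whether v denotes the single-particle or relative velocity differ by a factor of two.

```latex
% Generic Coulomb-like Sommerfeld factor with hidden fine-structure constant
% \alpha_X and dark matter velocity v:
\begin{equation}
  S(v) \;=\; \frac{\pi \alpha_X / v}{1 - e^{-\pi \alpha_X / v}}
  \;\xrightarrow{\;v \ll \alpha_X\;}\; \frac{\pi \alpha_X}{v},
\end{equation}
% so the annihilation cross section \sigma v is multiplied by S(v), which grows
% at the low velocities relevant for protohalos at z ~ 30, as described above.
```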

  3. Metodología para la implementación del soporte nutricional enteral personalizado como alternativa de la nutrición enteral domiciliaria / Methodology for implementation of personalized enteral nutritional support as an alternative for enteral nutrition at home

    Directory of Open Access Journals (Sweden)

    Rafael Jiménez García

    2012-09-01

    Objective: to show a methodology for the implementation of personalized enteral nutritional support at home with centralized resources, as an alternative for home nutrition in pediatric patients. Methods: on the basis of the design of the methodology for pediatric enteral nutrition units, involving the performance of the hospital nutritional support groups, a methodology was designed that, through concrete actions, manages to integrate the clinical-medical and the management levels. Results: this methodology is based on the reciprocal integration of primary and secondary health care, in which the centralized control of resources allows both economizing them and organizing them according to the demands of the managing structures. At the same time, the methodological design creates a space for the education of parents and the systematic control of the nutritional support, all of which grants it a preventive connotation in line with the objectives of community-based medicine. Conclusions: the methodology submitted by our working group is a pediatric alternative for the development of home enteral nutrition, as a way of providing nutritional service, with more integration between the primary and the secondary health care levels.

  4. Dark matter in the universe

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. (Fermi National Accelerator Lab., Batavia, IL (USA) Chicago Univ., IL (USA). Enrico Fermi Inst.)

    1990-11-01

    What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: Most of the material in the Universe does not give off detectable radiation, i.e., is "dark"; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments--structure formation, the temporal Copernican principle, and inflation--and by some observational data. If Ω is indeed unity--or even just significantly greater than 0.1--then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well motivated particle dark-matter candidates: an axion of mass 10^-6 eV to 10^-4 eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 63 refs.

  5. Dark matter in the universe

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1990-11-01

    What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: Most of the material in the Universe does not give off detectable radiation, i.e., is "dark"; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments--structure formation, the temporal Copernican principle, and inflation--and by some observational data. If Ω is indeed unity--or even just significantly greater than 0.1--then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well motivated particle dark-matter candidates: an axion of mass 10⁻⁶ eV to 10⁻⁴ eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 63 refs

  6. Dark matter in the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. (Fermi National Accelerator Lab., Batavia, IL (USA) Chicago Univ., IL (USA). Enrico Fermi Inst.)

    1991-03-01

    What is the quantity and composition of material in the universe? This is one of the most fundamental questions we can ask about the universe, and its answer bears on a number of important issues including the formation of structure in the universe, and the ultimate fate and the earliest history of the universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: most of the material in the universe does not give off detectable radiation, i.e., is "dark"; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments -- structure formation, the temporal Copernican principle, and inflation -- and by some observational data. If Ω is indeed unity--or even just significantly greater than 0.1--then there is a strong case for a universe comprised of nonbaryonic matter. There are three well motivated particle dark-matter candidates: an axion of mass 10⁻⁶ eV to 10⁻⁴ eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 71 refs., 6 figs.

  7. Codecaying Dark Matter.

    Science.gov (United States)

    Dror, Jeff Asaf; Kuflik, Eric; Ng, Wee Hao

    2016-11-18

    We propose a new mechanism for thermal dark matter freeze-out, called codecaying dark matter. Multicomponent dark sectors with degenerate particles and out-of-equilibrium decays can codecay to obtain the observed relic density. The dark matter density is exponentially depleted through the decay of nearly degenerate particles rather than from Boltzmann suppression. The relic abundance is set by the dark matter annihilation cross section, which is predicted to be boosted, and the decay rate of the dark sector particles. The mechanism is viable in a broad range of dark matter parameter space, with a robust prediction of an enhanced indirect detection signal. Finally, we present a simple model that realizes codecaying dark matter.

  8. Diseases of white matter

    International Nuclear Information System (INIS)

    Holland, B.A.

    1987-01-01

    The diagnosis of white matter abnormalities was revolutionized by the advent of computed tomography (CT), which provided a noninvasive method of detection and assessment of progression of a variety of white matter processes. However, the inadequacies of CT were recognized early, including its relative insensitivity to small foci of abnormal myelin in the brain when correlated with autopsy findings and its inability to image directly white matter diseases of the spinal cord. Magnetic resonance imaging (MRI), on the other hand, sensitive to the slight difference in tissue composition of normal gray and white matter and to subtle increase in water content associated with myelin disorders, is uniquely suited for the examination of white matter pathology. Its clinical applications include the evaluation of the normal process of myelination in childhood and the various white matter diseases, including disorders of demyelination and dysmyelination

  9. A methodology for comprehensive strategic planning and program prioritization

    Science.gov (United States)

    Raczynski, Christopher Michael

    2008-10-01

    The process developed in this work, Strategy Optimization for the Allocation of Resources (SOAR), is a strategic planning methodology based on Integrated Product and Process Development and systems engineering techniques. Utilizing a top-down approach, the process starts with the creation of the organization vision and its measures of effectiveness. These measures are prioritized based on their application to external world scenarios which will frame the future. The programs which will be used to accomplish this vision are identified by decomposing the problem. Information is gathered on the programs as to their application, cost, schedule, risk, and other pertinent factors. The relationships between the levels of the hierarchy are mapped utilizing subject matter experts. These connections are then utilized to determine the overall benefit of the programs to the vision of the organization. Through a Multi-Objective Genetic Algorithm (MOGA), a tradespace of potential program portfolios can be created, among which the decision maker can allocate resources. The information and portfolios are presented to the decision maker through the use of a Decision Support System (DSS) which collects and visualizes all the data in a single location. This methodology was tested utilizing a science and technology planning exercise conducted by the United States Navy. A thorough decomposition was defined and technology programs identified which had the potential to provide benefit to the vision. The prioritization of the top-level capabilities was performed through the use of a rank-ordering scheme, and a previous naval application was used to demonstrate a cumulative voting scheme. Voting was performed utilizing the Nominal Group Technique to capture the relationships between the levels of the hierarchy. Interrelationships between the technologies were identified, a MOGA was utilized to optimize portfolios with respect to these constraints, and the information was placed in a DSS. This
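
    To make the portfolio-optimization step concrete, the sketch below evolves a population of program-selection masks and keeps the non-dominated (Pareto) set of portfolios that maximize benefit while minimizing cost under a budget cap. It is not the SOAR tooling: the program benefit/cost data, the budget, and the simple mutate-and-filter loop are hypothetical stand-ins for a full multi-objective genetic algorithm such as NSGA-II.

        import random

        # Hypothetical program data as (benefit score, cost); not from the Navy exercise.
        PROGRAMS = [(9, 4), (7, 3), (6, 5), (8, 6), (3, 1), (5, 2), (4, 2), (2, 1)]
        BUDGET = 12

        def evaluate(mask):
            """Return (total benefit, total cost) of the portfolio selected by mask."""
            benefit = sum(b for (b, c), on in zip(PROGRAMS, mask) if on)
            cost = sum(c for (b, c), on in zip(PROGRAMS, mask) if on)
            return benefit, cost

        def dominates(a, b):
            """Pareto dominance: higher benefit and lower cost are both preferred."""
            return a[0] >= b[0] and a[1] <= b[1] and a != b

        def pareto_front(population):
            scored = [(mask, evaluate(mask)) for mask in population]
            feasible = [(m, s) for m, s in scored if s[1] <= BUDGET]
            return [m for m, s in feasible
                    if not any(dominates(s2, s) for _, s2 in feasible)]

        def mutate(mask, rate=0.2):
            return tuple(bit ^ (random.random() < rate) for bit in mask)

        random.seed(1)
        population = [tuple(random.randint(0, 1) for _ in PROGRAMS) for _ in range(40)]
        for _ in range(50):                        # mutate, then keep the Pareto set
            population += [mutate(m) for m in population]
            population = pareto_front(population) or population[:40]

        for mask in population:                    # the surviving tradespace
            print(mask, evaluate(mask))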

  10. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques and core tenets, described through a wide range of settings.

  11. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  12. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  13. Clumpy cold dark matter

    Science.gov (United States)

    Silk, Joseph; Stebbins, Albert

    1993-01-01

    A study is conducted of cold dark matter (CDM) models in which clumpiness is inherent, using cosmic strings and textures suited to galaxy formation. CDM clumps with densities of about 10 million solar masses per cubic parsec are generated at roughly the redshift of matter-radiation equality, z(eq), with a sizable fraction surviving. Observable implications encompass dark matter cores in globular clusters and in galactic nuclei. Results from terrestrial dark matter detection experiments may be affected by clumpiness in the Galactic halo.

  14. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    In computer science, steganography is the science of hiding information within other data. This progress report on LSB (least significant bit) steganography methodology discusses the detection of LSB embedding, with reference to the paper by J. Fridrich, M. Goljan and R. Du titled "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...).
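
    For orientation, the LSB technique itself is simple: each message bit replaces the least significant bit of one cover sample, leaving the cover visually unchanged. A minimal sketch with made-up grayscale pixel values (real tools operate on image files and usually add keying and bit spreading):

        def embed(pixels, message):
            """Hide message bytes in the least significant bit of successive pixel bytes."""
            bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
            if len(bits) > len(pixels):
                raise ValueError("cover too small for this message")
            out = list(pixels)
            for i, bit in enumerate(bits):
                out[i] = (out[i] & 0xFE) | bit      # clear the LSB, then set it to the bit
            return out

        def extract(pixels, n_bytes):
            """Recover n_bytes previously embedded with embed()."""
            out = bytearray()
            for k in range(n_bytes):
                byte = 0
                for bit in pixels[8 * k: 8 * k + 8]:
                    byte = (byte << 1) | (bit & 1)
                out.append(byte)
            return bytes(out)

        cover = [120, 37, 255, 16, 200, 90, 64, 33] * 8     # stand-in pixel values
        stego = embed(cover, b"hidden")
        print(extract(stego, 6))                            # b'hidden'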

  15. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information, along with references, that supports the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing these data is required prior to their use in quality-affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data.

  16. An overview of IPSN research on the evolution of the natural systems in support of the French methodology for the safety evaluation of radwaste disposal in deep geological formations

    International Nuclear Information System (INIS)

    Escalier des Orres, P.; Granier, T.; Mohammadioun, B.

    1992-01-01

    Regulatory guidance has recently been established in France for the safety assessment of radwaste deep geological disposal: the present paper concerns the requirements related to bedrock stability issues and their technical background. This regulation relies in particular on a long-term effort of the Protection and Nuclear Safety Institute (IPSN) of the French Atomic Energy Commission (CEA), which has two main duties: it carries out research programs in the area of protection and nuclear safety and provides expertise to the safety authorities. It should be noted that ANDRA (the French National Radioactive Waste Management Agency) is responsible for the safety of radioactive waste management and relies, for safety demonstration purposes, on its research programs. IPSN, in support of the safety authorities, is in charge of the verification of the applicant's safety demonstration and develops its own research programs in order to achieve an independent capability in safety analysis expertise. We present here the major axes of the Institute's research program devoted to the assessment of the consequences of seismic events on the groundwater system. 19 refs., 8 figs

  17. An overview of IPSN research on the evolution of the natural systems in support of the French methodology for the safety evaluation of radwaste disposal in deep geological formations

    International Nuclear Information System (INIS)

    Escalier des Orres, P.; Granier, T.; Mohammadioun, B.

    1992-01-01

    Regulatory guidance has recently been established in France for the safety assessment of radwaste disposal in deep geological formations: the present paper concerns the requirements related to bedrock stability issues and their technical background. This regulation relies in particular on a long-term effort of the Protection and Nuclear Safety Institute (IPSN) of the French Atomic Energy Commission (CEA), which has two main duties: it carries out research programs in the area of protection and nuclear safety and provides expertise to the safety authorities. It should be noted that ANDRA (the French National Radioactive Waste Management Agency) is responsible for the safety of radioactive waste management and relies, for safety demonstration purposes, on its research programs. IPSN, in support of the safety authorities, is in charge of the verification of the applicant's safety demonstration and develops its own research programs in order to achieve an independent capability in safety analysis expertise. We present here the major axes of the Institute's research program devoted to the assessment of the consequences of seismic events on the groundwater system. 19 refs., 8 figs

  18. Hybrid Dark Matter

    OpenAIRE

    Chao, Wei

    2018-01-01

    Dark matter can be produced in the early universe via the freeze-in or freeze-out mechanisms. Both scenarios have been investigated in the literature, but the production of dark matter via a combination of these two mechanisms has not been addressed. In this paper we propose a hybrid dark matter model in which the dark matter has two components, one produced thermally and the other produced non-thermally. We present for the first time the analytical calculation for the relic abundance of th...

  19. The quark matter

    International Nuclear Information System (INIS)

    Rho, Mannque.

    1980-04-01

    The present status of our understanding of the physics of hadronic (nuclear or neutron) matter under extreme conditions, in particular at high densities, is discussed. This is a problem which challenges three disciplines of physics: nuclear physics, astrophysics and particle physics. It is generally believed that we now have a correct and perhaps ultimate theory of the strong interactions, namely quantum chromodynamics (QCD). The constituents of this theory are quarks and gluons, so highly dense matter should be describable in terms of these constituents alone. This question relates directly to the phenomenon of quark confinement, one of the least understood aspects of particle physics. For nuclear physics, the possibility of a phase change between nuclear matter and quark matter introduces entirely new degrees of freedom in the description of nuclei and will perhaps bring a deeper understanding of nuclear dynamics. In astrophysics, the properties of neutron stars will be properly understood only when the equation of state of 'neutron' matter at densities exceeding that of nuclear matter can be reliably calculated. Most fascinating is the possibility of quark stars existing in nature, not entirely an absurd idea. Finally, the quark matter - nuclear matter phase transition must have occurred in the early stage of the universe when matter expanded from high temperature and density; this could be an essential ingredient in big-bang cosmology.

  20. Soft matter physics

    CERN Document Server

    Doi, Masao

    2013-01-01

    Soft matter (polymers, colloids, surfactants and liquid crystals) is an important class of materials in modern technology. It also forms the basis of many future technologies, for example in medical and environmental applications. Soft matter shows complex behaviour between fluids and solids, and used to be a synonym for complex materials. Due to the developments of the past two decades, soft condensed matter can now be discussed on the same sound physical basis as solid condensed matter. The purpose of this book is to provide an overview of soft matter for undergraduate and graduate students.

  1. Searching for dark matter

    Science.gov (United States)

    Mateo, Mario

    1994-01-01

    Three teams of astronomers believe they have independently found evidence for dark matter in our galaxy. A brief history of the search for dark matter is presented. The use of microlensing-event observation for spotting dark matter is described. The equipment required to observe microlensing events and three groups working on dark matter detection are discussed. The three groups are the Massive Compact Halo Objects (MACHO) Project team, the Experience de Recherche d'Objets Sombres (EROS) team, and the Optical Gravitational Lensing Experiment (OGLE) team. The first apparent detections of microlensing events by the three teams are briefly reported.

  2. Quark matter and cosmology

    International Nuclear Information System (INIS)

    Schramm, D.N.; Fields, B.; Thomas, D.

    1992-01-01

    The possible implications of the quark-hadron transition for cosmology are explored. Possible surviving signatures are discussed. In particular, the possibility of generating a dark matter candidate such as strange nuggets or planetary-mass black holes is noted. Much discussion is devoted to the possible role of the transition for cosmological nucleosynthesis. It is emphasized that even an optimized first-order phase transition will not significantly alter the nucleosynthesis constraints on the cosmological baryon density nor on neutrino counting. However, it is noted that Be and B observations in old stars may eventually provide a signature of a cosmologically significant quark-hadron transition. It is pointed out that the critical point in this regard is whether the observed B/Be ratio can be produced by spallation processes or requires cosmological input. Spallation cannot produce a B/Be ratio below 7.6. A supporting signature would be Be and B ratios to oxygen that greatly exceed galactic values. At present, all data are still consistent with a spallogenic origin.

  3. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by the Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  4. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that will allow current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine; and on the development of new ROC methodology that will broaden the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project
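
    As a reminder of the core computation behind ROC methodology, the sketch below builds an empirical ROC curve from reader confidence ratings for diseased and normal cases and integrates the area under it with the trapezoidal rule. The rating data are invented for illustration and are not results from this project.

        def roc_points(scores_pos, scores_neg):
            """Empirical ROC curve: (false-positive rate, true-positive rate) per threshold."""
            thresholds = sorted(set(scores_pos + scores_neg), reverse=True)
            points = [(0.0, 0.0)]
            for t in thresholds:
                tpr = sum(s >= t for s in scores_pos) / len(scores_pos)
                fpr = sum(s >= t for s in scores_neg) / len(scores_neg)
                points.append((fpr, tpr))
            return points

        def auc(points):
            """Area under the ROC curve by the trapezoidal rule."""
            return sum((x2 - x1) * (y1 + y2) / 2
                       for (x1, y1), (x2, y2) in zip(points, points[1:]))

        diseased = [0.9, 0.8, 0.7, 0.6, 0.55]    # confidence ratings, diseased cases
        normal = [0.7, 0.5, 0.4, 0.3, 0.2]       # confidence ratings, normal cases
        print(f"AUC = {auc(roc_points(diseased, normal)):.2f}")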

  5. Methodology for the interactive graphic simulator construction

    International Nuclear Information System (INIS)

    Milian S, Idalmis; Rodriguez M, Lazaro; Lopez V, Miguel A.

    1997-01-01

    The PC-supported Interactive Graphic Simulators (IGS) have been used successfully in industrial training programs in many countries. This paper is intended to illustrate the general methodology applied by our research team for the construction of this kind of conceptual or small-scale simulator. The information and tools available to achieve this goal are also described. The applicability of the present methodology was confirmed with the construction of a set of IGS for nuclear power plant operator training programs in Cuba. One of them, relating to reactor kinetics, is shown and briefly described in this paper. (author). 11 refs., 3 figs

  6. White matter abnormalities in tuberous sclerosis complex

    Energy Technology Data Exchange (ETDEWEB)

    Griffiths, P.D. [Sheffield Univ. (United Kingdom). Academic Dept. of Radiology]; Bolton, P. [Cambridge Univ. (United Kingdom). Section of Developmental Psychiatry]; Verity, C. [Addenbrooke's NHS Trust, Cambridge (United Kingdom). Dept. of Paediatric Radiology]

    1998-09-01

    The aim of this study was to investigate and describe the range of white matter abnormalities in children with tuberous sclerosis complex by means of MR imaging. Material and Methods: A retrospective cross-sectional study was performed on the basis of MR imaging findings in 20 cases of tuberous sclerosis complex in children aged 17 years or younger. Results: White matter abnormalities were present in 19/20 (95%) cases of tuberous sclerosis complex. These were most frequently (19/20 cases) found in relation to cortical tubers in the supratentorial compartment. White matter abnormalities related to tubers were found in the cerebellum in 3/20 (15%) cases. White matter abnormalities described as radial migration lines were found in relation to 5 tubers in 3 (15%) children. In 4/20 (20%) cases, white matter abnormalities were found that were not related to cortical tubers. These areas had the appearance of white matter cysts in 3 cases and infarction in the fourth. In the latter case there was a definable event in the clinical history, supporting the diagnosis of stroke. Conclusion: A range of white matter abnormalities were found by MR imaging in tuberous sclerosis complex, the commonest being gliosis and hypomyelination related to cortical tubers. Radial migration lines were seen infrequently in relation to cortical tubers and these are thought to represent heterotopic glia and neurons along the expected path of cortical migration. (orig.)

  7. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  8. Moynihan: A Methodological Note

    Science.gov (United States)

    Swan, L. Alex

    1974-01-01

    Presents an analysis of the significant parts of the census data Moynihan used to support his argument, in order to determine whether the conclusions he reached are supported by such data; the analysis finds that he presents no data to substantiate his argument that black social problems are a function of family instability. (Author/JM)

  9. Dark matter and cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N.

    1992-03-01

    The cosmological dark matter problem is reviewed. The Big Bang Nucleosynthesis constraints on the baryon density are compared with the densities implied by visible matter, dark halos, dynamics of clusters, gravitational lenses, large-scale velocity flows, and the Ω = 1 flatness/inflation argument. It is shown that (1) the majority of baryons are dark; and (2) non-baryonic dark matter is probably required on large scales. It is also noted that halo dark matter could be either baryonic or non-baryonic. Discrimination between "cold" and "hot" non-baryonic candidates is shown to depend on the assumed "seeds" that stimulate structure formation. Gaussian density fluctuations, such as those induced by quantum fluctuations, favor cold dark matter, whereas topological defects such as strings, textures or domain walls may work equally or better with hot dark matter. A possible connection between cold dark matter, globular cluster ages and the Hubble constant is mentioned. Recent large-scale structure measurements, coupled with microwave anisotropy limits, are shown to raise some questions for the previously favored density fluctuation picture. Accelerator and underground limits on dark matter candidates are also reviewed.

  10. Dark matter and cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N.

    1992-03-01

    The cosmological dark matter problem is reviewed. The Big Bang Nucleosynthesis constraints on the baryon density are compared with the densities implied by visible matter, dark halos, dynamics of clusters, gravitational lenses, large-scale velocity flows, and the Ω = 1 flatness/inflation argument. It is shown that (1) the majority of baryons are dark; and (2) non-baryonic dark matter is probably required on large scales. It is also noted that halo dark matter could be either baryonic or non-baryonic. Discrimination between "cold" and "hot" non-baryonic candidates is shown to depend on the assumed "seeds" that stimulate structure formation. Gaussian density fluctuations, such as those induced by quantum fluctuations, favor cold dark matter, whereas topological defects such as strings, textures or domain walls may work equally or better with hot dark matter. A possible connection between cold dark matter, globular cluster ages and the Hubble constant is mentioned. Recent large-scale structure measurements, coupled with microwave anisotropy limits, are shown to raise some questions for the previously favored density fluctuation picture. Accelerator and underground limits on dark matter candidates are also reviewed.

  11. Superball dark matter

    CERN Document Server

    Kusenko, A

    1999-01-01

    Supersymmetric models predict a natural dark-matter candidate, stable baryonic Q-balls. They could be copiously produced in the early Universe as a by-product of the Affleck-Dine baryogenesis. I review the cosmological and astrophysical implications, methods of detection, and the present limits on this form of dark matter.

  12. Dark matter detection - II

    International Nuclear Information System (INIS)

    Zacek, Viktor

    2015-01-01

    The quest for the mysterious missing mass of the universe has become one of the big challenges of today's particle physics and cosmology. Astronomical observations show that only 1% of the matter of the universe is luminous. Moreover there is now convincing evidence that 85% of all gravitationally observable matter in the universe is of a new exotic kind, different from the 'ordinary' matter surrounding us. In a series of three lectures we discuss past, recent and future efforts made world-wide to detect and/or decipher the nature of Dark Matter. In Lecture I we review our present knowledge of the Dark Matter content of the Universe and how experimenters search for its candidates; In Lecture II we discuss so-called 'direct detection' techniques which allow one to search for scattering of galactic dark matter particles with detectors in deep-underground laboratories; we discuss the interpretation of experimental results and the challenges posed by different backgrounds; In Lecture III we take a look at the 'indirect detection' of the annihilation of dark matter candidates in astrophysical objects, such as our sun or the center of the Milky Way; In addition we will have a look at efforts to produce Dark Matter particles directly at accelerators and we shall close with a look at alternative nonparticle searches and future prospects. (author)

  13. Dark matter and cosmology

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1992-03-01

    The cosmological dark matter problem is reviewed. The Big Bang Nucleosynthesis constraints on the baryon density are compared with the densities implied by visible matter, dark halos, dynamics of clusters, gravitational lenses, large-scale velocity flows, and the Ω = 1 flatness/inflation argument. It is shown that (1) the majority of baryons are dark; and (2) non-baryonic dark matter is probably required on large scales. It is also noted that halo dark matter could be either baryonic or non-baryonic. Discrimination between "cold" and "hot" non-baryonic candidates is shown to depend on the assumed "seeds" that stimulate structure formation. Gaussian density fluctuations, such as those induced by quantum fluctuations, favor cold dark matter, whereas topological defects such as strings, textures or domain walls may work equally or better with hot dark matter. A possible connection between cold dark matter, globular cluster ages and the Hubble constant is mentioned. Recent large-scale structure measurements, coupled with microwave anisotropy limits, are shown to raise some questions for the previously favored density fluctuation picture. Accelerator and underground limits on dark matter candidates are also reviewed.

  14. Matter in transition

    International Nuclear Information System (INIS)

    Anderson, Lara B.; Gray, James; Raghuram, Nikhil; Taylor, Washington

    2016-01-01

    We explore a novel type of transition in certain 6D and 4D quantum field theories, in which the matter content of the theory changes while the gauge group and other parts of the spectrum remain invariant. Such transitions can occur, for example, for SU(6) and SU(7) gauge groups, where matter fields in a three-index antisymmetric representation and the fundamental representation are exchanged in the transition for matter in the two-index antisymmetric representation. These matter transitions are realized by passing through superconformal theories at the transition point. We explore these transitions in dual F-theory and heterotic descriptions, where a number of novel features arise. For example, in the heterotic description the relevant 6D SU(7) theories are described by bundles on K3 surfaces where the geometry of the K3 is constrained in addition to the bundle structure. On the F-theory side, non-standard representations such as the three-index antisymmetric representation of SU(N) require Weierstrass models that cannot be realized from the standard SU(N) Tate form. We also briefly describe some other situations, with groups such as Sp(3), SO(12), and SU(3), where analogous matter transitions can occur between different representations. For SU(3), in particular, we find a matter transition between adjoint matter and matter in the symmetric representation, giving an explicit Weierstrass model for the F-theory description of the symmetric representation that complements another recent analogous construction.

  15. CONFERENCE: Quark matter 88

    International Nuclear Information System (INIS)

    Jacob, Maurice

    1988-01-01

    The 'Quark Matter' Conference caters for physicists studying nuclear matter under extreme conditions. The hope is that relativistic (high energy) heavy ion collisions allow formation of the long-awaited quark-gluon plasma, where the inter-quark 'colour' force is no longer confined inside nucleon-like dimensions

  16. Dark matter detection - I

    International Nuclear Information System (INIS)

    Zacek, Viktor

    2015-01-01

    The quest for the mysterious missing mass of the universe has become one of the big challenges of today's particle physics and cosmology. Astronomical observations show that only 1% of the matter of the universe is luminous. Moreover there is now convincing evidence that 85% of all gravitationally observable matter in the universe is of a new exotic kind, different from the 'ordinary' matter surrounding us. In a series of three lectures we discuss past, recent and future efforts made world-wide to detect and/or decipher the nature of Dark Matter. In Lecture I we review our present knowledge of the Dark Matter content of the Universe and how experimenters search for its candidates; In Lecture II we discuss so-called 'direct detection' techniques which allow one to search for scattering of galactic dark matter particles with detectors in deep-underground laboratories; we discuss the interpretation of experimental results and the challenges posed by different backgrounds; In Lecture III we take a look at the 'indirect detection' of the annihilation of dark matter candidates in astrophysical objects, such as our sun or the center of the Milky Way; In addition we will have a look at efforts to produce Dark Matter particles directly at accelerators and we shall close with a look at alternative nonparticle searches and future prospects. (author)

  17. Dark matter detection - III

    International Nuclear Information System (INIS)

    Zacek, Viktor

    2015-01-01

    The quest for the missing mass of the universe has become one of the big challenges of today's particle physics and cosmology. Astronomical observations show that only 1% of the matter of the Universe is luminous. Moreover there is now convincing evidence that 85% of all gravitationally observable matter in the Universe is of a new exotic kind, different from the 'ordinary' matter surrounding us. In a series of three lectures we discuss past, recent and future efforts made world-wide to detect and/or decipher the nature of Dark Matter. In Lecture I we review our present knowledge of the Dark Matter content of the Universe and how experimenters search for its candidates; In Lecture II we discuss so-called 'direct detection' techniques which allow one to search for scattering of galactic dark matter particles with detectors in deep-underground laboratories; we discuss the interpretation of experimental results and the challenges posed by different backgrounds; In Lecture III we take a look at the 'indirect detection' of the annihilation of dark matter candidates in astrophysical objects, such as our sun or the center of the Milky Way; In addition we will have a look at efforts to produce Dark Matter particles directly at accelerators and we shall close with a look at alternative nonparticle searches and future prospects. (author)

  18. Asymptotically Safe Dark Matter

    DEFF Research Database (Denmark)

    Sannino, Francesco; Shoemaker, Ian M.

    2015-01-01

    We introduce a new paradigm for dark matter (DM) interactions in which the interaction strength is asymptotically safe. In models of this type, the coupling strength is small at low energies but increases at higher energies, and asymptotically approaches a finite constant value. The resulting...... searches are the primary ways to constrain or discover asymptotically safe dark matter....

  19. Asymmetric dark matter

    International Nuclear Information System (INIS)

    Kaplan, David E.; Luty, Markus A.; Zurek, Kathryn M.

    2009-01-01

    We consider a simple class of models in which the relic density of dark matter is determined by the baryon asymmetry of the Universe. In these models a B-L asymmetry generated at high temperatures is transferred to the dark matter, which is charged under B-L. The interactions that transfer the asymmetry decouple at temperatures above the dark matter mass, freezing in a dark matter asymmetry of order the baryon asymmetry. This explains the observed relation between the baryon and dark matter densities for the dark matter mass in the range 5-15 GeV. The symmetric component of the dark matter can annihilate efficiently to light pseudoscalar Higgs particles a or via t-channel exchange of new scalar doublets. The first possibility allows for h⁰ → aa decays, while the second predicts a light charged Higgs-like scalar decaying to τν. Direct detection can arise from Higgs exchange in the first model or a nonzero magnetic moment in the second. In supersymmetric models, the would-be lightest supersymmetric partner can decay into pairs of dark matter particles plus standard model particles, possibly with displaced vertices.
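
    The quoted 5-15 GeV window follows from the back-of-the-envelope scaling Ω_DM/Ω_b ≈ (m_DM/m_p)(n_DM/n_b). A minimal sketch of that arithmetic, assuming the transferred asymmetry leaves the dark matter number density within an O(1) factor of the baryon number density (the precise factor is model dependent):

        omega_dm_over_omega_b = 5.0    # approximate observed dark-matter-to-baryon ratio
        m_proton_gev = 0.938

        # Omega_DM / Omega_b ~ (m_DM / m_p) * (n_DM / n_b), so an O(1) number-density
        # ratio lands the dark matter mass in the few-to-15 GeV range quoted above.
        for ratio in (1.0, 0.5, 0.33):
            m_dm = omega_dm_over_omega_b * m_proton_gev / ratio
            print(f"n_DM/n_b = {ratio:.2f}  ->  m_DM ~ {m_dm:.1f} GeV")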

  20. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  1. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
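
    As a concrete instance of the kind of calculation the book systematizes, the sketch below gives the textbook sample size for estimating a normal mean with known standard deviation to within a chosen margin of error, n = (z σ / E)². The numbers are illustrative, not an example taken from the book.

        from math import ceil
        from statistics import NormalDist

        def sample_size_for_mean(sigma, margin, confidence=0.95):
            """Smallest n whose (1 - alpha) confidence interval half-width is <= margin,
            for a normal mean with known standard deviation sigma."""
            z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
            return ceil((z * sigma / margin) ** 2)

        # Example: sigma = 10, want the mean within +/- 2 units at 95% confidence.
        print(sample_size_for_mean(sigma=10, margin=2))   # 97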

  2. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  3. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of microphysics evolution and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of Nuclear Research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economical and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  4. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina)]; Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)]

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  5. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response.
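
    To make the Simplex idea concrete, the sketch below minimizes a toy two-parameter response function with SciPy's Nelder-Mead implementation. The quadratic "response" and its optimum at (0.6, 0.4) are placeholders, not a dose or beam model from the article.

        import numpy as np
        from scipy.optimize import minimize

        def response(weights):
            """Toy response surface for two beam weights, minimized at (0.6, 0.4)."""
            w1, w2 = weights
            return (w1 - 0.6) ** 2 + 2.0 * (w2 - 0.4) ** 2

        result = minimize(response, x0=np.array([0.2, 0.8]), method="Nelder-Mead")
        print("optimal beam weights:", result.x)     # approaches [0.6, 0.4]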

  6. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  7. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic, predictive and follow-up analysis of the impact of discharges on all the environmental compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, namely the choice of site and the detailed study of the selected site. Two examples of its application are developed: one at the site-selection level, in the case of marine sites, and one at the detailed-study level of the chosen site, in the case of a riverside site [fr

  8. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  9. Inelastic dark matter

    International Nuclear Information System (INIS)

    Smith, David; Weiner, Neal

    2001-01-01

    Many observations suggest that much of the matter of the universe is nonbaryonic. Recently, the DAMA NaI dark matter direct detection experiment reported an annual modulation in their event rate consistent with a WIMP relic. However, the Cryogenic Dark Matter Search (CDMS) Ge experiment excludes most of the region preferred by DAMA. We demonstrate that if the dark matter can only scatter by making a transition to a slightly heavier state (Δm∼100 keV), the experiments are no longer in conflict. Moreover, differences in the energy spectrum of nuclear recoil events could distinguish such a scenario from the standard WIMP scenario. Finally, we discuss the sneutrino as a candidate for inelastic dark matter in supersymmetric theories
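
    The kinematic point behind the reconciliation is that scattering into a state heavier by Δm requires a minimum dark matter velocity v_min = (m_N E_R/μ + Δm)/sqrt(2 m_N E_R), which rises sharply for lighter target nuclei. A minimal sketch with illustrative inputs (a 100 GeV WIMP on an iodine-mass nucleus and a 30 keV recoil; these are not numbers from the paper):

        import math

        C_KM_S = 2.998e5   # speed of light in km/s

        def v_min_km_s(E_R_keV, m_N_GeV, m_chi_GeV, delta_keV=0.0):
            """Minimum DM velocity able to deposit recoil energy E_R when the DM must
            up-scatter to a state heavier by delta (standard non-relativistic kinematics,
            natural units internally)."""
            E_R = E_R_keV * 1e-6            # keV -> GeV
            delta = delta_keV * 1e-6
            mu = m_N_GeV * m_chi_GeV / (m_N_GeV + m_chi_GeV)   # reduced mass
            return (m_N_GeV * E_R / mu + delta) / math.sqrt(2.0 * m_N_GeV * E_R) * C_KM_S

        # Illustrative numbers only: elastic scattering vs. a 100 keV mass splitting.
        print(f"elastic:   v_min ~ {v_min_km_s(30, 118.0, 100.0):.0f} km/s")
        print(f"inelastic: v_min ~ {v_min_km_s(30, 118.0, 100.0, delta_keV=100.0):.0f} km/s")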

  10. Baryonic dark matter

    International Nuclear Information System (INIS)

    Uson, Juan M.

    2000-01-01

    Many searches for baryonic dark matter have been conducted but, so far, all have been unsuccessful. Indeed, no more than 1% of the dark matter can be in the form of hydrogen burning stars. It has recently been suggested that most of the baryons in the universe are still in the form of ionized gas so that it is possible that there is no baryonic dark matter. Although it is likely that a significant fraction of the dark matter in the Milky Way is in a halo of non-baryonic matter, the data do not exclude the possibility that a considerable amount, perhaps most of it, could be in a tenuous halo of diffuse ionized gas

  11. Dark matter universe.

    Science.gov (United States)

    Bahcall, Neta A

    2015-10-06

    Most of the mass in the universe is in the form of dark matter--a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations--from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is "cold" (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology--a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass fluctuation amplitude and its scale dependence, and the age of the universe and of the first stars)--fits all the accumulated data remarkably well. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this yet mysterious particle.

  12. Methodology for Environmental Impact Assessment; Metodik foer miljoekonsekvensbedoemning

    Energy Technology Data Exchange (ETDEWEB)

    Malmlund, Anna (Structor Miljoebyraan Stockholm AB (Sweden))

    2010-12-15

    This report is an appendix to 'Environmental Impact Assessment Interim storage, encapsulation and disposal of spent nuclear fuel'. The appendix presents the methodology and criteria used in support investigations to conduct impact assessments.

  13. Present stage evaluation of Furnas calculus methodology qualification

    International Nuclear Information System (INIS)

    1987-07-01

    This technical note evaluates the present stage of the FURNAS calculus methodology qualification related to the reload licensing process and to licensing support for operational questions of the Angra 1 NPP, concerning the transient and core thermal-hydraulic areas. (Author) [pt

  14. Proposing C4ISR Architecture Methodology for Homeland Security

    National Research Council Canada - National Science Library

    Farah-Stapleton, Monica F; Dimarogonas, James; Eaton, Rodney; Deason, Paul J

    2004-01-01

    This presentation presents how a network architecture methodology developed for the Army's Future Force could be applied to the requirements of Civil Support, Homeland Security/Homeland Defense (CS HLS/HLD...

  15. Ratcheting Up The Search for Dark Matter

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Samuel Dylan [Univ. of Michigan, Ann Arbor, MI (United States)

    2014-01-01

    The last several years have included remarkable advances in two of the primary areas of fundamental particle physics: the search for dark matter and the discovery of the Higgs boson. This dissertation will highlight some contributions made on the forefront of these exciting fields. Although the circumstantial evidence supporting the dark matter hypothesis is now almost undeniably significant, indisputable direct proof is still lacking. As the direct searches for dark matter continue, we can maximize our prospects of discovery by using theoretical techniques complementary to the observational searches to rule out additional, otherwise accessible parameter space. In this dissertation, I report bounds on a wide range of dark matter theories. The models considered here cover the spectrum from the canonical case of self-conjugate dark matter with weak-scale interactions, to electrically charged dark matter, to non-annihilating, non-fermionic dark matter. These bounds are obtained from considerations of astrophysical and cosmological data, including, respectively: diffuse gamma ray photon observations; structure formation considerations, along with an explication of the novel local dark matter structure due to galactic astrophysics; and the existence of old pulsars in dark-matter-rich environments. I also consider the prospects for a model of neutrino dark matter which has been motivated by a wide set of seemingly contradictory experimental results. In addition, I include a study that provides the tools to begin solving the speculative "inverse" problem of extracting dark matter properties solely from hypothetical nuclear energy spectra, which we may face if dark matter is discovered with multiple direct detection experiments. In contrast to the null searches for dark matter, we have the example of the recent discovery of the Higgs boson. The Higgs boson is the first fundamental scalar particle ever observed, and precision measurements of the production and

  16. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  17. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  18. Review of indirect detection of dark matter with neutrinos

    Science.gov (United States)

    Danninger, Matthias

    2017-09-01

    Dark Matter could be detected indirectly through the observation of neutrinos produced in dark matter self-annihilations or decays. Searches for such neutrino signals have resulted in stringent constraints on the dark matter self-annihilation cross section and the scattering cross section with matter. In recent years these searches have made significant progress in sensitivity through new search methodologies, new detection channels, and through the availability of rich datasets from neutrino telescopes and detectors, like IceCube, ANTARES, Super-Kamiokande, etc. We review recent experimental results and put them in context with respect to other direct and indirect dark matter searches. We also discuss prospects for discoveries at current and next generation neutrino detectors.

  19. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
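
    As a sketch of what time correlation means in practice, the snippet below resamples two asynchronously triggered channels onto a common wall-clock grid so they can be plotted and compared point by point. The channel names, trigger times, and values are hypothetical; in the real system the native trigger information is preserved alongside the converted wall-clock times.

        import numpy as np

        # Hypothetical samples from two acquisition systems, each already converted
        # from its native trigger reference to wall-clock seconds.
        beam_current_t = np.array([0.00, 1.10, 2.05, 3.20, 4.15])
        beam_current_v = np.array([10.0, 10.2, 10.1, 9.8, 9.9])
        orbit_t = np.array([0.50, 1.50, 2.50, 3.50, 4.50])
        orbit_v = np.array([0.12, 0.15, 0.11, 0.10, 0.13])

        # Resample both channels onto a common wall-clock grid for plotting vs. time.
        t_common = np.arange(0.5, 4.2, 0.5)
        current_on_grid = np.interp(t_common, beam_current_t, beam_current_v)
        orbit_on_grid = np.interp(t_common, orbit_t, orbit_v)

        for t, a, b in zip(t_common, current_on_grid, orbit_on_grid):
            print(f"t = {t:4.1f} s   current = {a:5.2f}   orbit = {b:5.3f}")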

  20. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  1. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code must be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, would be utilized as a guide for the detailed performance analysis

  2. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)
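
    Since this record does not reproduce the RH equation itself, the following is only a schematic sketch of the ratio idea: a hazard score for a risk management activity is built from a set of factors and divided by the score of a fixed baseline. All factor names and values here are hypothetical.

        # Schematic illustration of a relative-hazard (RH) ratio; the real RH equation
        # and its factors are defined in the source document, not here.
        from math import prod

        def hazard_score(factors):
            """Combine hazard-relevant factors multiplicatively (illustrative choice only)."""
            return prod(factors.values())

        baseline = {"inventory": 1.0, "dispersibility": 1.0, "exposure_potential": 1.0}
        activity = {"inventory": 0.6, "dispersibility": 1.4, "exposure_potential": 0.5}

        rh_ratio = hazard_score(activity) / hazard_score(baseline)
        print(f"RH ratio relative to baseline: {rh_ratio:.2f}")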

  3. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  4. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, Scrum, and on a description of the methodology as used in a banking environment. Its main goal is to introduce the Scrum methodology, outline through a case study a real project placed in a bank and focused on software development, address the problems of the project, propose solutions to the addressed problems, and identify anomalies of Scrum in software development constrained by the banking environment...

  5. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is presenting in a self-contained paper some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find summarised (some) of the most important methodological issues. In particular, the focus is on some methodological prac...

  6. Nucleation of strange matter in dense stellar cores

    International Nuclear Information System (INIS)

    Horvath, J.E.; Benvenuto, O.G.; Vucetich, H.

    1992-01-01

    We investigate the nucleation of strange quark matter inside hot, dense nuclear matter. Applying Zel'dovich's kinetic theory of nucleation we find a lower limit of the temperature T for strange-matter bubbles to appear, which happens to be satisfied inside the Kelvin-Helmholtz cooling era of a compact star life but not much after it. Our bounds thus suggest that a prompt conversion could be achieved, giving support to earlier expectations for nonstandard type-II supernova scenarios

  7. Dark matter in the universe

    Science.gov (United States)

    Turner, Michael S.

    1991-01-01

    What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand. Most of the material in the Universe does not give off detectable radiation; it is dark. The dark matter associated with bright galaxies contributes somewhere between 10 and 30 percent of the critical density; baryonic matter contributes between 1.1 and 12 percent of the critical density. The case for the spatially flat, Einstein-de Sitter model is supported by three compelling theoretical arguments - structure formation, the temporal Copernican principle, and inflation - and by some observational data. If Omega is indeed unity, or even just significantly greater than 0.1, then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well motivated particle dark matter candidates: an axion of mass 10^-6 eV to 10^-4 eV; a neutrino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either planned or are underway.

  8. Exothermic dark matter

    International Nuclear Information System (INIS)

    Graham, Peter W.; Saraswat, Prashant; Harnik, Roni; Rajendran, Surjeet

    2010-01-01

    We propose a novel mechanism for dark matter to explain the observed annual modulation signal at DAMA/LIBRA which avoids existing constraints from every other dark matter direct detection experiment including CRESST, CDMS, and XENON10. The dark matter consists of at least two light states with mass ∼few GeV and splittings ∼5 keV. It is natural for the heavier states to be cosmologically long-lived and to make up an O(1) fraction of the dark matter. Direct detection rates are dominated by the exothermic reactions in which an excited dark matter state downscatters off of a nucleus, becoming a lower energy state. In contrast to (endothermic) inelastic dark matter, the most sensitive experiments for exothermic dark matter are those with light nuclei and low threshold energies. Interestingly, this model can also naturally account for the observed low-energy events at CoGeNT. The only significant constraint on the model arises from the DAMA/LIBRA unmodulated spectrum but it can be tested in the near future by a low-threshold analysis of CDMS-Si and possibly other experiments including CRESST, COUPP, and XENON100.
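
    For orientation, the kinematic statement behind the preference for light nuclei and low thresholds can be written with textbook inelastic-scattering kinematics (a standard formula, not one quoted from the paper): the minimum dark matter velocity needed to produce a nuclear recoil of energy E_R on a nucleus of mass m_N, with dark-matter-nucleus reduced mass mu and mass splitting delta (negative for exothermic downscattering), is

        \[ v_{\min}(E_R) = \frac{1}{\sqrt{2\, m_N E_R}} \,\Big| \frac{m_N E_R}{\mu} + \delta \Big| , \qquad \delta < 0 \ \text{(exothermic)}, \]

    so a negative splitting shifts the accessible recoils toward lower energies.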

  9. Evolution of aging assessment methodologies

    International Nuclear Information System (INIS)

    McCrea, L.; Dam, R.; Gold, R.

    2011-01-01

    Under the influence of organizations like the IAEA and INPO the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error, employing a systematic, well defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results, and in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS(TM) software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how the utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge-base and experience in aging assessment to get it right the first time. (author)

  10. Evolution of aging assessment methodologies

    Energy Technology Data Exchange (ETDEWEB)

    McCrea, L.; Dam, R.; Gold, R. [Candu Energy Inc., Mississauga, Ontario (Canada)

    2011-07-01

    Under the influence of organizations like the IAEA and INPO the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error, employing a systematic, well defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results, and in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS(TM) software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how the utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge-base and experience in aging assessment to get it right the first time. (author)

  11. Dark matter universe

    Science.gov (United States)

    Bahcall, Neta A.

    2015-01-01

    Most of the mass in the universe is in the form of dark matter—a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations—from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is “cold” (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology—a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass fluctuations amplitude and its scale dependence, and the age of the universe and of the first stars)—fits remarkably well all the accumulated data. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this yet mysterious particle. PMID:26417091

  12. Different Kinds of Matter(s)

    DEFF Research Database (Denmark)

    Rosfort, René

    2012-01-01

    This article questions the methodological conflation at work in Karen Barad's agential realism. Barad's immense appeal is first explained against the tense background of the nature/culture antagonism in the twentieth century. Then, by using some of the penetrating observations of a seventeen...

  13. Physics of condensed matter

    CERN Document Server

    Misra, Prasanta K

    2012-01-01

    Physics of Condensed Matter is designed for a two-semester graduate course on condensed matter physics for students in physics and materials science. While the book offers fundamental ideas and topic areas of condensed matter physics, it also includes many recent topics of interest on which graduate students may choose to do further research. The text can also be used as a one-semester course for advanced undergraduate majors in physics, materials science, solid state chemistry, and electrical engineering, because it offers a breadth of topics applicable to these majors. The book be

  14. Light, Matter, and Geometry

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall

    Interaction of light and matter produces the appearance of materials. To deal with the immense complexity of nature, light and matter are modelled at a macroscopic level in computer graphics. This work is the first to provide the link between the microscopic physical theories of light and matter...... of a material and determine the contents of the material. The book is in four parts. Part I provides the link between microscopic and macroscopic theories of light. Part II describes how to use the properties of microscopic particles to compute the macroscopic properties of materials. Part III illustrates...

  15. QED coherence in matter

    CERN Document Server

    Preparata, Giuliano

    1995-01-01

    Up until now the dominant view of condensed matter physics has been that of an "electrostatic MECCANO" (erector set, for Americans). This book is the first systematic attempt to consider the full quantum-electrodynamical interaction (QED), thus greatly enriching the possible dynamical mechanisms that operate in the construction of the wonderful variety of condensed matter systems, including life itself.A new paradigm is emerging, replacing the "electrostatic MECCANO" with an "electrodynamic NETWORK," which builds condensed matter through the long range (as opposed to the "short range" nature o

  16. Nuclear matter revisited

    International Nuclear Information System (INIS)

    Negele, J.W.; Zabolitzky, J.G.

    1978-01-01

    It is stated that at the Workshop on Nuclear and Dense Matter held at the University of Illinois in May 1977 significant progress was reported that largely resolves many of the questions raised in this journal Vol. 6, p.149, 1976. These include perturbative versus variational methods as applied to nuclear matter, exact solutions for bosons, what is known as the fermion 'homework problem', and various other considerations regarding nuclear matter, including the use of variational methods as opposed to perturbation theory. (15 references) (U.K.)

  17. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  18. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on developing rapid techniques of data collection, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are not as strict in the selection of evaluators, whose mechanism is easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  19. Proceedings of Arcom Doctoral Workshop Research Methodology

    OpenAIRE

    Scott, Lloyd

    2018-01-01

    Editorial Welcome to this special doctoral workshop on Research Methodology, which forms part of what is now a well-established support mechanism for researchers in the discipline of the Built Environment and, more particularly, construction management. The ARCOM doctoral series, around now for some seventeen years, has addressed many of the diverse research areas that PhD researchers in the discipline have chosen to focus on in their doctoral journey. This doctoral workshop has as ...

  20. Dark matter: the astrophysical case

    International Nuclear Information System (INIS)

    Silk, J.

    2012-01-01

    The identification of dark matter is one of the most urgent problems in cosmology. I describe the astrophysical case for dark matter, from both an observational and a theoretical perspective. This overview will therefore focus on the observational motivations rather than on the particle physics aspects of dark matter or constraints on specific dark matter candidates. First, however, I summarize the astronomical evidence for dark matter, then I highlight the weaknesses of the standard cold dark matter model (LCDM) in providing a robust explanation of some observations. The greatest weakness in the dark matter saga is that we have not yet identified the nature of dark matter itself

  1. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  2. Structural health monitoring methodology for simply supported bridges: numerical implementation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Riveros Jerez

    2007-01-01

    Full Text Available Damage monitoring of civil structures is currently receiving great interest from researchers because of the large economic impact and the safety implications associated with early detection of structural damage. Current visual inspection techniques, which for the most part have been developed to detect structural damage at the local level, can be used together with a structural health monitoring system to inspect specific zones of a structure. This article presents a structural health monitoring methodology for simply supported bridges, divided into four levels: the first level addresses optimal sensor placement using the concept of the Fisher information matrix; the second and third levels address identification of the structural system based on ambient excitations; and, finally, the fourth level presents a probabilistic method that uses Bayes' theorem to detect structural damage. A finite element model of a scaled bridge is used to carry out this numerical implementation. The results show that the methodology proposed in this article could be implemented in the Medellín Metro system, since that system is composed of a series of simply supported bridges, which would facilitate and justify the implementation of damage monitoring systems for the entire Medellín Metro system.
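
    The sensor-placement level described above is commonly implemented with an effective-independence scheme that greedily removes candidate locations while preserving the determinant of the Fisher information matrix built from the mode shapes. The sketch below is a generic version of that idea with randomly generated mode-shape data; it is not the authors' code, and all sizes and names are assumptions.

        # Generic effective-independence sensor placement sketch (assumed data),
        # illustrating the Fisher-information-matrix idea used at level one.
        import numpy as np

        rng = np.random.default_rng(0)
        n_candidates, n_modes, n_sensors = 30, 4, 8
        phi = rng.normal(size=(n_candidates, n_modes))  # mode shapes at candidate DOFs

        keep = list(range(n_candidates))
        while len(keep) > n_sensors:
            phi_k = phi[keep]
            # Fisher information matrix of the retained candidate set.
            fim = phi_k.T @ phi_k
            # Effective independence of each candidate: diagonal of the projector
            # phi (phi^T phi)^-1 phi^T; the lowest value contributes least to det(FIM).
            ed = np.einsum("ij,jk,ik->i", phi_k, np.linalg.inv(fim), phi_k)
            keep.pop(int(np.argmin(ed)))

        print("selected sensor locations:", sorted(keep))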

  3. Methodological Reflections: Designing and Understanding Computer-Supported Collaborative Learning

    Science.gov (United States)

    Hamalainen, Raija

    2012-01-01

    Learning involves more than just a small group of participants, which makes designing and managing collaborative learning processes in higher education a challenging task. As a result, emerging concerns in current research have pointed increasingly to teacher orchestrated learning processes in naturalistic learning settings. In line with this…

  4. Resource Allocation Methodology to Support Mission Area Analysis

    National Research Council Canada - National Science Library

    Parr, John

    1994-01-01

    .... The MAA program envisioned will examine, among other things, potential force structure and modernization trade-offs that are essential to the formulation of an affordable long-term plan for defense resource allocation...

  5. Methodological Support for Service-oriented Design with ISDL

    NARCIS (Netherlands)

    Quartel, Dick; Dijkman, R.M.; van Sinderen, Marten J.

    2004-01-01

    Currently, service-oriented computing is mainly technology-driven. Most developments focus on the technology that enables enterprises to describe, publish and compose application services, and to communicate with applications of other enterprises according to their service descriptions. In this

  6. Methodological Support to Develop Interoperable Applications for Pervasive Healthcare

    NARCIS (Netherlands)

    Cardoso de Moraes, J.L.

    2014-01-01

    The healthcare model currently being used in most countries will soon be inadequate, due to the increasing care costs of a growing population of elderly people, the rapid increase of chronic diseases, the growing demand for new treatments and technologies, and the relative decrease in the number of

  7. A Classification Methodology and Retrieval Model to Support Software Reuse

    Science.gov (United States)

    1988-01-01

    Dewey Decimal Classification (DDC 18), an enumerative scheme, occupies 40 pages [Buchanan 1979]. Langridge [1973] states that the facets listed in the...sense of historical importance or widespread use. The schemes are: Dewey Decimal Classification (DDC), Universal Decimal Classification (UDC), ...

  8. Military Assistance for Special Events: Planning Requirements and Support Methodology

    National Research Council Canada - National Science Library

    White, James

    1999-01-01

    .... These events were celebrations and usually conducted with little or no disruption. In 1972 the world of special events was forever changed by the terrorist attack on the Israeli athletes in Munich during the Summer Olympic Games...

  9. Mixed Methodology to Predict Social Meaning for Decision Support

    Science.gov (United States)

    2013-09-01

    centered on the appearance of advertisements for pornography on the HoodUp Web site. As with the informational threads, gang affiliation carries...

  10. Methodological spot of establishing silt deposit concentration in Serbian rivers

    Directory of Open Access Journals (Sweden)

    Dragićević Slavoljub

    2007-01-01

    Full Text Available The recent methodology of sampling and establishing silt deposit concentration in Serbian rivers is associated with numerous deficiencies. Daily concentrations of this type of river deposit at most of the hydrological gauges were obtained on the basis of only one measurement, which raises the question of the representativeness of those samples. Taking deposit samples at a single point on the profile is somewhat problematic because of the dispersion of the obtained results. The choice of the sampling location is also a very important matter. Such analysis of the data may lead to serious errors in calculating the total carried deposit. For the above-mentioned reasons, we decided to take precise measurements of silt deposit concentration and to establish the methodological weak spots of the measurements. The results of these measurements are analyzed and presented in this paper.

  11. Matter-antimatter and matter-matter interactions at intermediate energies

    International Nuclear Information System (INIS)

    Santos, Antonio Carlos Fontes dos

    2002-01-01

    This article presents some of the recent experimental advances in the study of antimatter-matter and matter-matter interactions; some of the subtle differences observed have stimulated great theoretical effort to explain the experimentally observed results

  12. Little composite dark matter.

    Science.gov (United States)

    Balkin, Reuven; Perez, Gilad; Weiler, Andreas

    2018-01-01

    We examine the dark matter phenomenology of a composite electroweak singlet state. This singlet belongs to the Goldstone sector of a well-motivated extension of the Littlest Higgs with T-parity. A viable parameter space, consistent with the observed dark matter relic abundance as well as with the various collider, electroweak precision and dark matter direct detection experimental constraints is found for this scenario. T-parity implies a rich LHC phenomenology, which forms an interesting interplay between conventional natural SUSY type of signals involving third generation quarks and missing energy, from stop-like particle production and decay, and composite Higgs type of signals involving third generation quarks associated with Higgs and electroweak gauge boson, from vector-like top-partners production and decay. The composite features of the dark matter phenomenology allow the composite singlet to produce the correct relic abundance while interacting weakly with the Higgs via the usual Higgs portal coupling, thus evading direct detection.

  13. Inflatable Dark Matter.

    Science.gov (United States)

    Davoudiasl, Hooman; Hooper, Dan; McDermott, Samuel D

    2016-01-22

    We describe a general scenario, dubbed "inflatable dark matter," in which the density of dark matter particles can be reduced through a short period of late-time inflation in the early Universe. The overproduction of dark matter that is predicted within many, otherwise, well-motivated models of new physics can be elegantly remedied within this context. Thermal relics that would, otherwise, be disfavored can easily be accommodated within this class of scenarios, including dark matter candidates that are very heavy or very light. Furthermore, the nonthermal abundance of grand unified theory or Planck scale axions can be brought to acceptable levels without invoking anthropic tuning of initial conditions. A period of late-time inflation could have occurred over a wide range of scales from ∼MeV to the weak scale or above, and could have been triggered by physics within a hidden sector, with small but not necessarily negligible couplings to the standard model.

  14. Dark matter search

    International Nuclear Information System (INIS)

    Bernabei, R.

    2003-01-01

    Some general arguments on the particle Dark Matter search are addressed. The WIMP direct detection technique is mainly considered and recent results obtained by exploiting the annual modulation signature are summarized. (author)

  15. Mind Over Matter: Methamphetamine

    Science.gov (United States)

    Mind Over Matter: Methamphetamine (Meth) teaching guide and series, available in English and Spanish. It describes the brain's response to methamphetamine, including effects such as paranoia, aggressiveness, and hallucinations.

  16. Matter Tracking Information System -

    Data.gov (United States)

    Department of Transportation — The Matter Tracking Information System (MTIS) principle function is to streamline and integrate the workload and work activity generated or addressed by our 300 plus...

  17. Lectures on dark matter

    International Nuclear Information System (INIS)

    Seljak, U.

    2001-01-01

    These lectures concentrate on evolution and generation of dark matter perturbations. The purpose of the lectures is to present, in a systematic way, a comprehensive review of the cosmological parameters that can lead to observable effects in the dark matter clustering properties. We begin by reviewing the relativistic linear perturbation theory formalism. We discuss the gauge issue and derive Einstein's and continuity equations for several popular gauge choices. We continue by developing fluid equations for cold dark matter and baryons and Boltzmann equations for photons, massive and massless neutrinos. We then discuss the generation of initial perturbations by the process of inflation and the parameters of that process that can be extracted from the observations. Finally we discuss evolution of perturbations in various regimes and the imprint of the evolution on the dark matter power spectrum both in the linear and in the nonlinear regime. (author)
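
    As a minimal example of the kind of evolution equation meant here (a standard linear-theory result, not a formula quoted verbatim from the lectures), sub-horizon cold dark matter density perturbations delta_c in the matter era obey, in conformal time with \mathcal{H} = a'/a,

        \[ \delta_c'' + \mathcal{H}\,\delta_c' - 4\pi G a^2 \bar{\rho}_m\, \delta_c = 0 , \]

    whose growing mode delta_c proportional to a underlies the shape of the linear dark matter power spectrum.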

  18. Prevention Research Matters

    Centers for Disease Control (CDC) Podcasts

    Prevention Research Matters is a series of one-on-one interviews with researchers from 26 university prevention research centers across the country. Their work focuses on preventing and controlling chronic diseases like obesity, cancer, and heart disease.

  19. Dynamics of interstellar matter

    International Nuclear Information System (INIS)

    Kahn, F.D.

    1975-01-01

    A review of the dynamics of interstellar matter is presented, considering the basic equations of fluid flow, plane waves, shock waves, spiral structure, thermal instabilities and early star cocoons. (B.R.H.)

  20. Lectures on dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Seljak, U [Department of Physics, Princeton University, Princeton, NJ (United States)

    2001-11-15

    These lectures concentrate on evolution and generation of dark matter perturbations. The purpose of the lectures is to present, in a systematic way, a comprehensive review of the cosmological parameters that can lead to observable effects in the dark matter clustering properties. We begin by reviewing the relativistic linear perturbation theory formalism. We discuss the gauge issue and derive Einstein's and continuity equations for several popular gauge choices. We continue by developing fluid equations for cold dark matter and baryons and Boltzmann equations for photons, massive and massless neutrinos. We then discuss the generation of initial perturbations by the process of inflation and the parameters of that process that can be extracted from the observations. Finally we discuss evolution of perturbations in various regimes and the imprint of the evolution on the dark matter power spectrum both in the linear and in the nonlinear regime. (author)

  1. Dark matter search

    Energy Technology Data Exchange (ETDEWEB)

    Bernabei, R [Dipto. di Fisica, Universita di Roma ' Tor Vergata' and INFN, sez. Roma2, Rome (Italy)

    2003-08-15

    Some general arguments on the particle Dark Matter search are addressed. The WIMP direct detection technique is mainly considered and recent results obtained by exploiting the annual modulation signature are summarized. (author)

  2. Governance matters II - updated indicators for 2000-01

    OpenAIRE

    Kaufmann, Daniel; Kraay, Aart; Zoido-Lobaton, Pablo

    2002-01-01

    The authors construct aggregate governance indicators for six dimensions of governance, covering 175 countries in 2000-01. They apply the methodology developed in Kaufmann, Kraay, and Zoido-Lobaton ("Aggregating Governance Indicators", Policy Research Working Paper 2195, and "Governance Matters", Policy Research Working Paper 2196, October 1999) to newly available data at governance indica...

  3. Soft Active Matter

    OpenAIRE

    Marchetti, M. C.; Joanny, J. -F.; Ramaswamy, S.; Liverpool, T. B.; Prost, J.; Rao, Madan; Simha, R. Aditi

    2012-01-01

    In this review we summarize theoretical progress in the field of active matter, placing it in the context of recent experiments. Our approach offers a unified framework for the mechanical and statistical properties of living matter: biofilaments and molecular motors in vitro or in vivo, collections of motile microorganisms, animal flocks, and chemical or mechanical imitations. A major goal of the review is to integrate the several approaches proposed in the literature, from semi-microscopic t...

  4. DARK MATTER: Optical shears

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Evidence for dark matter continues to build up. Last year (December 1993, page 4) excitement rose when the French EROS (Experience de Recherche d'Objets Sombres) and the US/Australia MACHO collaborations reported hints that small inert 'brown dwarf' stars could provide some of the Universe's missing matter. In the 1930s, astronomers first began to suspect that there is a lot more to the Universe than meets the eye

  5. Charming dark matter

    Science.gov (United States)

    Jubb, Thomas; Kirk, Matthew; Lenz, Alexander

    2017-12-01

    We have considered a model of Dark Minimal Flavour Violation (DMFV), in which a triplet of dark matter particles couple to right-handed up-type quarks via a heavy colour-charged scalar mediator. By studying a large spectrum of possible constraints, and assessing the entire parameter space using a Markov Chain Monte Carlo (MCMC), we can place strong restrictions on the allowed parameter space for dark matter models of this type.
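
    As a generic illustration of the MCMC assessment mentioned above (a plain Metropolis-Hastings walker over a toy two-parameter likelihood, with no connection to the actual DMFV constraint set), one might write:

        # Toy Metropolis-Hastings scan of a two-parameter space; the log-likelihood
        # is a stand-in, not the combined constraints used in the paper.
        import numpy as np

        def log_like(theta):
            # Hypothetical Gaussian "constraint" centred on (1.0, 0.5).
            return -0.5 * (((theta[0] - 1.0) / 0.2) ** 2 + ((theta[1] - 0.5) / 0.1) ** 2)

        rng = np.random.default_rng(1)
        theta = np.array([0.0, 0.0])
        chain = []
        for _ in range(20000):
            proposal = theta + rng.normal(scale=0.05, size=2)
            if np.log(rng.uniform()) < log_like(proposal) - log_like(theta):
                theta = proposal                  # accept the proposed point
            chain.append(theta.copy())

        chain = np.array(chain)[5000:]            # drop burn-in
        print("posterior mean:", chain.mean(axis=0))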

  6. Dynamics of Soft Matter

    CERN Document Server

    García Sakai, Victoria; Chen, Sow-Hsin

    2012-01-01

    Dynamics of Soft Matter: Neutron Applications provides an overview of neutron scattering techniques that measure temporal and spatial correlations simultaneously, at the microscopic and/or mesoscopic scale. These techniques offer answers to new questions arising at the interface of physics, chemistry, and biology. Knowledge of the dynamics at these levels is crucial to understanding the soft matter field, which includes colloids, polymers, membranes, biological macromolecules, foams, emulsions towards biological & biomimetic systems, and phenomena involving wetting, friction, adhesion, or micr

  7. Matter-antimatter Cosmology

    Science.gov (United States)

    Omnes, R.

    1973-01-01

    The possible existence of antimatter on a large scale in the universe is evaluated. As a starting point, an attempt was made to understand the origin of matter as being essentially analogous to the origin of background thermal radiation. Several theories and models are examined, with particular emphasis on nucleon-antinucleon interactions at intermediate energies. Data also cover annihilation interactions at the matter-antimatter boundary that produce the essential fluid motion known as coalescence.

  8. Matter and cosmology

    International Nuclear Information System (INIS)

    Effenberger, R.

    1974-09-01

    The author summarizes some of the many questions and answers which have been raised over the years regarding the nature of matter, the origin of its forms and the associated concept of cosmology including the formation of the universe, our place in it and its course of evolution. An examination of the development of the classical concept of matter and its subsequent transformations within the space-time fields of relativity and quantum theory is also presented

  9. Dark matter: Theoretical perspectives

    International Nuclear Information System (INIS)

    Turner, M.S.

    1993-01-01

    The author both reviews and makes the case for the current theoretical prejudice: a flat Universe whose dominant constituent is nonbaryonic dark matter, emphasizing that this is still a prejudice and not yet fact. The theoretical motivation for nonbaryonic dark matter is discussed in the context of current elementary-particle theory, stressing that (i) there are no dark-matter candidates within the "standard model" of particle physics, (ii) there are several compelling candidates within attractive extensions of the standard model of particle physics, and (iii) the motivation for these compelling candidates comes first and foremost from particle physics. The dark-matter problem is now a pressing issue in both cosmology and particle physics, and the detection of particle dark matter would provide evidence for "new physics." The compelling candidates are a very light axion (10^-6--10^-4 eV), a light neutrino (20--90 eV), and a heavy neutralino (10 GeV--2 TeV). The production of these particles in the early Universe and the prospects for their detection are also discussed. The author briefly mentions more exotic possibilities for the dark matter, including a nonzero cosmological constant, superheavy magnetic monopoles, and decaying neutrinos. 119 refs

  10. Dark matter: Theoretical perspectives

    International Nuclear Information System (INIS)

    Turner, M.S.

    1993-01-01

    I both review and make the case for the current theoretical prejudice: a flat Universe whose dominant constituent is nonbaryonic dark matter, emphasizing that this is still a prejudice and not yet fact. The theoretical motivation for nonbaryonic dark matter is discussed in the context of current elementary-particle theory, stressing that: (1) there are no dark matter candidates within the standard model of particle physics; (2) there are several compelling candidates within attractive extensions of the standard model of particle physics; and (3) the motivation for these compelling candidates comes first and foremost from particle physics. The dark-matter problem is now a pressing issue in both cosmology and particle physics, and the detection of particle dark matter would provide evidence for "new physics." The compelling candidates are: a very light axion (10^-6 eV--10^-4 eV); a light neutrino (20 eV--90 eV); and a heavy neutralino (10 GeV--2 TeV). The production of these particles in the early Universe and the prospects for their detection are also discussed. I briefly mention more exotic possibilities for the dark matter, including a nonzero cosmological constant, superheavy magnetic monopoles, and decaying neutrinos

  11. Soil organic matter

    International Nuclear Information System (INIS)

    1976-01-01

    The nature, content and behaviour of the organic matter, or humus, in soil are factors of fundamental importance for soil productivity and the development of optimum conditions for growth of crops under diverse temperate, tropical and arid climatic conditions. In the recent symposium on soil organic matter studies - as in the two preceding ones in 1963 and 1969 - due consideration was given to studies involving the use of radioactive and stable isotopes. However, the latest symposium was a departure from previous efforts in that non-isotopic approaches to research on soil organic matter were included. A number of papers dealt with the behaviour and functions of organic matter and suggested improved management practices, the use of which would contribute to increasing agricultural production. Other papers discussed the turnover of plant residues, the release of plant nutrients through the biodegradation of organic compounds, the nitrogen economy and the dynamics of transformation of organic forms of nitrogen. In addition, consideration was given to studies on the biochemical transformation of organic matter, characterization of humic acids, carbon-14 dating and the development of modern techniques and their impact on soil organic matter research

  12. Dark matter: Theoretical perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. (Chicago Univ., IL (United States). Enrico Fermi Inst. Fermi National Accelerator Lab., Batavia, IL (United States))

    1993-01-01

    I both review and make the case for the current theoretical prejudice: a flat Universe whose dominant constituent is nonbaryonic dark matter, emphasizing that this is still a prejudice and not yet fact. The theoretical motivation for nonbaryonic dark matter is discussed in the context of current elementary-particle theory, stressing that: (1) there are no dark matter candidates within the standard model of particle physics; (2) there are several compelling candidates within attractive extensions of the standard model of particle physics; and (3) the motivation for these compelling candidates comes first and foremost from particle physics. The dark-matter problem is now a pressing issue in both cosmology and particle physics, and the detection of particle dark matter would provide evidence for "new physics." The compelling candidates are: a very light axion (10^-6 eV--10^-4 eV); a light neutrino (20 eV--90 eV); and a heavy neutralino (10 GeV--2 TeV). The production of these particles in the early Universe and the prospects for their detection are also discussed. I briefly mention more exotic possibilities for the dark matter, including a nonzero cosmological constant, superheavy magnetic monopoles, and decaying neutrinos.

  13. Dark matter: Theoretical perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. [Chicago Univ., IL (United States). Enrico Fermi Inst.]|[Fermi National Accelerator Lab., Batavia, IL (United States)

    1993-01-01

    I both review and make the case for the current theoretical prejudice: a flat Universe whose dominant constituent is nonbaryonic dark matter, emphasizing that this is still a prejudice and not yet fact. The theoretical motivation for nonbaryonic dark matter is discussed in the context of current elementary-particle theory, stressing that: (1) there are no dark matter candidates within the standard model of particle physics; (2) there are several compelling candidates within attractive extensions of the standard model of particle physics; and (3) the motivation for these compelling candidates comes first and foremost from particle physics. The dark-matter problem is now a pressing issue in both cosmology and particle physics, and the detection of particle dark matter would provide evidence for "new physics." The compelling candidates are: a very light axion (10^-6 eV--10^-4 eV); a light neutrino (20 eV--90 eV); and a heavy neutralino (10 GeV--2 TeV). The production of these particles in the early Universe and the prospects for their detection are also discussed. I briefly mention more exotic possibilities for the dark matter, including a nonzero cosmological constant, superheavy magnetic monopoles, and decaying neutrinos.

  14. Spaceflight Effect on White Matter Structural Integrity

    Science.gov (United States)

    Lee, Jessica K.; Kopplemans, Vincent; Paternack, Ofer; Bloomberg, Jacob J.; Mulavara, Ajitkumar P.; Seidler, Rachael D.

    2017-01-01

    Recent reports of elevated brain white matter hyperintensity (WMH) counts and volume in postflight astronaut MRIs suggest that further examination of spaceflight's impact on the microstructure of brain white matter is warranted. To this end, retrospective longitudinal diffusion-weighted MRI scans obtained from 15 astronauts were evaluated. In light of the recent reports of microgravity-induced cephalad fluid shift and gray matter atrophy seen in astronauts, we applied a technique to estimate diffusion tensor imaging (DTI) metrics corrected for free water contamination. This approach enabled the analysis of white matter tissue-specific alterations that are unrelated to fluid shifts, occurring from before spaceflight to after landing. After spaceflight, decreased fractional anisotropy (FA) values were detected in an area encompassing the superior and inferior longitudinal fasciculi and the inferior fronto-occipital fasciculus. Increased radial diffusivity (RD) and decreased axial diffusivity (AD) were also detected within overlapping regions. In addition, FA values in the corticospinal tract decreased and RD measures in the precentral gyrus white matter increased from before to after flight. The results show disrupted structural connectivity of white matter in tracts involved in visuospatial processing, vestibular function, and movement control as a result of spaceflight. The findings may help us understand the structural underpinnings of the extensive spaceflight-induced sensorimotor remodeling. Prospective longitudinal assessment of the white matter integrity in astronauts is needed to characterize the evolution of white matter microstructural changes associated with spaceflight, their behavioral consequences, and the time course of recovery. Supported by a grant from the National Space Biomedical Research Institute, NASA NCC 9-58.
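
    The scalar DTI metrics named above are simple functions of the diffusion tensor eigenvalues; the sketch below shows their standard definitions (the free-water correction used in the study is a separate modelling step that is not reproduced here, and the example eigenvalues are illustrative).

        # Standard DTI scalar metrics from diffusion tensor eigenvalues (l1 >= l2 >= l3).
        import numpy as np

        def dti_metrics(l1, l2, l3):
            md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
            fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                         / (l1 ** 2 + l2 ** 2 + l3 ** 2))  # fractional anisotropy
            ad = l1                                        # axial diffusivity
            rd = (l2 + l3) / 2.0                           # radial diffusivity
            return fa, ad, rd, md

        # Example eigenvalues (in mm^2/s) of the sort seen in healthy white matter.
        print(dti_metrics(1.7e-3, 0.4e-3, 0.3e-3))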

  15. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  16. Particle Dark Matter and DAMA/LIBRA

    International Nuclear Information System (INIS)

    Bernabei, R.; Nozzoli, F.; Belli, P.; Cappella, F.; D'Angelo, A.; Prosperi, D.; Cerulli, R.; Dai, C. J.; He, H. L.; Ma, X. H.; Sheng, X. D.; Wang, R. G.; Incicchitti, A.; Montecchia, F.; Ye, Z. P.

    2010-01-01

    The DAMA/LIBRA set-up (about 250 kg highly radiopure NaI(Tl) sensitive mass) is running at the Gran Sasso National Laboratory of the I.N.F.N. The first DAMA/LIBRA results confirm the evidence for the presence of a Dark Matter particle component in the galactic halo, as pointed out by the former DAMA/NaI set-up; cumulatively the data support such evidence at 8.2 σ C.L. and satisfy all the many peculiarities of the Dark Matter annual modulation signature. The main aspects and prospects of this model independent experimental approach will be outlined.
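
    The annual modulation signature referred to above is conventionally parametrized (this is the standard convention, not a result quoted from DAMA/LIBRA) by writing the rate in the k-th energy bin as

        \[ S_k(t) = S_{0,k} + S_{m,k}\,\cos\!\big[\omega\,(t - t_0)\big], \qquad \omega = \frac{2\pi}{1\ \mathrm{yr}}, \quad t_0 \simeq 152.5\ \mathrm{d}\ (\text{about 2 June}), \]

    i.e. a constant part plus a cosine whose period and phase are fixed by the Earth's orbital motion around the Sun.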

  17. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
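
    A central quantity in that methodology is the effective dose which, in the standard ICRP formulation, weights the mean absorbed doses D_{T,R} in tissue T from radiation type R by radiation and tissue weighting factors:

        \[ E \;=\; \sum_T w_T \sum_R w_R\, D_{T,R} \;=\; \sum_T w_T\, H_T , \qquad \sum_T w_T = 1 , \]

    where H_T is the equivalent dose to tissue T.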

  18. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
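
    As a rough sketch of the MW-mile idea discussed in the paper (one common textbook variant, with made-up line data; the flows caused by a transaction would normally come from a power-flow calculation), a transaction is charged in proportion to the megawatt flow it causes on each line times that line's length and cost rate:

        # Generic MW-mile allocation sketch with hypothetical line data; the flows
        # attributed to the transaction would normally come from a power-flow study.
        lines = [
            # (line id, length in miles, annual cost per MW-mile, MW flow due to transaction)
            ("L1", 120.0, 80.0, 35.0),
            ("L2",  60.0, 95.0, 10.0),
            ("L3", 200.0, 70.0,  5.0),
        ]

        charge = sum(length * cost_per_mw_mile * abs(flow)
                     for _, length, cost_per_mw_mile, flow in lines)
        print(f"MW-mile charge for the transaction: ${charge:,.0f} per year")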

  19. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper attempts to briefly describe the analytical methodologies available and also highlight some of the challenges, expectations from nuclear material accounting and control (NUMAC) point of view

  20. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities aimed at enhancing the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept for an autonomous operation system in which the role of operators is replaced with artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered in order to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: (1) reliability of input data, (2) diversification of knowledge models, algorithms and reasoning schemes, and (3) mutual complement and robustness. The diagnostic modules, utilizing the different approaches defined along with the diversification strategy, were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, we confirmed that every abnormality could be precisely diagnosed by a mutual combination of modules. In other words, the legitimacy of the approach selected by the diversification strategy was shown, and methodology diversification proved clearly effective for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even when the scale of the encountered abnormality differs from the reference cases embedded in the knowledge base. (author)

  1. Essential methodological considerations when using grounded theory.

    Science.gov (United States)

    Achora, Susan; Matua, Gerald Amandu

    2016-07-01

    To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.

  2. The methodological defense of realism scrutinized.

    Science.gov (United States)

    Wray, K Brad

    2015-12-01

    I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology which could be applicable to establishing a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined, taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme. If there is, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. If there is not, we should pay attention to the development of any civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources, but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability) of these sources through cross-check examination. Eventually, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many powerful tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  4. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any are present in studied samples) and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  5. Imperfect Dark Matter

    Energy Technology Data Exchange (ETDEWEB)

    Mirzagholi, Leila; Vikman, Alexander, E-mail: l.mirzagholi@physik.uni-muenchen.de, E-mail: alexander.vikman@lmu.de [Arnold Sommerfeld Center for Theoretical Physics, Ludwig Maximilian University Munich, Theresienstr. 37, Munich, D-80333 Germany (Germany)

    2015-06-01

    We consider cosmology of the recently introduced mimetic matter with higher derivatives (HD). Without HD this system describes irrotational dust—Dark Matter (DM) as we see it on cosmologically large scales. DM particles correspond to the shift-charges—Noether charges of the shifts in the field space. Higher derivative corrections usually describe a deviation from thermodynamical equilibrium in relativistic hydrodynamics. Thus we show that mimetic matter with HD corresponds to an imperfect DM which: i) renormalises Newton's constant in the Friedmann equations, ii) has zero pressure when there is no extra matter in the universe, iii) survives the inflationary expansion, which puts the system on a dynamical attractor with a vanishing shift-charge, iv) perfectly tracks any external matter on this attractor, v) can become the main (and possibly the only) source of DM, provided the shift-symmetry in the HD terms is broken during some small time interval in the radiation-dominated epoch. In the second part of the paper we present a hydrodynamical description of general anisotropic and inhomogeneous configurations of the system. This imperfect mimetic fluid has an energy flow in the field's rest frame. We find that in the Eckart and in the Landau-Lifshitz frames the mimetic fluid possesses nonvanishing vorticity appearing already at first order in the HD. Thus, structure formation and gravitational collapse should proceed in a rather different fashion than in simple irrotational DM models.
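
    For context only (this is the construction as usually written in the mimetic dark matter literature, not text quoted from the record above): the physical metric is parametrised through an auxiliary metric and a scalar field,

      g_{\mu\nu} = \left(\tilde g^{\alpha\beta}\,\partial_\alpha\phi\,\partial_\beta\phi\right)\tilde g_{\mu\nu},
      \qquad
      g^{\mu\nu}\,\partial_\mu\phi\,\partial_\nu\phi = 1,

    and the constraint on the right is what turns the scalar into irrotational dust; the higher-derivative (HD) terms discussed in the abstract are built on top of this constrained system, e.g. from powers of \Box\phi.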

  6. Entropy, matter, and cosmology.

    Science.gov (United States)

    Prigogine, I; Géhéniau, J

    1986-09-01

    The role of irreversible processes corresponding to the creation of matter in general relativity is investigated. The use of Landau-Lifshitz pseudotensors together with conformal (Minkowski) coordinates suggests that this creation took place in the early universe at the stage of the variation of the conformal factor. The entropy production in this creation process is calculated. It is shown that these dissipative processes lead to the possibility of cosmological models that start from empty conditions and gradually build up matter and entropy. Gravitational entropy takes on a simple meaning, being associated with the entropy that is necessary to produce matter. This leads to an extension of the third law of thermodynamics, as the zero point of entropy now becomes the space-time structure out of which matter is generated. The theory can be put into a convenient form using a supplementary "C" field in Einstein's field equations. The role of the C field is to express the coupling between gravitation and matter leading to irreversible entropy production.

  7. Imperfect Dark Matter

    International Nuclear Information System (INIS)

    Mirzagholi, Leila; Vikman, Alexander

    2015-01-01

    We consider cosmology of the recently introduced mimetic matter with higher derivatives (HD). Without HD this system describes irrotational dust—Dark Matter (DM) as we see it on cosmologically large scales. DM particles correspond to the shift-charges—Noether charges of the shifts in the field space. Higher derivative corrections usually describe a deviation from thermodynamical equilibrium in relativistic hydrodynamics. Thus we show that mimetic matter with HD corresponds to an imperfect DM which: i) renormalises Newton's constant in the Friedmann equations, ii) has zero pressure when there is no extra matter in the universe, iii) survives the inflationary expansion, which puts the system on a dynamical attractor with a vanishing shift-charge, iv) perfectly tracks any external matter on this attractor, v) can become the main (and possibly the only) source of DM, provided the shift-symmetry in the HD terms is broken during some small time interval in the radiation-dominated epoch. In the second part of the paper we present a hydrodynamical description of general anisotropic and inhomogeneous configurations of the system. This imperfect mimetic fluid has an energy flow in the field's rest frame. We find that in the Eckart and in the Landau-Lifshitz frames the mimetic fluid possesses nonvanishing vorticity appearing already at first order in the HD. Thus, structure formation and gravitational collapse should proceed in a rather different fashion than in simple irrotational DM models.

  8. Asymmetric condensed dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Aguirre, Anthony; Diez-Tejedor, Alberto, E-mail: aguirre@scipp.ucsc.edu, E-mail: alberto.diez@fisica.ugto.mx [Santa Cruz Institute for Particle Physics and Department of Physics, University of California, Santa Cruz, CA, 95064 (United States)

    2016-04-01

    We explore the viability of a boson dark matter candidate with an asymmetry between the number densities of particles and antiparticles. A simple thermal field theory analysis confirms that, under certain general conditions, this component would develop a Bose-Einstein condensate in the early universe that, for appropriate model parameters, could survive the ensuing cosmological evolution until now. The condensation of a dark matter component in equilibrium with the thermal plasma is a relativistic process, hence the amount of matter dictated by the charge asymmetry is complemented by a hot relic density frozen out at the time of decoupling. Contrary to the case of ordinary WIMPs, dark matter particles in a condensate must be lighter than a few tens of eV so that the density from thermal relics is not too large. Big-Bang nucleosynthesis constrains the temperature of decoupling to the scale of the QCD phase transition or above. This requires large dark matter-to-photon ratios and very weak interactions with standard model particles.
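
    The "lighter than a few tens of eV" statement can be traced to the standard hot-relic bookkeeping, quoted here in its generic neutrino-like form rather than from the paper: a species that decouples while relativistic contributes roughly

      \Omega_{\rm th} h^2 \;\sim\; \frac{m}{94\ \mathrm{eV}} \times \frac{n}{n_\nu},

    where n/n_\nu compares its relic number density to that of a standard neutrino species. Requiring \Omega_{\rm th} h^2 \lesssim 0.1 therefore pushes the mass down to the few-tens-of-eV range unless the relic number density is strongly diluted.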

  9. Imperfect Dark Matter

    Science.gov (United States)

    Mirzagholi, Leila; Vikman, Alexander

    2015-06-01

    We consider cosmology of the recently introduced mimetic matter with higher derivatives (HD). Without HD this system describes irrotational dust—Dark Matter (DM) as we see it on cosmologically large scales. DM particles correspond to the shift-charges—Noether charges of the shifts in the field space. Higher derivative corrections usually describe a deviation from thermodynamical equilibrium in relativistic hydrodynamics. Thus we show that mimetic matter with HD corresponds to an imperfect DM which: i) renormalises Newton's constant in the Friedmann equations, ii) has zero pressure when there is no extra matter in the universe, iii) survives the inflationary expansion, which puts the system on a dynamical attractor with a vanishing shift-charge, iv) perfectly tracks any external matter on this attractor, v) can become the main (and possibly the only) source of DM, provided the shift-symmetry in the HD terms is broken during some small time interval in the radiation-dominated epoch. In the second part of the paper we present a hydrodynamical description of general anisotropic and inhomogeneous configurations of the system. This imperfect mimetic fluid has an energy flow in the field's rest frame. We find that in the Eckart and in the Landau-Lifshitz frames the mimetic fluid possesses nonvanishing vorticity appearing already at first order in the HD. Thus, structure formation and gravitational collapse should proceed in a rather different fashion than in simple irrotational DM models.

  10. 42 CFR 493.649 - Methodology for determining fee amount.

    Science.gov (United States)

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support ... full-time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary ... [excerpt of 42 CFR 493.649, Methodology for determining fee amount]
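
    Only fragments of the provision survive in the excerpt above, but the arithmetic they point to is a cost pool built from per-employee salary plus fringe benefits, scaled by the required number of full-time-equivalent (FTE) positions. The sketch below uses placeholder numbers; none of the values or the allocation rule come from the CFR text.

      def fee_pool(fte_count, salary, fringe_rate, other_direct_costs=0.0):
          """Hypothetical annual cost pool: (salary + fringe) per FTE, times FTEs, plus other costs."""
          return fte_count * salary * (1.0 + fringe_rate) + other_direct_costs

      print(fee_pool(fte_count=12, salary=70_000, fringe_rate=0.30, other_direct_costs=50_000))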

  11. WISPy cold dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Arias, Paola [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Pontificia Univ. Catolica de Chile, Santiago (Chile). Facultad de Fisica; Cadamuro, Davide; Redondo, Javier [Max-Planck-Institut fuer Physik, Muenchen (Germany); Goodsell, Mark [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); European Organization for Nuclear Research (CERN), Geneva (Switzerland); Jaeckel, Joerg [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Ringwald, Andreas [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-01-15

    Very weakly interacting slim particles (WISPs), such as axion-like particles (ALPs) or hidden photons (HPs), may be non-thermally produced via the misalignment mechanism in the early universe and survive as a cold dark matter population until today. We find that, both for ALPs and HPs whose dominant interactions with the standard model arise from couplings to photons, a huge region in the parameter spaces spanned by photon coupling and ALP or HP mass can give rise to the observed cold dark matter. Remarkably, a large region of this parameter space coincides with that predicted in well motivated models of fundamental physics. A wide range of experimental searches - exploiting haloscopes (direct dark matter searches exploiting microwave cavities), helioscopes (searches for solar ALPs or HPs), or light-shining-through-a-wall techniques - can probe large parts of this parameter space in the foreseeable future. (orig.)
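
    The misalignment mechanism invoked above is usually summarised by a generic scaling argument (stated here from the standard literature, not quoted from the paper): the field sits frozen at an initial displacement until the Hubble rate falls to its mass, then oscillates and redshifts like pressureless matter,

      \rho(T_{\rm osc}) \simeq \tfrac{1}{2}\, m^2 \phi_i^2 \quad \text{when } 3H(T_{\rm osc}) \simeq m,
      \qquad
      \rho(T) = \rho(T_{\rm osc})\left(\frac{a(T_{\rm osc})}{a(T)}\right)^{3},

    so the relic abundance is fixed by the mass and the initial displacement \phi_i (for an ALP, \phi_i = f_a \theta_i).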

  12. Resonant SIMP dark matter

    Directory of Open Access Journals (Sweden)

    Soo-Min Choi

    2016-07-01

    We consider a resonant SIMP dark matter in models with two singlet complex scalar fields charged under a local dark U(1)_D. After the U(1)_D is broken down to a Z_5 discrete subgroup, the lighter scalar field becomes a SIMP dark matter candidate which has an enhanced 3→2 annihilation cross section near the resonance of the heavier scalar field. Bounds on the SIMP self-scattering cross section and the relic density can be fulfilled at the same time for perturbative SIMP couplings. A small gauge kinetic mixing between the SM hypercharge and dark gauge bosons can be used to keep SIMP dark matter in kinetic equilibrium with the SM during freeze-out.

  13. Thermal Properties of Matter

    Science.gov (United States)

    Khachan, Joe

    2018-02-01

    The ancient Greeks believed that all matter was composed of four elements: earth, water, air, and fire. By a remarkable coincidence (or perhaps not), today we know that there are four states of matter: solids (e.g. earth), liquids (e.g. water), gases (e.g. air) and plasma (e.g. ionized gas produced by fire). The plasma state is beyond the scope of this book and we will only look at the first three states. Although on the microscopic level all matter is made from atoms or molecules, everyday experience tells us that the three states have very different properties. The aim of this book is to examine some of these properties and the underlying physics.

  14. Asymmetric Higgsino dark matter.

    Science.gov (United States)

    Blum, Kfir; Efrati, Aielet; Grossman, Yuval; Nir, Yosef; Riotto, Antonio

    2012-08-03

    In the supersymmetric framework, prior to the electroweak phase transition, the existence of a baryon asymmetry implies the existence of a Higgsino asymmetry. We investigate whether the Higgsino could be a viable asymmetric dark matter candidate. We find that this is indeed possible. Thus, supersymmetry can provide the observed dark matter abundance and, furthermore, relate it with the baryon asymmetry, in which case the puzzle of why the baryonic and dark matter mass densities are similar would be explained. To accomplish this task, two conditions are required. First, the gauginos, squarks, and sleptons must all be very heavy, such that the only electroweak-scale superpartners are the Higgsinos. With this spectrum, supersymmetry does not solve the fine-tuning problem. Second, the temperature of the electroweak phase transition must be low, in the (1-10) GeV range. This condition requires an extension of the minimal supersymmetric standard model.
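
    The "similar densities" puzzle mentioned above rests on the standard asymmetric-dark-matter relation, given here in its generic form rather than as a formula from this paper: if the dark and baryon number asymmetries are comparable, the mass densities are tied together by the particle masses,

      \frac{\Omega_{\rm DM}}{\Omega_b} \simeq \frac{\eta_{\rm DM}\, m_{\rm DM}}{\eta_b\, m_p}
      \;\;\Longrightarrow\;\;
      m_{\rm DM} \simeq 5\,\frac{\eta_b}{\eta_{\rm DM}}\ \mathrm{GeV},

    using \Omega_{\rm DM}/\Omega_b \approx 5. A dark matter particle at the electroweak scale therefore requires the two asymmetries to differ by the corresponding factor.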

  15. Cerebral white matter hypoplasia

    International Nuclear Information System (INIS)

    Dietrich, R.B.; Shields, W.D.; Sankar, R.

    1990-01-01

    This paper demonstrates the MR imaging findings in children with cerebral white matter hypoplasia (CWMH). The MR studies of four children, aged 3-7 y (mean age, 2.3 y), with a diagnosis of CWMH were reviewed. In all cases multiplanar T1-weighted and T2-weighted spin-echo images were obtained. All children had similar histories of severe developmental delay and nonprogressive neurologic deficits despite normal gestational and birth histories. In two cases there was a history of maternal cocaine abuse. Autopsy correlation was available in one child. The MR images of all four children demonstrated a diffuse lack of white matter and enlarged ventricles but normal-appearing gray matter. The corpus callosum, although completely formed, was severely thinned. There was no evidence of gliosis or porencephaly, and the distribution of myelin deposition was normal for age in all cases. Autopsy findings in one child correlated exactly with the MR findings.

  16. Dark matter from unification

    DEFF Research Database (Denmark)

    Kainulainen, Kimmo; Tuominen, Kimmo; Virkajärvi, Jussi Tuomas

    2013-01-01

    We consider a minimal extension of the Standard Model (SM), which leads to unification of the SM coupling constants, breaks electroweak symmetry dynamically by a new strongly coupled sector and leads to novel dark matter candidates. In this model, the coupling constant unification requires the existence of electroweak triplet and doublet fermions singlet under QCD and new strong dynamics underlying the Higgs sector. Among these new matter fields and a new right handed neutrino, we consider the mass and mixing patterns of the neutral states. We argue for a symmetry stabilizing the lightest mass eigenstates of this sector and determine the resulting relic density. The results are constrained by available data from colliders and direct and indirect dark matter experiments. We find the model viable and outline briefly future research directions.
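
    Coupling-constant unification of the kind discussed above is conventionally checked by running the inverse gauge couplings at one loop. The sketch below uses the textbook Standard Model one-loop coefficients in GUT normalisation and leaves the shifts from the paper's extra triplet and doublet fermions as placeholders (delta_b), since their values are not given in the abstract; the M_Z inputs are rounded illustrative numbers.

      import numpy as np

      # One-loop running: d(alpha_i^-1)/d(ln mu) = -b_i / (2*pi).
      B_SM = np.array([41.0 / 10.0, -19.0 / 6.0, -7.0])   # (U(1)_Y, SU(2)_L, SU(3)_c)
      ALPHA_INV_MZ = np.array([59.0, 29.6, 8.5])          # rough inverse couplings at M_Z
      MZ = 91.19                                          # GeV

      def alpha_inv(mu, delta_b=np.zeros(3)):
          """Inverse couplings at scale mu, with optional extra-matter shifts delta_b."""
          return ALPHA_INV_MZ - (B_SM + delta_b) / (2.0 * np.pi) * np.log(mu / MZ)

      for mu in (1e3, 1e10, 1e16):
          print(f"mu = {mu:.0e} GeV:", np.round(alpha_inv(mu), 2))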

  17. Matter and memory

    CERN Document Server

    Bergson, Henri

    1991-01-01

    "Since the end of the last century," Walter Benjamin wrote, "philosophy has made a series of attempts to lay hold of the 'true' experience as opposed to the kind that manifests itself in the standardized, denatured life of the civilized masses. It is customary to classify these efforts under the heading of a philosophy of life. Towering above this literature is Henri Bergson's early monumental work, Matter and Memory." Along with Husserl's Ideas and Heidegger's Being and Time, Bergson's work represents one of the great twentieth-century investigations into perception and memory, movement and time, matter and mind. Arguably Bergson's most significant book, Matter and Memory is essential to an understanding of his philosophy and its legacy. This new edition includes an annotated bibliography prepared by Bruno Paradis. Henri Bergson (1859-1941) was awarded the Nobel Prize in 1927. His works include Time and Free Will, An Introduction to Metaphysics, Creative Evolution, and The Creative Mind.

  18. Interacting hot dark matter

    International Nuclear Information System (INIS)

    Atrio-Barandela, F.; Davidson, S.

    1997-01-01

    We discuss the viability of a light particle (∼30eV neutrino) with strong self-interactions as a dark matter candidate. The interaction prevents the neutrinos from free-streaming during the radiation-dominated regime so galaxy-sized density perturbations can survive. Smaller scale perturbations are damped due to neutrino diffusion. We calculate the power spectrum in the imperfect fluid approximation, and show that it is damped at the length scale one would estimate due to neutrino diffusion. The strength of the neutrino-neutrino coupling is only weakly constrained by observations, and could be chosen by fitting the power spectrum to the observed amplitude of matter density perturbations. The main shortcoming of our model is that interacting neutrinos cannot provide the dark matter in dwarf galaxies. copyright 1997 The American Physical Society

  19. Interacting warm dark matter

    International Nuclear Information System (INIS)

    Cruz, Norman; Palma, Guillermo; Zambrano, David; Avelino, Arturo

    2013-01-01

    We explore a cosmological model composed of a dark matter fluid interacting with a dark energy fluid. The interaction term has the non-linear form λρ_m^α ρ_e^β, where ρ_m and ρ_e are the energy densities of the dark matter and dark energy, respectively. The parameters α and β are in principle not constrained to take any particular values, and were estimated from observations. We perform an analytical study of the evolution equations, finding the fixed points and their stability properties in order to characterize suitable physical regions in the phase space of the dark matter and dark energy densities. The constants (λ, α, β), as well as the equation-of-state parameters w_m and w_e of dark matter and dark energy respectively, were estimated using the cosmological observations of type Ia supernovae and the Hubble expansion rate H(z) data sets. We find that the best estimated values for the free parameters of the model correspond to a warm dark matter interacting with a phantom dark energy component, with a good fit to the data. However, using the Bayesian Information Criterion (BIC) we find that this model is outperformed by a warm dark matter – phantom dark energy model without interaction, as well as by the ΛCDM model. We also find a large dispersion in the best estimated values of the (λ, α, β) parameters, so even though we are not able to set strong constraints on their values, given the model's goodness of fit to the data we find that a large variety of values are compatible with the observational data used.
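
    The background evolution behind this analysis — two dark fluids exchanging energy through the non-linear term λ ρ_m^α ρ_e^β — can be integrated numerically. The sketch below is a minimal version in e-folds N = ln a; the sign convention for the exchange, the parameter values and the initial conditions are illustrative assumptions rather than the values fitted in the paper, and the coupling is treated as an exchange rate per e-fold for simplicity.

      from scipy.integrate import solve_ivp

      lam, alpha, beta = 0.1, 1.0, 1.0
      w_m, w_e = 0.0, -1.0            # illustrative equations of state

      def rhs(N, y):
          rho_m, rho_e = y
          Q = lam * rho_m**alpha * rho_e**beta   # exchange per e-fold (assumed convention)
          drho_m = -3.0 * (1.0 + w_m) * rho_m - Q
          drho_e = -3.0 * (1.0 + w_e) * rho_e + Q
          return [drho_m, drho_e]

      sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.7])
      print(sol.y[:, -1])             # densities (in units of rho_m today) after 10 e-folds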

  20. Behavioural and Autonomic Regulation of Response to Sensory Stimuli among Children: A Systematic Review of Relationship and Methodology.

    Science.gov (United States)

    Gomez, Ivan Neil; Lai, Cynthia Y Y; Morato-Espino, Paulin Grace; Chan, Chetwyn C H; Tsang, Hector W H

    2017-01-01

    Previous studies have explored the correlates of behavioural and autonomic regulation of response to sensory stimuli in children; however, a comprehensive review of this relationship is lacking. This systematic review was performed to critically appraise the current evidence on this relationship and describe the methods used in these studies. Online databases were systematically searched for peer-reviewed, full-text articles in the English language between 1999 and 2016, initially screened by title and abstract, and appraised and synthesized by two independent review authors. Fourteen Level III-3 cross-sectional studies were included for systematic review, among which six studies explored the relationship between behavioural and physiological regulation of responses to sensory stimuli. Three studies reported significant positive weak correlations among ASD children; however, no correlations were found in typically developing children. Methodological differences related to individual differences among participants, the measures used, and varied laboratory experimental settings were noted. This review suggests that the evidence supporting a relationship between behavioural and physiological regulation of responses to sensory stimuli among children is inconclusive. Methodological differences may have confounded the results of the current evidence. We present methodological recommendations to address this matter for future research. This trial is registered with PROSPERO registration number CRD42016043887.